STAC Points to Everest Boost

In a report sponsored by data feed handler specialist SR Labs, the benchmarkers at STAC have just announced data from initial tests run on Intel’s recently introduced Everest chip. Compared with Intel’s standard Westmere chip, one data point suggests a 22% reduction in mean latency.

Everest – or Intel’s Xeon X5698 – is a dual-core chip with each core running at 4.4 GHz, compared to the X5687 (aka Westmere), which has four cores at 3.6 GHz. Intel describes Everest as an “off roadmap” chip designed for “very specific, niche high performance computing applications” while still “running within warranty covered norms, specifications and safe thermal envelope.”
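
For context, the per-core step from 3.6 GHz to 4.4 GHz works out to roughly a 22% increase in clock speed (4.4 / 3.6 ≈ 1.22), though the measured latency reduction will also reflect cache, memory and network I/O behaviour rather than clock speed alone.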

The tests were run using SR Labs’ MIPS (Market Data In Process System) feed handling software. While multi-core chips are often leveraged to boost application performance, some applications are inherently single-threaded and so benefit more from the increased speed of each individual core. Market data feed handlers and exchange matching engines are two such applications.
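
To make the single-threaded point concrete, here is a minimal sketch – not SR Labs’ MIPS code – of how a single-threaded Linux feed handler is commonly pinned to one dedicated core so its hot path always runs on a single fast core. The core number used is an arbitrary example.

#define _GNU_SOURCE
#include <sched.h>
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    /* Build a CPU set containing only core 2 (hypothetical choice). */
    cpu_set_t set;
    CPU_ZERO(&set);
    CPU_SET(2, &set);

    /* pid 0 means "the calling thread/process". */
    if (sched_setaffinity(0, sizeof(set), &set) != 0) {
        perror("sched_setaffinity");
        return EXIT_FAILURE;
    }

    /* The single-threaded market data processing loop would run here. */
    printf("pinned to core 2\n");
    return EXIT_SUCCESS;
}

Pinning of this kind avoids scheduler migrations and cold caches, which helps explain why a chip with fewer, faster cores such as Everest can suit this class of workload better than one with more, slower cores.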

For the geeks, the two “stacks under test” comprised:

– SR Labs MIPS In-Process Market Data Line Handler for TVITCH 4.1 
– CentOS 5.5, 64-bit Linux 
– IBM x3650 Server 
– Myricom 10G-PCIE2-8B2-2S Network Interface 
– Processor: 
SUT A: 2 x quad-core Intel Xeon X5687 3.60 GHz (“Westmere”)
SUT B: 2 x dual-core Intel Xeon X5698 4.40 GHz (“Everest”)

The test harness for this project incorporated TS-Associates’ TipOff and Simena F16 Fiber Optic Tap for wire-based observation, along with TS-Associates’ Application Tap cards for precise in-process observation. A Symmetricom SyncServer S350 was the time source for the harness.
