STAC Points to Everest Boost

In a report sponsored by data feed handler specialist SR Labs, the benchmarkers at STAC have just published data from initial tests of Intel’s recently introduced Everest chip. Compared with Intel’s standard Westmere chip, one data point suggests a 22% reduction in mean latency.

Everest – or Intel’s Xeon X5698 – is a dual-core chip, with each core running at 4.4 GHz, compared to the X5687 (aka Westmere), which has four cores at 3.6 GHz. Intel describes Everest as an “off roadmap” chip designed for “very specific, niche high performance computing applications” while still “running within warranty covered norms, specifications and safe thermal envelope.”
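
Some quick arithmetic (our own back-of-envelope, not part of the STAC report): 4.40 GHz / 3.60 GHz ≈ 1.22, so each Everest core is clocked roughly 22% faster than a Westmere core. If a single-threaded workload’s processing time scaled purely with clock speed, latency would fall to about 3.60 / 4.40 ≈ 0.82 of the Westmere figure – an ~18% reduction – which puts the reported 22% improvement in mean latency in the same ballpark as simple clock scaling.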

The tests were run using SR Labs’ MIPS (Market Data In Process System) feed handling software. While multi-core chips are often leveraged to boost application performance, some applications are inherently single-threaded, and so benefit more from increased speed of each core. Market data feed handlers and exchange matching engines are two such applications.
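
To make the point concrete, here is a minimal sketch – emphatically not SR Labs’ MIPS code or the STAC methodology, just an illustrative single-threaded hot loop assuming Linux and g++ – that pins itself to one core and times a serial, compute-bound “decode” per message, so its mean per-message time tracks that core’s clock rather than the number of cores available:

    #include <pthread.h>
    #include <sched.h>
    #include <chrono>
    #include <cstdint>
    #include <cstdio>

    // Pin the calling thread to one core so the hot loop never migrates and
    // its speed is set by that core's clock, not by how many cores exist.
    static void pin_to_core(int core) {
        cpu_set_t set;
        CPU_ZERO(&set);
        CPU_SET(core, &set);
        pthread_setaffinity_np(pthread_self(), sizeof(set), &set);
    }

    // Stand-in for decoding one market data message: hypothetical work,
    // purely compute-bound and strictly serial.
    static uint64_t decode(uint64_t raw) {
        uint64_t x = raw;
        for (int i = 0; i < 64; ++i)
            x = x * 6364136223846793005ULL + 1442695040888963407ULL;
        return x;
    }

    int main() {
        pin_to_core(0);
        const int messages = 1000000;
        uint64_t sink = 0;
        auto t0 = std::chrono::steady_clock::now();
        for (int i = 0; i < messages; ++i)
            sink ^= decode(static_cast<uint64_t>(i));   // one message at a time
        auto t1 = std::chrono::steady_clock::now();
        double mean_ns =
            std::chrono::duration<double, std::nano>(t1 - t0).count() / messages;
        std::printf("mean per-message time: %.1f ns (checksum %llu)\n",
                    mean_ns, static_cast<unsigned long long>(sink));
        return 0;
    }

Compiled with something like g++ -O2 -pthread and run on each processor, a loop of this shape would show the per-core clock difference directly; the real systems under test, of course, measure wire-to-application latency on live TVITCH data rather than a synthetic loop.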

For the geeks, the two “stacks under test” comprised:

– SR Labs MIPS In-Process Market Data Line Handler for TVITCH 4.1 
– CentOS 5.5, 64-bit Linux 
– IBM x3650 Server 
– Myricom 10G-PCIE2-8B2-2S Network Interface 
– Processor: 
  SUT A: 2 x quad-core Intel Xeon X5687 3.60 GHz (“Westmere”) 
  SUT B: 2 x dual-core Intel Xeon X5698 4.40 GHz (“Everest”)

The test harness for this project incorporated TS-Associates’ TipOff and Simena F16 Fiber Optic Tap for wire-based observation, along with TS-Associates’ Application Tap cards for precise in-process observation. A Symmetricom SyncServer S350 was the time source for the harness.
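
The idea behind the harness is a common, traceable clock: each message is timestamped on the wire by the tap and again inside the process when the handler delivers it, and latency is simply the difference. A toy illustration of that bookkeeping (hypothetical types and names, not the TipOff or Application Tap API):

    #include <cstdint>
    #include <vector>

    // One observed message: timestamped on the wire by the tap, and again
    // in-process when the handler delivers it to the application.
    struct Observation {
        uint64_t wire_ns;   // wire timestamp, nanoseconds on the common clock
        uint64_t app_ns;    // in-process delivery timestamp, same clock
    };

    // Mean wire-to-application latency in microseconds over one capture.
    double mean_latency_us(const std::vector<Observation>& run) {
        if (run.empty()) return 0.0;
        double total_ns = 0.0;
        for (const Observation& o : run)
            total_ns += static_cast<double>(o.app_ns - o.wire_ns);
        return total_ns / run.size() / 1000.0;
    }

A mean computed this way for each stack under test is the kind of figure behind the reported 22% reduction.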
