STAC Points to Everest Boost

In a report sponsored by data feed handler specialist SR Labs, the benchmarkers at STAC have just released data from initial tests of Intel’s recently introduced Everest chip. Compared to Intel’s standard Westmere chip, one data point suggests a 22% reduction in mean latency.

Everest – or Intel’s Xeon X5698 – is a dual-core chip, with each core running at 4.4 GHz, compared to the X5687 (aka Westmere), a quad-core chip running at 3.6 GHz. Intel describes Everest as an “off roadmap” chip designed for “very specific, niche high performance computing applications” while still “running within warranty covered norms, specifications and safe thermal envelope.”
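
A quick back-of-envelope calculation puts those clock speeds in context. The Python sketch below is our own illustration, not part of the STAC report: for a purely CPU-bound, single-threaded path, latency should scale inversely with clock speed, and the reported 22% cut in mean latency sits in the same ballpark as – indeed slightly beyond – that ideal.

# Back-of-envelope only: what would pure clock scaling predict?
westmere_ghz = 3.60
everest_ghz = 4.40

clock_gain = everest_ghz / westmere_ghz - 1.0         # ~22% higher clock
ideal_latency_cut = 1.0 - westmere_ghz / everest_ghz  # ~18% lower latency

print(f"clock advantage: {clock_gain:.1%}")                 # 22.2%
print(f"ideal latency reduction: {ideal_latency_cut:.1%}")  # 18.2%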

The tests were run using SR Labs’ MIPS (Market Data In Process System) feed handling software. While multi-core chips are often leveraged to boost application performance, some applications are inherently single-threaded and so benefit more from higher per-core clock speed than from additional cores. Market data feed handlers and exchange matching engines are two such applications.
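
MIPS itself is proprietary, but the single-threaded pattern it exploits is easy to sketch. The following is a minimal, hypothetical illustration – ours, not SR Labs’ code, with next_packet and handle as stand-ins – of a handler pinned to one core on Linux, where per-core clock speed rather than core count sets the pace:

import os

# Pin this process to a single core (core 2, chosen arbitrarily) so the
# hot loop never migrates between cores. Linux-only call.
os.sched_setaffinity(0, {2})

def run_handler(next_packet, handle):
    # Single-threaded receive/decode/publish loop: adding cores does not
    # help here; raising the clock of this one core does.
    while True:
        pkt = next_packet()  # blocking read from the feed
        handle(pkt)          # decode and publish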

For the geeks, the two “stacks under test” comprised:

– SR Labs MIPS In-Process Market Data Line Handler for TVITCH 4.1 
– CentOS 5.5, 64-bit Linux 
– IBM x3650 Server 
– Myricom 10G-PCIE2-8B2-2S Network Interface 
– Processor: 
SUT A: 2 x quad-core Intel Xeon X5687 3.60 GHz (“Westmere”) 
SUT B: 2 x dual-core Intel Xeon X5698 4.40 GHz (“Everest”)

The test harness for this project incorporated TS-Associates’ TipOff and Simena F16 Fiber Optic Tap for wire-based observation, along with TS-Associates’ Application Tap cards for precise in-process observation. A Symmetricom SyncServer S350 was the time source for the harness.
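
The report details the actual measurement methodology; purely to illustrate the idea – our sketch with hypothetical numbers, not STAC’s code – per-message latency is the gap between the wire-tap timestamp and the in-process timestamp, with both capture points disciplined by the same time source:

import statistics

def latency_stats(pairs):
    # pairs: (wire_ns, inprocess_ns) timestamps for the same message,
    # from the fiber tap and the Application Tap card respectively.
    lat = [inproc - wire for wire, inproc in pairs]
    return statistics.mean(lat), max(lat)

# Hypothetical samples: nanoseconds since a common epoch
samples = [(1_000, 4_200), (2_000, 5_100), (3_000, 6_350)]
mean_ns, max_ns = latency_stats(samples)
print(f"mean: {mean_ns:.0f} ns, max: {max_ns} ns")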
