Benchmarking the Benchmarks

In the world of low latency, it seems that benchmarks are headline news. Having readily available figures showing xxx microseconds and yyy hundreds of thousands of updates per second is a pretty sure way to get some press coverage for one’s product. Indeed, I find myself asking of vendors who are pushing their new datafeed handler, or complex event processing engine, “So, got any benchmarks for it?”
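
For illustration, here is a minimal sketch of how such headline figures are typically produced: time each update through a handler, then report the median latency in microseconds alongside updates processed per second. The handle_update function and the update count are hypothetical stand-ins, not any vendor's actual code or methodology.

```cpp
// Minimal, self-contained sketch: time a synthetic "feed handler" per update,
// then report median latency (microseconds) and throughput (updates/second).
// handle_update() is a hypothetical stand-in for a vendor component.
#include <algorithm>
#include <chrono>
#include <cstddef>
#include <cstdint>
#include <iostream>
#include <vector>

static std::uint64_t handle_update(std::uint64_t seq) {
    // Placeholder work; a real test would parse and route a market data update.
    return seq * 2654435761ULL;
}

int main() {
    constexpr std::size_t kUpdates = 1000000;
    std::vector<double> latencies_us;
    latencies_us.reserve(kUpdates);

    volatile std::uint64_t sink = 0;  // prevents the loop being optimised away
    const auto run_start = std::chrono::steady_clock::now();
    for (std::size_t i = 0; i < kUpdates; ++i) {
        const auto t0 = std::chrono::steady_clock::now();
        sink = handle_update(i);
        const auto t1 = std::chrono::steady_clock::now();
        latencies_us.push_back(
            std::chrono::duration<double, std::micro>(t1 - t0).count());
    }
    const auto run_end = std::chrono::steady_clock::now();
    (void)sink;

    std::sort(latencies_us.begin(), latencies_us.end());
    const double median_us = latencies_us[latencies_us.size() / 2];
    const double elapsed_s =
        std::chrono::duration<double>(run_end - run_start).count();

    std::cout << "median latency: " << median_us << " us\n"
              << "throughput: " << static_cast<double>(kUpdates) / elapsed_s
              << " updates/sec\n";
    return 0;
}
```

Of course, a wall-clock micro-benchmark like this says nothing about jitter under load or behaviour across a full stack, which is precisely why independently run, fully documented tests matter.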

Until recently, the only game in town for benchmarks was Peter Lankford’s imposingly (and cleverly) named Securities Technology Analysis Center (Stac), which makes a business out of running independent benchmarks for vendors, detailing the exact software and hardware components (or stack) used in each benchmark test.

Of course, while Stac is independent, not all of the benchmarks it runs are published. If the results don’t stack up so well (excuse the pun), then the vendor sponsoring the benchmark isn’t likely to make them public. So the news from Stac, while authentic and useful, is really just the good news. Indeed, Stac’s Lankford does point out that the published benchmarks should augment, not replace, specific benchmarks conducted by end users.

Intel has now come on stream with its own Low Latency Lab, with StreamBase partner Datastream Analysis making use of it to benchmark calculations for algorithmic trading. Again, useful data to have. But since the lab is there to assist partners in porting their applications to the Intel architecture, one expects that any published results will show such endeavours in a good light.

It’s probably a pipe dream to expect any independent body to emerge that will really shake down low latency components and publish the results – the good, the bad and the ugly. For one thing, I suspect that support from vendors would be difficult to obtain.

But perhaps the next step might be the creation of some common benchmark standards, with input from financial markets practitioners, that would allow vendors, independent testers and end users to run tests and compare results. Useful, and newsworthy, no?

Until next time … here’s some good music.

