
A-Team Insight Blogs

Benchmarking the Benchmarks


In the world of low latency, it seems that benchmarks are headline news. Having readily available figures showing xxx microseconds and yyy hundreds of thousands of updates per second is a pretty sure way to get some press coverage for one’s product. Indeed, I find myself asking of vendors who are pushing their new datafeed handler, or complex event processing engine, “So, got any benchmarks for it?”
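For readers wondering how such figures are produced, the basic mechanics are simple: timestamp each update on its way into and out of the component under test, then aggregate per-message latencies and overall throughput. Here is a minimal, hypothetical sketch in C++; the `process_update` function is a stand-in of my own invention for whatever feed handler or CEP engine is being measured, and real benchmark harnesses control far more variables (message mix, network hops, warm-up, hardware) than this does.

```cpp
#include <algorithm>
#include <chrono>
#include <cstdio>
#include <vector>

// Hypothetical stand-in for the component under test
// (a feed handler, CEP engine, etc.). The volatile sink
// keeps the compiler from optimising the work away.
static volatile long sink = 0;
void process_update(long update) { sink = sink + update; }

int main() {
    const int kUpdates = 1000000;
    std::vector<double> latencies_us;
    latencies_us.reserve(kUpdates);

    auto run_start = std::chrono::steady_clock::now();
    for (int i = 0; i < kUpdates; ++i) {
        // Per-message latency: timestamp in, process, timestamp out.
        auto t0 = std::chrono::steady_clock::now();
        process_update(i);
        auto t1 = std::chrono::steady_clock::now();
        latencies_us.push_back(
            std::chrono::duration<double, std::micro>(t1 - t0).count());
    }
    auto run_end = std::chrono::steady_clock::now();

    // Throughput: updates per second over the whole run.
    double secs = std::chrono::duration<double>(run_end - run_start).count();
    std::sort(latencies_us.begin(), latencies_us.end());
    std::printf("throughput: %.0f updates/sec\n", kUpdates / secs);
    std::printf("median latency: %.3f us, 99th percentile: %.3f us\n",
                latencies_us[kUpdates / 2],
                latencies_us[(kUpdates * 99) / 100]);
}
```

Even this toy example hints at why headline numbers deserve scepticism: run it with a warm cache, no network in the loop and a trivial workload, and the microsecond figures will flatter any product. Which is precisely where the benchmarking politics below come in.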

Until recently, the only game in town for benchmarks was Peter Lankford’s imposingly (and cleverly) named Securities Technology Analysis Center (Stac), which makes a business out of running independent benchmarks for vendors, detailing the exact software and hardware components (or stack) used in each test.

Of course, while Stac is independent, not all of the benchmarks it runs are published. If the results don’t stack up so well (excuse the pun), then the vendor sponsoring the benchmark isn’t likely to make them public. So the news from Stac, while authentic and useful, is really just the good news. Indeed, Stac’s Lankford does point out that the published benchmarks should augment, not replace, specific benchmarks conducted by end users.

Intel has now come on stream with its own Low Latency Lab, with StreamBase partner Datastream Analysis making use of it to benchmark calculations for algorithmic trading. Again, useful data to have. But since the lab is there to assist partners in porting their applications to the Intel architecture, one expects that any published results will show such endeavours in a good light.

It’s probably a pipe dream to expect any independent body to emerge that will really shake down low latency components and publish the results – the good, the bad and the ugly. For one thing, I suspect that support from vendors would be difficult to obtain.

But perhaps the next step might be the creation of some common benchmark standards, with input from financial markets practitioners, that would allow vendors, independent testers and end users to run tests and compare results. Useful, and newsworthy, no?

Until next time … here’s some good music.


