The knowledge platform for the financial technology industry

A-Team Insight Blogs

Benchmarking the Benchmarks


In the world of low latency, it seems that benchmarks are headline news. Having readily available figures showing xxx microseconds and yyy hundreds of thousands of updates per second is a pretty sure way to get some press coverage for one’s product. Indeed, I find myself asking of vendors who are pushing their new datafeed handler, or complex event processing engine, “So, got any benchmarks for it?”

Until recently, the only game in town for benchmarks was Peter Lankford’s imposingly (and cleverly) named Securities Technology Analysis Center (Stac), which makes a business out of running independent benchmarks for vendors, detailing the exact software and hardware components (or stack) used in each benchmark test.

Of course, while Stac is independent, not all of the benchmarks it runs are published. If the results don’t stack up so well (excuse the pun), then the vendor sponsoring the benchmark isn’t likely to make them public. So the news from Stac, while authentic and useful, is really just the good news. Indeed, Stac’s Lankford does point out that the published benchmarks should augment, not replace, specific benchmarks conducted by end users.

Intel has now come on stream with its own Low Latency Lab, with StreamBase partner Datastream Analysis making use of it to benchmark calculations for algorithmic trading. Again, useful data to have. But since the lab is there to assist partners in porting their applications to the Intel architecture, one expects that any published results will show such endeavours in a good light.

It’s probably a pipe dream to expect any independent body to emerge that will really shake down low latency components and publish the results – the good, the bad and the ugly. For one thing, I suspect that support from vendors would be difficult to obtain.

But perhaps the next step might be the creation of some common benchmark standards, with input from financial markets practitioners, that would allow vendors, independent testers and end users to run tests and compare results on a like-for-like basis. Useful, and newsworthy, no?
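To make the idea concrete, here is a minimal sketch of the kind of measurement such a common standard might mandate. This is entirely hypothetical — no standards body has defined such a harness, and `process_update` is a stand-in for whatever a feed handler actually does — but it shows the sort of like-for-like numbers (median and 99th-percentile latency, sustained throughput) that would let results be compared across vendors:

```python
import time
import statistics

def process_update(update):
    # Stand-in for a feed handler's per-update work (hypothetical).
    return update * 2

def benchmark(handler, n_updates=100_000):
    """Time each call and report latencies in microseconds plus throughput."""
    latencies = []
    for i in range(n_updates):
        start = time.perf_counter()
        handler(i)
        latencies.append((time.perf_counter() - start) * 1e6)
    latencies.sort()
    return {
        "median_us": statistics.median(latencies),
        "p99_us": latencies[int(0.99 * len(latencies))],  # tail latency matters most
        "throughput_per_s": n_updates / (sum(latencies) / 1e6),
    }

results = benchmark(process_update)
```

Percentiles, rather than averages, are the point: a low mean can hide exactly the latency spikes that hurt an algorithmic trading strategy, which is why any shared standard would need to pin down which percentiles get reported.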

Until next time … here’s some good music.

[tags]stac,securities technology analysis center,intel,data stream analysis,dsal,benchmarks[/tags]
