About A-Team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Perplexed By Percentages


A couple of weeks ago, together with Reuters, we conducted a survey of the marketplace to find out how well latency measurement is entrenched, and what users thought of datafeed providers and measurement tool vendors. And we asked them to comment on their current market data handling infrastructures too. 

One figure that came back – which I had an instant gut reaction against – is that 39 per cent of respondents reckon their systems are adequate to cope with increasing market data volumes. To me, this number seemed just too high, given what we are being told to expect about data rates from OPRA, etc. A further 31 per cent reckoned their systems would be able to cope following planned upgrades.

Contrast those figures with just 24 per cent suggesting that measurement/monitoring tools are currently adequate (the rest say the tools are not good enough, or are still investigating them, including finding out what's out there because they don't know of any). There's an old adage in the IT industry, and I've heard it a lot of late: if you can't measure it, you can't manage it.

Personally, I reckon many of the 39 per cent are large investors in silicon – not the silicon of microprocessors, but the silicon found in large buckets of sand.

Am I being harsh? Of course, I’d be delighted to hear more from you on this subject.

Until next time … here’s some good music.

[tags]OPRA,benchmarks,latency measurement,latency monitoring[/tags]



