About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Perplexed By Percentages

A couple of weeks ago, together with Reuters, we conducted a survey of the marketplace to find out how well latency measurement is entrenched, and what users thought of datafeed providers and measurement tool vendors. And we asked them to comment on their current market data handling infrastructures too. 

One figure that came back – which I had an instant gut reaction against – is that 39 per cent of respondents reckon their systems are adequate to cope with increasing market data volumes. To me, this number seemed just too high, given what we are being told to expect about data rates from OPRA, etc. A further 31 per cent reckoned their systems would be able to cope once planned upgrades are in place.

Contrast those figures with just 24 per cent suggesting that measurement/monitoring tools are currently adequate (the rest say that tools are not good enough, or that they are still investigating tools – including finding out what’s out there, because they don’t know of any). There’s an old adage in the IT industry, and I’ve heard it a lot of late: unless you can monitor and measure something, you can’t manage it.

Personally, I reckon many of the 39 per cent are large investors in silicon – not the silicon of microprocessors, but the silicon found in large buckets of sand.

Am I being harsh? Of course, I’d be delighted to hear more from you on this subject.

Until next time … here’s some good music.

[tags]OPRA,benchmarks,latency measurement,latency monitoring[/tags]
