About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Perplexed By Percentages

A couple of weeks ago, together with Reuters, we conducted a survey of the marketplace to find out how well latency measurement is entrenched, and what users thought of datafeed providers and measurement tool vendors. And we asked them to comment on their current market data handling infrastructures too. 

One figure that came back – which I had an instant gut reaction against – is that 39 per cent of respondents reckon their systems are adequate to cope with increasing market data volumes. To me, this number seemed just too high, given what we are being told to expect about data rates from OPRA, etc. A further 31 per cent reckoned their systems would be able to cope following planned upgrades.

Contrast those figures with just 24 per cent suggesting that measurement/monitoring tools are currently adequate (the rest say the available tools are not good enough, or that they are still investigating them – in some cases starting from scratch because they don’t know what’s out there). There’s an old adage in the IT industry, and I’ve heard it a lot of late: unless you can monitor and measure something, you can’t manage it.

Personally, I reckon many of that 39 per cent are large investors in silicon – not the silicon of microprocessors, but the silicon found in large buckets of sand.

Am I being harsh? Of course, I’d be delighted to hear more from you on this subject.

Until next time … here’s some good music.

[tags]OPRA,benchmarks,latency measurement,latency monitoring[/tags]
