
A-Team Insight Blogs

Perplexed By Percentages

A couple of weeks ago, together with Reuters, we conducted a survey of the marketplace to find out how well latency measurement is entrenched, and what users thought of datafeed providers and measurement tool vendors. And we asked them to comment on their current market data handling infrastructures too. 

One figure that came back – which I had an instant gut reaction against – is that 39 per cent of respondents reckon their systems are adequate to cope with increasing market data volumes. To me, this number seemed just too high, given what we are being told to expect about data rates from OPRA, etc. A further 31 per cent reckoned their systems would be able to cope following planned upgrades.

Contrast those figures with just 24 per cent suggesting that measurement/monitoring tools are currently adequate (the rest say that tools are not good enough, or are still investigating tools – including finding out what’s out there, because they don’t know of any). There’s an old adage in the IT industry that I’ve heard a lot of late: if you can’t measure it, you can’t manage it.

Personally, I reckon many of the 39 per cent are large investors in silicon – not the silicon of microprocessors, but the silicon found in large buckets of sand.

Am I being harsh? Of course, I’d be delighted to hear more from you on this subject.

Until next time … here’s some good music.

[tags]OPRA,benchmarks,latency measurement,latency monitoring[/tags]
