About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Time for Financial Institutions to Take Back Control of Market Data Costs

By Yann Bloch, Vice President of Product Management at NeoXam.

Brexit may be just around the corner, but it is market data spending that financial institutions are more interested in taking back control of right now. In fact, other than regulatory equivalence after the transition period, it is hard to think of a more prominent issue than the rising cost of market data.

According to analysis at the end of last year by Burton-Taylor, global spend on market data topped $30 billion in 2019. With costs showing little sign of coming down, at least in the short to medium term, now has to be the time for market participants to get a better grasp not only of what their costs could be at the end of the month, but also of the precise areas of the business consuming the most data.

The problem has been, and still is, seeking out those month-on-month cost anomalies. Why, for example, have fixed income and FX derivatives costs suddenly doubled compared to the previous month? The trouble is that it is nigh on impossible to get accurate answers to questions like this, because the vast majority of investment firms have no foolproof way of analysing how spending evolves over time. In some cases, financial institutions see their monthly market data vendor bills rise by more than 10%.
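The anomaly-spotting described above can be sketched in a few lines. This is an illustrative example only: the asset-class categories, the figures, and the 10% threshold are assumptions for the sake of the sketch, not any vendor's actual billing schema.

```python
# Hypothetical sketch: flag categories of market data spend whose cost
# rose by more than a threshold versus the previous month's bill.

def flag_cost_anomalies(previous: dict, current: dict, threshold: float = 0.10) -> dict:
    """Return {category: fractional increase} for every category whose
    spend rose by more than `threshold` month-on-month."""
    anomalies = {}
    for category, cost in current.items():
        prev_cost = previous.get(category)
        if prev_cost and (cost - prev_cost) / prev_cost > threshold:
            anomalies[category] = (cost - prev_cost) / prev_cost
    return anomalies

# Illustrative monthly vendor bills, broken down by asset class.
january = {"fixed_income": 120_000, "fx_derivatives": 80_000, "equities": 200_000}
february = {"fixed_income": 240_000, "fx_derivatives": 160_000, "equities": 205_000}

for category, increase in flag_cost_anomalies(january, february).items():
    print(f"{category}: +{increase:.0%} month-on-month")
```

With these figures, fixed income and FX derivatives (both doubled) are flagged, while the 2.5% rise in equities falls under the threshold and is ignored.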

It is not hard to see why, as every small incremental cost mounts up fast. First, there are the direct costs of one or more sets of data, which make billing far more complex. A market data vendor may well add different add-on services to help clients save money, but at the same time it will also be adding costs. As if this were not enough, there are also the indirect costs around data governance and regulatory compliance. New rules, such as the Fundamental Review of the Trading Book (FRTB), mean that investment banks will have no choice but to consume far more data to run their models and back-testing.

All this begs the question: how exactly can firms gain more control of their market data spending? A good place to start is reducing waste. This means firms making sure they do not request new sources of data from their vendors that they are not going to use. If data vendors charge for every single piece of data a client requests, then the client needs to make sure it is going to act on that information.
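The waste check above amounts to comparing what is subscribed to against what is actually consumed. A minimal sketch, with purely illustrative source names:

```python
# Hypothetical sketch: "waste" as sources that are billed but never used.
subscribed_sources = {"eod_prices_emea", "intraday_fx", "credit_curves", "esg_scores"}
consumed_sources = {"eod_prices_emea", "intraday_fx"}   # from usage logs, say

unused = subscribed_sources - consumed_sources  # paid for, never acted on
print(sorted(unused))
```

In practice the consumed set would be derived from entitlement or usage logs, but the principle is the same set difference.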

Then there is the recycling of data. Say an investment fund needs a new piece of data instantly, and also needs that same piece of data at the end of the day. If the fund manager already has the data, there is surely no need to request it again. It is all about being smarter about reusing whatever data the fund manager has already received. After all, different trading desks all consume data and request information through the data management team, but it is hard for the trader acting on the data to work out how much that data actually costs. This is why being able to allocate these costs to different trading desks is key.
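The two ideas in that paragraph, reusing data that has already been paid for and attributing each new charge to the desk that triggered it, can be combined in one small sketch. The cache, the per-request price, and the desk names are illustrative assumptions, not any firm's actual implementation.

```python
# Hypothetical sketch: serve repeat requests from a cache (no new vendor
# charge) and allocate the cost of each genuinely new request to the desk
# that asked for it.

class DataCostTracker:
    def __init__(self, price_per_request: float):
        self.price = price_per_request
        self.cache = {}          # data already received from the vendor
        self.cost_by_desk = {}   # cost allocated per trading desk

    def request(self, desk: str, key: str) -> str:
        if key in self.cache:    # reuse: the firm already owns this data
            return self.cache[key]
        # New request: charge it to the requesting desk.
        self.cost_by_desk[desk] = self.cost_by_desk.get(desk, 0.0) + self.price
        self.cache[key] = f"payload:{key}"   # stand-in for the vendor response
        return self.cache[key]

tracker = DataCostTracker(price_per_request=5.0)
tracker.request("rates_desk", "EURUSD_close")   # billed to rates desk
tracker.request("fx_desk", "EURUSD_close")      # served from cache, free
tracker.request("fx_desk", "GBPUSD_close")      # billed to FX desk
print(tracker.cost_by_desk)
```

The point of the sketch is the allocation: the second request for `EURUSD_close` costs nothing, and each desk's total reflects only the new data it caused to be purchased.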

When all is said and done, the only way financial institutions can harbour any hope of overcoming this longstanding data cost problem is by deriving more insights to ensure they are squeezing every last drop of value from their market data. Technological advancements mean that firms can now keep right on top of not just their direct data costs, like complex billing, but also the indirect costs around regulation. With so many other cost pressures across the business right now, it is time for financial institutions to take advantage of new technologies and finally address the issue of rising market data costs that has plagued the industry for too long.
