The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

Time for Financial Institutions to Take Back Control of Market Data Costs


By Yann Bloch, Vice President of Product Management at NeoXam.

Brexit may be just around the corner, but it is market data spending that financial institutions are most keen to take back control of right now. In fact, other than regulatory equivalence after the transition period, it is hard to think of a more prominent issue than the rising cost of market data.

According to analysis published at the end of last year by Burton-Taylor, global spend on market data topped $30 billion in 2019. With costs showing little sign of coming down, at least in the short to medium term, now is the time for market participants to get a better grasp not only of what their costs will be at the end of the month, but also of precisely which areas of the business are consuming the most data.

The problem has been, and still is, pinpointing those month-on-month cost anomalies. Why, for example, have fixed income and FX derivatives costs suddenly doubled compared to the previous month? The trouble is that accurate answers to questions like this are nigh on impossible to obtain, because the vast majority of investment firms have no foolproof way of analysing how their spending evolves over time. In some cases, financial institutions see their monthly market data vendor bills rise by more than 10%.
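The kind of month-on-month check described above can be automated quite simply. The sketch below flags any billing category whose latest monthly cost jumped by more than a threshold; the invoice figures, category names, and 10% threshold are all illustrative assumptions, not real vendor data.

```python
# Sketch of a month-on-month market data cost anomaly check.
# All figures and the 10% threshold are illustrative assumptions.

THRESHOLD = 0.10  # flag any month-on-month increase above 10%

def flag_cost_anomalies(monthly_costs: dict[str, list[float]],
                        threshold: float = THRESHOLD) -> dict[str, float]:
    """Return categories whose latest monthly cost rose by more than `threshold`."""
    anomalies = {}
    for category, costs in monthly_costs.items():
        if len(costs) < 2 or costs[-2] == 0:
            continue  # not enough history to compare
        change = (costs[-1] - costs[-2]) / costs[-2]
        if change > threshold:
            anomalies[category] = change
    return anomalies

# Hypothetical invoice history: three months of costs per category.
invoice = {
    "fixed_income":   [42_000.0, 41_500.0, 83_000.0],  # roughly doubled
    "fx_derivatives": [18_000.0, 18_200.0, 36_500.0],  # roughly doubled
    "equities":       [55_000.0, 55_400.0, 56_000.0],  # normal drift
}

for category, change in flag_cost_anomalies(invoice).items():
    print(f"{category}: +{change:.0%} vs previous month")
```

On the sample figures, the fixed income and FX derivatives lines are flagged while equities is not, which is exactly the "why did this suddenly double?" question the article describes.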

It is not hard to see why, as every small incremental cost mounts up fast. First there are the direct costs for one or more sets of data, which make billing far more complex. A market data vendor may offer a range of add-on services to help clients save money, but each one also adds cost. On top of this, there are the indirect costs of data governance and regulatory compliance. New rules, such as the Fundamental Review of the Trading Book (FRTB), mean that investment banks will have no choice but to consume far more data in order to run their models and back-testing.

All this raises the question of how exactly firms can gain more control of their market data spending. A good place to start is reducing waste: firms should make sure they do not request new sources of data from their vendors that they are not going to use. If vendors charge for every single piece of data a client requests, the client needs to be sure it will actually act on that information.
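One simple form of the waste audit suggested above is to compare what is subscribed to against what is actually consumed downstream. The sketch below does exactly that; the data item names are hypothetical stand-ins for real vendor entitlements.

```python
# Sketch of a subscription waste audit: data items paid for but never
# consumed are candidates for cancellation. All names are illustrative.

def unused_subscriptions(subscribed: set[str], consumed: set[str]) -> set[str]:
    """Return the subscribed data items that no downstream process ever read."""
    return subscribed - consumed

subscribed = {"EUR_SWAP_CURVE", "US_CORP_BOND_EOD", "LATAM_FX_TICKS", "JPY_VOL_SURFACE"}
consumed = {"EUR_SWAP_CURVE", "US_CORP_BOND_EOD"}

print(sorted(unused_subscriptions(subscribed, consumed)))
```

In practice the `consumed` set would be built from access logs over a billing period, but the principle is the same: anything in the difference is a subscription being paid for without being acted on.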

Then there is the recycling of data. Say an investment fund needs a piece of data instantly, and then needs that same piece of data again at the end of the day. If the fund manager already has the data, there is no need to request it a second time. It is all about being smarter about reusing whatever data has been received previously. After all, different trading desks all consume data and request information through the data management team, yet the trader acting on the data rarely knows what it actually costs. This is why being able to allocate these costs to the individual trading desks is key.
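The two ideas in this paragraph, reuse and desk-level allocation, can be combined in one mechanism: a cache in front of the vendor that only charges a desk when a request actually triggers a vendor call. The class, key format, and flat per-request charge below are illustrative assumptions, not any particular vendor's billing model.

```python
# Sketch of data reuse with per-desk cost allocation, assuming a flat
# per-request vendor charge. Class and field names are illustrative.
from collections import defaultdict

class MarketDataCache:
    def __init__(self, cost_per_vendor_request: float = 1.0):
        self.cost = cost_per_vendor_request
        self.cache: dict[str, object] = {}
        self.spend_by_desk: dict[str, float] = defaultdict(float)

    def get(self, desk: str, key: str):
        """Serve from cache when possible; otherwise charge the desk for a vendor call."""
        if key not in self.cache:
            self.cache[key] = f"<vendor payload for {key}>"  # stand-in for a real fetch
            self.spend_by_desk[desk] += self.cost
        return self.cache[key]

mds = MarketDataCache(cost_per_vendor_request=0.50)
mds.get("rates_desk", "EURIBOR_3M|2021-03-01")  # vendor hit, billed to rates desk
mds.get("fx_desk", "EURIBOR_3M|2021-03-01")     # reused from cache, no new charge
mds.get("fx_desk", "EURUSD_SPOT|2021-03-01")    # vendor hit, billed to FX desk

print(dict(mds.spend_by_desk))
```

Note the design choice: the first requester bears the full cost of each item. A fairer scheme might split the charge retrospectively among all desks that consumed it, but either way every desk finally sees a number for the data it uses.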

When all is said and done, the only way financial institutions can hope to overcome this longstanding data cost problem is by deriving more insight, ensuring they are squeezing every last drop of value from their market data. Technological advances mean that firms can now keep on top of not just their direct data costs, such as complex billing, but also the indirect costs around regulation. With so many other cost pressures across the business right now, it is time for financial institutions to take advantage of new technologies and finally address the rising market data costs that have plagued the industry for too long.
