

Time for Financial Institutions to Take Back Control of Market Data Costs


By Yann Bloch, Vice President of Product Management at NeoXam.

Brexit may be just around the corner, but it is market data spending that financial institutions are more interested in taking back control of right now. In fact, other than regulatory equivalence after the transition period, it is hard to think of a more prominent issue than the rising cost of market data.

According to analysis by Burton-Taylor at the end of last year, global spend on market data topped $30 billion in 2019. With costs showing very little sign of coming down, at least in the short to medium term, now has to be the time for market participants to get a better grasp not only of what their costs will be at the end of the month, but also of precisely which areas of the business are consuming the most data.

The problem has been, and still is, seeking out those month-on-month cost anomalies. Why, for example, have fixed income and FX derivatives costs suddenly doubled compared with the previous month? The trouble is that it is nigh on impossible to get accurate answers to questions like this, because the vast majority of investment firms have no foolproof way of analysing how their spending evolves over time. In certain cases, financial institutions can experience a 10%+ increase on their monthly market data vendor bills.
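To make this concrete, a simple month-on-month comparison of billed line items is often enough to surface these anomalies. The sketch below, in Python, assumes a hypothetical bill broken down by asset class and a 10% alert threshold; the column names and figures are illustrative, not any particular vendor's billing format.

import pandas as pd

# Hypothetical month-on-month vendor bill, broken down by asset class.
bills = pd.DataFrame({
    "asset_class": ["Fixed Income", "FX Derivatives", "Equities"],
    "prev_month":  [42_000, 18_500, 55_000],
    "this_month":  [84_500, 37_000, 56_100],
})

# Flag any line item whose cost rose by more than 10% month on month.
bills["change_pct"] = (bills["this_month"] / bills["prev_month"] - 1) * 100
anomalies = bills[bills["change_pct"] > 10]
print(anomalies[["asset_class", "change_pct"]])
# Fixed income and FX derivatives have roughly doubled, so they are worth investigating.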

It is not hard to see why, as every small incremental cost mounts up fast. First there are the direct costs for one or more sets of data, which make billing far more complex. A market data vendor may well add lots of different add-on services to help clients save money, but at the same time it will also be adding on more costs. As if this were not enough, there are also the indirect costs around data governance and regulatory compliance. New rules, such as the Fundamental Review of the Trading Book (FRTB), mean that investment banks will have no choice but to consume far more data in order to run models and back-testing.

All this begs the question: how exactly can firms gain more control of their market data spending? A good place to start is reducing waste. This means firms making sure they do not request new sources of data from their vendors that they are not going to use. If data vendors charge for every single piece of data that the client requests, then the client needs to make sure it is going to act on that information.
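As a sketch of what that waste check might look like in practice, a firm could reconcile the feeds it is billed for against the feeds its systems actually consume. The feed names below are entirely hypothetical.

# Hypothetical reconciliation of billed data items against actual usage.
billed_feeds = {"FI_EOD_PRICES", "FX_VOL_SURFACES", "EQ_REALTIME_L1", "CDS_CURVES"}
used_feeds   = {"FI_EOD_PRICES", "EQ_REALTIME_L1"}

unused = billed_feeds - used_feeds
print(f"Billed but unused feeds: {sorted(unused)}")
# These are candidates for cancellation before the next billing cycle.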

Then there is the recycling of data. Say an investment fund needs a new piece of data instantly, and also needs that same piece of data at the end of the day. If the fund manager already has the data, they surely do not need to request it again. It is all about being smarter about reusing whatever data the fund manager has received previously. After all, different trading desks are all consuming data and requesting information through the data management team, yet it is hard for the trader acting on the data to work out how much that data actually costs. This is why being able to allocate these costs to different trading desks is key.
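A minimal sketch of that reuse-and-allocate idea: keep a record of data already received, serve repeat requests from it, and log which desk asked, so that only genuinely new requests incur a charge and every charge can be attributed. The class, the instrument names and the per-request cost are assumptions for illustration only.

from collections import defaultdict

class MarketDataCache:
    """Serve repeat requests from data already received and
    attribute the cost of genuinely new requests to each desk."""

    def __init__(self, cost_per_request=1.0):
        self.cache = {}                        # instrument -> data already received
        self.cost_per_request = cost_per_request
        self.cost_by_desk = defaultdict(float)

    def get(self, desk, instrument, fetch):
        if instrument not in self.cache:
            # Only a genuinely new request goes out to the vendor and incurs a charge.
            self.cache[instrument] = fetch(instrument)
            self.cost_by_desk[desk] += self.cost_per_request
        return self.cache[instrument]

# Two desks asking for the same instrument trigger only one chargeable request.
cache = MarketDataCache(cost_per_request=5.0)
cache.get("Rates Desk", "EUR_SWAP_10Y", lambda i: {"price": 101.2})
cache.get("FX Desk", "EUR_SWAP_10Y", lambda i: {"price": 101.2})
print(dict(cache.cost_by_desk))   # {'Rates Desk': 5.0}

Here the desk that asks first carries the charge; a firm could just as easily split it across every desk that reuses the item. The point is simply that the request happens once and the cost is visible and attributable.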

When all is said and done, the only way financial institutions can harbour any hope of overcoming this longstanding data cost problem is by deriving more insights to ensure they are squeezing every last drop of value from their market data. Technological advancements mean that firms can now keep right on top not just of their direct data costs, such as complex billing, but also of the indirect costs around regulation. With so many other cost pressures across the business right now, it is time for financial institutions to take advantage of new technologies and finally address the issue of rising market data costs that has plagued the industry for too long.

