About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Volker’s Rules: What’s the Current Deal with Exchange Data Fees?


By Volker Lainer, VP Product Management, GoldenSource.

Last week, the SEC and the US Justice Department dotted the i’s and crossed the t’s on an agreement that will lead to greater scrutiny of the fees stock exchanges charge for market data. The move is the latest attempt to resolve a long-running dispute between exchanges and their data users over fee hikes.

Although exchanges have gradually added value to their data services, the information they distribute is still not immediately fit for use. It needs to be checked, repaired, validated and organised before it can feed the analysis or reporting of the various teams that rely on it, such as traders and risk managers. Exchanges and data vendors know that improving data quality in this way is costly and difficult. They are working on those data niggles, which is partly what is driving up costs, but third parties might be able to do the job better and more economically.

This is where the new governance plan comes in. The SEC has tasked exchanges and FINRA, which are self-regulatory organisations, with creating a plan that will reduce the influence of large exchange groups in decision-making around market data and give non-exchange participants a bigger say. It aims to increase transparency and address inefficiencies. The SEC gave market participants 90 days to pull the plan together, so we do not yet know what the new rules will look like. It is, however, an undeniable step towards recognising that a monopoly exists and that there might be a better way of operating, to the benefit of the whole industry.

In all likelihood, the plan will loosen the grip of some of the major exchange groups on their data and level the playing field, making it easier for third parties to produce cheaper market data or to offer it in smaller bundles to firms that do not need access to a global set of securities data. This would allow new data providers, currently shut out of that market, to compete with the exchanges to improve and augment the data.

It has the potential to shake up the whole data-vending market. For example, a whole new layer of companies may eventually emerge that take the data from exchanges and monetise it, using new technology to slice and dice it in different ways and to identify fresh metrics to correlate it with.

It could become quite a specialism, with specific providers pulling out data for particular asset classes or regions, curating it and distributing it in a certain format. It could be a catalyst for more innovation in the area and lead to a system with the main data creators on one side and those who add value to that data on the other. Imagine the benefit for specialist desks or investment firms if they could select their data sets from multiple sources as easily as building a Spotify playlist from various artists published by numerous record labels. At the same time, it would add a lot of complexity to the whole ecosystem. Those who use exchange data must consider that, with more providers to choose from, they will need to be nimble and able to switch providers with ease.

With the ink still drying on the SEC and US Justice Department plan, exchanges need to proceed with caution and take steps to increase their data quality with accurate, all-encompassing market data. The main reason exchanges charged more for their data was that they needed to develop additional revenue streams due to low volumes. With market volatility having risen sharply in recent months, however, now is the perfect time for exchanges to invest in improving their data quality, before the data is curated, published and sold on demand, so that, like an artist on Spotify or an author on Kindle, the originator gets paid when their content is consumed. Exchanges unquestionably still have great firepower at their disposal, but they must revamp their approach to data in order to retain their market share.
