About A-Team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Volker’s Rules: What’s the Current Deal with Exchange Data Fees?


By Volker Lainer, VP Product Management, GoldenSource.

Last week, the SEC and the US Justice Department dotted the i’s and crossed the t’s on an agreement that will lead to greater scrutiny of the fees stock exchanges charge for market data. The move is the latest attempt to resolve a long-running dispute over fee hikes between exchanges and those who use their data.

Although exchanges have gradually added value to their data services, it remains indisputable that the information they distribute is not immediately fit for use. It still needs to be checked, repaired, validated and organised before it can feed the analysis or reporting of the various teams that use it, such as traders and risk managers. Exchanges and data vendors know that improving data quality in this way is costly and difficult. They are working on those data niggles, which is partly what is driving up costs, but third parties might be able to do it better and more economically.
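To make the check-repair-validate step concrete, here is a minimal sketch of the kind of cleaning exchange data typically needs before it reaches a trading or risk desk. The field names, validation rules and default values are illustrative assumptions for this article, not any exchange’s or vendor’s actual schema.

```python
def validate_and_repair(records):
    """Check, repair and validate raw feed records.

    Returns (cleaned, rejected): records that pass after repair,
    and records that cannot be used as-is.
    """
    cleaned, rejected = [], []
    for rec in records:
        # Check: an instrument identifier is mandatory.
        if not rec.get("isin"):
            rejected.append(rec)
            continue
        fixed = dict(rec)
        # Repair: normalise whitespace and casing on the identifier.
        fixed["isin"] = fixed["isin"].strip().upper()
        # Validate: a negative price cannot be repaired automatically.
        if fixed.get("price") is not None and fixed["price"] < 0:
            rejected.append(rec)
            continue
        # Organise: fill a gap with an assumed default for illustration.
        fixed.setdefault("currency", "USD")
        cleaned.append(fixed)
    return cleaned, rejected

# A toy feed with one good record, one missing identifier, one bad price.
feed = [
    {"isin": " us0378331005 ", "price": 189.5},
    {"isin": "", "price": 10.0},
    {"isin": "DE0005557508", "price": -1.0},
]
good, bad = validate_and_repair(feed)
```

In practice each of these stages is far richer (cross-referencing identifiers, reconciling against other sources, auditing every repair), which is exactly the costly work the article describes.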

This is where the new governance plan comes in. The SEC has tasked the exchanges and FINRA, which are self-regulatory organisations, with creating a plan that will reduce the influence of the large exchange groups in decision-making around the data and give non-exchange participants a bigger say. It aims to increase transparency and address inefficiencies. The SEC gave market participants 90 days to pull the plan together, so we do not yet know what the new rules will look like. However, it is an undeniable step towards recognising that a monopoly exists and that there might be a better way of operating, to the benefit of the whole industry.

In all likelihood, it will loosen the major exchange groups’ hold on their data and level the playing field by making it easier for third parties to produce cheaper market data, or to offer it in smaller bundles to firms that don’t need access to a global set of securities data. This will allow new data providers, currently shut out of that market, to compete with the exchanges to improve and augment the data.

It has the potential to shake up the whole data vending market. For example, there may eventually be a whole new layer of companies who take the data from exchanges and monetise it with new technology that easily slices and dices it in different ways and identifies fresh metrics to correlate it with.

It could become quite a specialism, where specific providers pull out data for particular asset classes or regions, curate it and distribute it in a certain format. It could be a catalyst for more innovation in the area and lead to a system where you have the main data creators, then people who add value to that data. Imagine the benefit for specialist desks or investment firms if they were able to select their data sets from multiple sources as easily as building a Spotify playlist from various artists, published by numerous record labels. However, it would also add a lot of complexity to the whole ecosystem. With more data providers to choose from, those who use exchange data must be nimble and able to switch providers with ease.
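The playlist idea above can be sketched in a few lines: a desk picks only the specialist slices it needs, and each record keeps a source tag so one provider can be swapped for another with minimal friction. The provider names and fields are hypothetical, invented purely for illustration.

```python
# Hypothetical specialist providers, each curating one asset class or region.
PROVIDERS = {
    "emea_equities": [{"isin": "DE0005557508", "close": 22.1}],
    "us_rates":      [{"isin": "US912828U816", "yield": 4.2}],
    "apac_fx":       [{"pair": "USDJPY", "rate": 151.3}],
}

def build_dataset(selection):
    """Assemble a bespoke data set from the chosen providers only.

    Tagging each record with its source keeps switching costs low:
    replacing a provider means changing one name in the selection.
    """
    dataset = []
    for name in selection:
        for record in PROVIDERS.get(name, []):
            dataset.append({**record, "source": name})
    return dataset

# A specialist desk subscribes to two slices, not a global bundle.
desk_view = build_dataset(["emea_equities", "us_rates"])
```

The design point is the source tag: it is what makes the "switch providers with ease" requirement practical rather than aspirational.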

With the ink still drying on the SEC and US Justice Department plan, exchanges need to proceed with caution and take steps to improve their data quality with accurate, all-encompassing market data. The main reason exchanges charged more for their data was that they needed to develop additional revenue streams in the face of low volumes. However, given how much market volatility has increased in recent months, now is the perfect time for exchanges to invest in improving their data quality, before the data is curated, published and sold on demand, meaning that, like an artist on Spotify or an author on Kindle, the originator gets paid when their content is consumed. Exchanges unquestionably still have great firepower at their disposal, but they must revamp their approach to data in order to retain their market share.
