
Volker’s Rules: What’s the Current Deal with Exchange Data Fees?

By Volker Lainer, VP Product Management, GoldenSource.

Last week, the SEC and the US Justice Department dotted the i’s and crossed the t’s on an agreement that will lead to greater scrutiny of the fees charged by stock exchanges for market data. The move represents the latest attempt to resolve a long-running dispute between exchanges and those who use their data regarding the hiking of fees.

Although exchanges have gradually added value to their data services, the information they distribute is still not fit for immediate use. It needs to be checked, repaired, validated and organised before it can feed into analysis or reporting by the various teams that rely on it, such as traders and risk managers. Exchanges and data vendors know that improving data quality in this way is costly and difficult. They are working on those data niggles, which is partly what is driving up costs, but third parties might be able to do the job better and more economically.
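To make that cleaning step concrete, here is a minimal sketch of the kind of checks and repairs a consumer of raw quote data might apply. The field names, repair rules and sample records are illustrative assumptions, not a description of any exchange’s actual feed.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class Quote:
    symbol: str
    bid: Optional[float]
    ask: Optional[float]
    last: Optional[float]


def validate_and_repair(quote: Quote) -> Optional[Quote]:
    """Apply basic sanity checks; repair what can be repaired, reject the rest."""
    # Reject records with no usable prices at all.
    if quote.bid is None and quote.ask is None and quote.last is None:
        return None
    # A crossed market (bid above ask) usually signals a bad tick.
    if quote.bid is not None and quote.ask is not None and quote.bid > quote.ask:
        return None
    # If the last traded price is missing, fall back to the mid price.
    if quote.last is None and quote.bid is not None and quote.ask is not None:
        quote.last = round((quote.bid + quote.ask) / 2, 4)
    return quote


if __name__ == "__main__":
    raw = [
        Quote("ABC", 10.00, 10.05, None),   # missing last price -> repaired to mid
        Quote("XYZ", 20.10, 20.00, 20.05),  # crossed market -> rejected
    ]
    cleaned = [q for q in (validate_and_repair(q) for q in raw) if q is not None]
    print(cleaned)
```

In practice this logic sits alongside reference data checks, corporate actions adjustments and cross-vendor reconciliation, which is where most of the cost the article describes actually lies.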

This is where the new governance plan comes in. The SEC has tasked the exchanges and FINRA, which are self-regulatory organisations, with creating a plan that will reduce the influence of large exchange groups in decision-making around the data and give non-exchange participants a bigger say. It aims to increase transparency and address inefficiencies. The SEC gave market participants 90 days to pull the plan together, so we do not yet know what the new rules will look like. However, it is an undeniable step towards recognising that there is a monopoly and that there might be a better way of operating, to the benefit of the whole industry.

In all likelihood, it will loosen the hold that some of the major exchange groups have over their data and level the playing field by making it easier for third parties to produce cheaper market data or to make it available in smaller bundles to firms that don’t need access to a global set of securities data. This will allow new data providers, currently prevented from entering that market, to compete with the exchanges to improve and augment the data.

It has the potential to shake up the whole data vending market. For example, there may eventually be a whole new layer of companies that take the data from exchanges and monetise it by applying new technology that slices and dices it in different ways and identifies fresh metrics to correlate it with.

It could become quite a specialism, with specific providers pulling out data for particular asset classes or regions, curating it and distributing it in a certain format. It could be a catalyst for more innovation in the area and lead to a system where you have the main data creators, then people who add value to that data. Imagine the benefit for specialist desks or investment firms if they were able to select their data sets from multiple sources as easily as building a Spotify playlist from various artists, published by numerous record labels. That said, it would also add a lot of complexity to the whole ecosystem. With more data providers to choose from, those who use exchange data will need to be nimble and able to switch providers with ease, as the sketch below suggests.
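One common way to keep that switching cost low is to consume data through a thin, provider-agnostic interface, so swapping sources becomes a configuration change rather than a rewrite. The sketch below is purely illustrative: the provider classes, symbols and prices are hypothetical assumptions, not real feeds or vendor APIs.

```python
from abc import ABC, abstractmethod


class MarketDataProvider(ABC):
    """Common interface so downstream systems are not tied to one vendor."""

    @abstractmethod
    def get_close_price(self, symbol: str) -> float:
        ...


class ExchangeDirectFeed(MarketDataProvider):
    def get_close_price(self, symbol: str) -> float:
        # A real implementation would call the exchange's API; hard-coded here.
        return {"ABC": 10.02}.get(symbol, float("nan"))


class ThirdPartyCuratedFeed(MarketDataProvider):
    def get_close_price(self, symbol: str) -> float:
        # A hypothetical curated regional feed exposing the same interface.
        return {"ABC": 10.01}.get(symbol, float("nan"))


def portfolio_value(provider: MarketDataProvider, positions: dict) -> float:
    # Consuming code depends only on the interface, so switching providers
    # does not touch the valuation logic.
    return sum(qty * provider.get_close_price(sym) for sym, qty in positions.items())


if __name__ == "__main__":
    positions = {"ABC": 100}
    print(portfolio_value(ExchangeDirectFeed(), positions))
    print(portfolio_value(ThirdPartyCuratedFeed(), positions))
```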

With the ink still drying on the SEC and US Justice Department plan, exchanges need to proceed with caution and take steps to increase their data quality with accurate, all-encompassing market data. The main reason exchanges charged more for their data was that they needed to develop additional revenue streams due to low trading volumes. However, given how much market volatility has increased in recent months, now is the perfect time for exchanges to invest in improving their data quality. All of this comes before the data is curated, published and sold on demand, meaning that, like an artist on Spotify or an author on Kindle, the originator gets paid when their content is consumed. Exchanges unquestionably still have great firepower at their disposal, but they must take steps to revamp their approach to data in order to retain their market share.
