Volker’s Rules: What’s the Current Deal with Exchange Data Fees?

By Volker Lainer, VP Product Management, GoldenSource.

Last week, the SEC and the US Justice Department dotted the i’s and crossed the t’s on an agreement that will lead to greater scrutiny of the fees stock exchanges charge for market data. The move is the latest attempt to resolve a long-running dispute between exchanges and their data users over fee hikes.

Although exchanges have gradually added value to their data services, the information they distribute is still not immediately fit for use. It needs to be checked, repaired, validated and organised before it can feed into analysis or reporting by the teams that rely on it, such as traders and risk managers. Exchanges and data vendors know that improving data quality in this way is costly and difficult. They are working on those data niggles, which is partly what is driving up costs, but third parties might be able to do the job better and more economically.

This is where the new governance plan comes in. The SEC has tasked the exchanges and FINRA, which are self-regulatory organisations, with creating a plan that reduces the influence of the large exchange groups in decision-making around market data and gives non-exchange participants a bigger say. The aim is to increase transparency and address inefficiencies. The SEC gave market participants 90 days to pull the plan together, so we do not yet know what the new rules will look like. It is, however, an undeniable step towards recognising that a monopoly exists and that there might be a better way of operating, to the benefit of the whole industry.

In all likelihood, the plan will loosen the grip of some of the major exchange groups on their data and level the playing field, making it easier for third parties to produce cheaper market data or to offer it in smaller bundles to firms that don’t need access to a global set of securities data. This will allow new data providers, currently shut out of that market, to compete with the exchanges to improve and augment the data.

It has the potential to shake up the whole data vending market. For example, there may eventually be a whole new layer of companies that take the data from exchanges and monetise it by working some magic on it with new technology that easily slices and dices it in different ways and identifies fresh metrics with which to correlate it.

It could become quite a specialism, with specific providers pulling out data for particular asset classes or regions, curating it and distributing it in a certain format. It could be a catalyst for more innovation in the area, leading to a system where you have the main data creators and then the people who add value to that data. Imagine the benefit for specialist desks or investment firms if they could select their data sets from multiple sources as easily as building a Spotify playlist from various artists published by numerous record labels. That said, it would also add a lot of complexity to the whole ecosystem. With more data providers to choose from, users of exchange data will need to be nimble and able to switch providers with ease.

With the ink still drying on the SEC and US Justice Department plan, exchanges need to proceed with caution and take steps to improve their data quality with accurate, all-encompassing market data. Exchanges originally charged more for their data because low trading volumes forced them to develop additional revenue streams. In light of how much market volatility has increased in recent months, however, now is the perfect time for exchanges to invest in improving their data quality, before that data is curated, published and sold on demand, meaning that, like an artist on Spotify or an author on Kindle, the originator gets paid when their content is consumed. Exchanges unquestionably still have great firepower at their disposal, but they must revamp their approach to data in order to retain their market share.
