A-Team Insight Blogs

Volker’s Rules: What’s the Current Deal with Exchange Data Fees?

By Volker Lainer, VP Product Management, GoldenSource.

Last week, the SEC and the US Justice Department dotted the i’s and crossed the t’s on an agreement that will lead to greater scrutiny of the fees stock exchanges charge for market data. The move is the latest attempt to resolve a long-running dispute between exchanges and the users of their data over fee hikes.

Although exchanges have gradually added value to their data services, the information they distribute is still not immediately fit for use. It needs to be checked, repaired, validated and organised before it can feed into analysis or reporting by the various teams that rely on it, such as traders and risk managers. Exchanges and data vendors know that improving data quality in this way is costly and difficult. They are working on these data niggles, which is partly what is driving up costs, but third parties might be able to do the job better and more economically.

This is where the new governance plan comes in. The SEC has tasked the exchanges and FINRA, which are self-regulatory organisations, with creating a plan that will reduce the influence of large exchange groups in decision-making around the data and give non-exchange participants a bigger say. It aims to increase transparency and address inefficiencies. The SEC has given market participants 90 days to pull the plan together, so we do not yet know what the new rules will look like. However, it is an undeniable step towards recognising that a monopoly exists and that there might be a better way of operating, to the benefit of the whole industry.

In all likelihood, it will weaken the monopoly some of the major exchange groups hold over their data and level the playing field by making it easier for third parties to produce cheaper market data, or to offer it in smaller bundles to firms that don’t need access to a global set of securities data. This would allow new data providers, currently shut out of that market, to compete with the exchanges in improving and augmenting the data.

It has the potential to shake up the whole data vending market. For example, there may eventually be a whole new layer of companies who take the data from exchanges and monetise it by working some magic into it with new technology that easily slices and dices it in different ways and identifies fresh metrics with which to correlate it.

It could become quite a specialism, with specific providers pulling out data for particular asset classes or regions, curating it and distributing it in a certain format. It could be a catalyst for more innovation in the area and lead to a system where you have the main data creators, then people who add value to that data. Imagine the benefit for specialist desks or investment firms if they were able to select their data sets from multiple sources as easily as building a Spotify playlist from various artists, published by numerous record labels. However, it would also add a lot of complexity to the whole ecosystem. Those who use exchange data should bear in mind that, with more data providers to choose from, they will need to be nimble and able to switch providers with ease.

With the ink still drying on the SEC and US Justice Department agreement, exchanges need to proceed with caution and take steps to improve their data quality with accurate, all-encompassing market data. The main reason exchanges charged more for their data was that they needed to develop additional revenue streams in the face of low trading volumes. However, given how much market volatility has increased in recent months, now is the perfect time for exchanges to invest in improving their data quality, before the data is curated, published and sold on demand, meaning that, like an artist on Spotify or an author on Kindle, the originator gets paid when their content is consumed. Exchanges unquestionably still have great firepower at their disposal, but they must revamp their approach to data in order to retain their market share.
