A-Team Insight Blogs

Exchanges Can Justify High Fees – But Only if They Are Adding Value


By Neill Vanlint, Head of Global Sales & Client Operations, GoldenSource.

Across global financial markets, it is hard to think of a more contentious issue right now than the prices exchanges are charging for their trading data. Some high-profile potential acquisitions, including the London Stock Exchange’s $27 billion bid to buy Refinitiv, mean there could soon be more market data monopolies than ever before. Let’s face it: regardless of the provider, market data has never exactly been cheap, but prices for traders keep climbing. And with just a few firms currently dominating, there is no question that banks need to ensure they are getting more for their money.

Prior to the financial crisis, lower capital requirements meant trade volumes were booming and the markets were awash with cheap data. This is a far cry from today, with more stringent capital rules putting a real squeeze on risk taking. As a consequence, trade volumes are down, and a reduction in income from trading activity has forced exchanges to develop additional revenue streams. At the same time, regulation, among other things, is requiring firms to consume broader data sets more frequently.

From fancy colocation servers for high-speed traders, to trendy trading software to stamp out market abuse, exchanges go way beyond being a place for companies to raise money by listing securities. However, new services do not change the fact that a continued tightening of belts means financial institutions are not going to jump at the idea of paying excessive data fees unless the exchanges can demonstrate they are adding value.

Despite all the developments and sophistication of the data sets and the technology used to publish them, it remains an indisputable fact that the information distributed by the exchanges is never immediately fit for use. It still needs to be checked, repaired, validated and organised before it can contribute to analysis or reporting by the various teams that use it, such as traders and risk managers.

And herein lies the problem – all these operational data niggles feed into the overall cost. But even if there is an enforced cap on market data prices from exchanges, or even if banks ‘pool’ their data collectively and make it available for less, this still won’t make a drastic difference.

The only way that investment firms can plan for and manage data costs from exchanges is to underpin their market data strategy with an infrastructure that makes data operations as efficient as possible, and to oversee it with data governance policies that ensure the data is being used judiciously. From the perspective of an exchange, members continue to demand access not just to pricing, but also to reference and corporate actions data from listed firms, regardless of the cost.

The trouble is that in the modern world of equity trading, members are increasingly dependent on small basis point movements and instantaneous reaction. As such, there has never been a greater need for exchanges to provide accurate, all-encompassing market data. After all, if the price of the data goes up, so too should the quality.

Moving forward, in an attempt to make better informed decisions, analysts and traders are constantly seeking alternative data sets. And there is no question that better quality data is at the heart of the LSE’s interest in Refinitiv. Regardless of the other drivers behind this potential acquisition and others before it, the reality is that a smaller group of very dominant exchanges has greater data firepower at its disposal. With this in mind, financial institutions need to remember one thing – data is a premium staple rather than a commodity. It comes at a price, and the more value that technology is seen to be able to derive from it, the greater the premium that exchanges will be ‘justified’ in charging for it.
