The knowledge platform for the financial technology industry

A-Team Insight Blogs

Exchanges Can Justify High Fees – But Only if They are Adding Value


By Neill Vanlint, Head of Global Sales & Client Operations, GoldenSource.

Across global financial markets, it is hard to think of a more contentious issue right now than the prices exchanges charge for their trading data. Some high-profile potential acquisitions, including the London Stock Exchange’s $27 billion bid to buy Refinitiv, mean there could soon be more market data monopolies than ever before. Let’s face it, regardless of the provider, market data has never exactly been cheap, but prices for traders keep climbing. And with just a few firms currently dominating, there is no question that banks need to ensure they are getting more for their money.

Prior to the financial crisis, lower capital requirements meant trade volumes were booming and the markets were awash with cheap data. This is a far cry from today, with more stringent capital rules putting a real squeeze on risk taking. As a consequence, trade volumes are down, and a reduction in income from trading activity has forced exchanges to develop additional revenue streams. At the same time, regulation, among other things, is requiring firms to consume broader data sets more frequently.

From fancy colocation servers for high-speed traders, to trendy surveillance software to stamp out market abuse, exchanges go way beyond being a place for companies to raise money by listing securities. However, new services do not change the fact that, with belts continuing to tighten, financial institutions are not going to jump at the idea of paying excessive data fees unless the exchanges can demonstrate they are adding value.

Despite all the developments and sophistication of the data sets and the technology used to publish them, it remains an indisputable fact that the information distributed from the exchanges is never immediately fit for use. It still needs to be checked, repaired, validated and organised before it can contribute to analysis or reporting by the various teams that use it, such as traders and risk managers.

And herein lies the problem – all these operational data niggles feed into the overall cost. But even if there is an enforced cap on market data prices from exchanges, or even if banks ‘pool’ their data collectively and make it available for less, this still won’t make a drastic difference.

The only way that investment firms can plan for and manage data costs from exchanges is to underpin their market data strategy with an infrastructure that makes data operations as efficient as possible, and to oversee it with data governance policies that ensure the data is used judiciously. From the perspective of an exchange, members continue to demand access not just to pricing data, but also to reference and corporate actions data for listed firms, regardless of the cost.

The trouble is that in the modern world of equity trading, members are increasingly dependent on small basis point movements and instantaneous reaction. As such, there has never been a greater need for exchanges to provide accurate, all-encompassing market data. After all, if the price of the data goes up, so too should the quality.

Moving forward, in an attempt to make better-informed decisions, analysts and traders are constantly seeking out alternative data sets. And there is no question that better quality data is at the heart of the LSE’s interest in Refinitiv. Regardless of the other drivers behind this potential acquisition and others before it, the reality is that a smaller group of very dominant exchanges now has greater data firepower at its disposal. With this in mind, financial institutions need to remember one thing: data is a premium staple rather than a commodity. It comes at a price, and the more value that technology is seen to derive from it, the greater the premium exchanges will feel ‘justified’ in charging for it.
