
Unpacking the FCA Wholesale Data Market Study: Insights from Industry Insiders

Market participants were left scratching their heads at the UK Financial Conduct Authority’s (FCA) decision not to intervene over the high cost of data following the February 29 release of its Wholesale Data Market Study. The study was the culmination of a process launched in March last year amid concerns that competition was not working as it should, leading to higher costs for investors, less effective investment decisions and high barriers to entry for new providers entering these markets.

While many observers have complimented the FCA on the depth and detail of the study, the regulator’s decision not to take any action against anti-competitive practices has left industry insiders somewhat nonplussed.

“The FCA devoted substantial resources and effort to this study, posing all the right questions—a notable change from regulators’ frequent habit of asking the wrong ones,” comments Mike Carrodus, CEO and Founder of Substantive Research, a research and analytics provider for the buy side. “Consequently, we now have a regulatory document that acknowledges market dysfunction. However, this dysfunction, according to the FCA’s assessment, does not justify any regulatory intervention.”

Unintended consequences

By way of background, in March 2023, having conducted a trade data review and published its findings report – and following persistent user concerns about how wholesale data markets were working – the FCA launched its comprehensive Wholesale Data Market Study to look at competition in the provision of three separate but inter-linked markets: benchmarks across several asset classes; credit ratings data from credit ratings agencies (CRAs) and their affiliates; and market data vendor (MDV) services.

Despite identifying multiple areas where competition was not working well, the regulator ruled out any significant intervention at this stage, citing concern about potential unintended consequences as its main reason.

“Various factors probably held the FCA back from taking definitive action, one of which was data quality,” says Carrodus. “About 70% of benchmark users indicated they encountered no quality issues, and 90% of credit ratings users had positive or neutral feedback regarding the accuracy and quality of ratings.”

Given the unpredictable nature of how new regulations can impact market structure, the FCA’s reluctance to intervene is perhaps understandable. Mike Powell, CEO of electronic trading solutions specialist Rapid Addition, suggests that it was regulatory change that led to ballooning market data costs in the first place.

“Much of the market data cost issue stems from the unintended consequence of what the regulators thought would improve the market,” he says. “The introduction of competition among exchanges and the formalisation of regulations around MTFs led to the emergence of numerous alternative trading venues targeting the same liquidity pools. More venues equate to more market data, as well as new costs as firms had to connect to those venues to trade. Furthermore, competition between venues created greater opportunities for arbitrage, prompting a significant growth in algo trading, resulting in a greater number of smaller, algo-executed trades, which also massively increased the volume of market data. And when best execution transparency rules came into play, the need for comprehensive data to substantiate best execution claims fuelled the growth even further.”

All of this had a massive impact on data costs, says Powell. “The shift toward electronic trading and the transformation of business models in our industry have led firms to leverage data in new and diverse ways. This hasn’t gone unnoticed by the exchanges, which are increasingly turning into data-centric enterprises. This is a direct outcome of market structure changes that have unexpectedly given rise to a massive data-driven business, sending the costs of market participation through the roof at a time of shrinking commissions, narrower spreads and rising regulatory costs.”

Reasonable commercial basis?

Back in June 2021, the European Securities and Markets Authority (ESMA) published guidelines on the requirements for national competent authorities, trading venues, approved publication arrangements, consolidated tape providers and systematic internalisers to publish market data on a ‘Reasonable Commercial Basis’ (RCB).

These guidelines were intended to control the cost of market data and improve market transparency. However, the general consensus is that they have not been particularly effective. Fees have continued to rise, and pricing is still opaque. The anticipated creation of a consolidated tape – a unified data stream providing information on prices and trading volume – has still not materialised, leaving the issue unresolved.

The emphasis now – and the focus of the FCA’s study – has shifted towards private companies, namely benchmark providers, rating agencies, and market data providers. Some argue, however, that such entities are caught in a bind. Distributing market data, especially from major exchanges, and incorporating it into tradable financial products like ETFs both incur significant licensing costs. MDVs also face substantial expenses in creating global distribution networks that include ticker plants, connectivity, resiliency measures, and final delivery mechanisms.

“Aggregating and transforming raw exchange data from our highly fragmented markets into a product that’s usable by banks, brokers, or asset managers is an incredibly expensive endeavour,” points out Powell. “The fundamental reasons for these steep costs don’t seem to be fully addressed in the FCA’s study. So, when we discuss the barriers that smaller firms face, the question arises: is it because giants like Refinitiv and Bloomberg are monopolising the market, or is it due to their substantial long-term investments in managing and absorbing these costs?”

The need for transparency

One common criticism levelled at MDVs, particularly from the buy side, is the opacity and inconsistency of their pricing, leading to calls for the FCA to do more to ensure an open and level playing field.

“When you examine the pricing of market data providers, it’s clear that it’s not only wildly inconsistent but it also seems opportunistic,” contends Carrodus. “Some degree of pricing flexibility is standard for any firm. But it’s the magnitude of inconsistency that highlights the peculiar state of this market. Currently, pricing discussions are bilateral, hinging on various factors like user numbers, location, and usage, leading to custom pricing models. Unfortunately, the consistency of these vendor negotiations fluctuates—sometimes certain elements are omitted, and at other times, new ones are introduced.”

He continues: “Customers don’t want to feel like they’re in a vacuum. They do their very best in these bilateral negotiations, but they have no understanding of what is going on around them and no comfort level that what’s being told to them is what’s being told to their peers. And that transparency is what they were really looking for some help on. Additionally, there was hope that the FCA, given the extensive work done around this study, might have introduced a novel solution to these issues—a hope that, regrettably, has not been fulfilled.”

Of course, transparency and consistency of pricing sound great in theory. But how achievable are they in the wholesale data market?

“In practice, use cases for data vary massively both between and within financial organisations,” says Powell. “Sure, we could use more transparency, but it’s a complex task to price the consumption of digital data when everyone’s needs differ—especially in machine-readable formats, factoring in the base costs of the supporting infrastructure. For instance, whether you only want data on ten foreign exchange pairs to fuel a treasury platform or complete regional cash equity coverage for an agency desk, the provider still needs to maintain the infrastructure to efficiently deliver that data to you, provide 24/5 support, and ensure product enhancements and updates are in place whenever changes occur at the data source. So, while pricing could indeed be more transparent, one has to account for the myriad of variables involved.”
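By way of illustration, the sketch below shows how a fixed infrastructure cost combined with usage-dependent components can produce very different quotes for different consumption profiles, along the lines Powell describes. All fee levels and multipliers here are hypothetical, invented purely for illustration; they do not reflect any vendor’s actual schedule.

```python
# Illustrative only: a toy pricing model showing how a fixed
# infrastructure base cost plus usage-dependent components can
# produce very different quotes for different consumption profiles.
# All fee figures are hypothetical, not any vendor's actual schedule.

from dataclasses import dataclass

@dataclass
class UsageProfile:
    instruments: int        # number of instruments subscribed
    sites: int              # number of delivery locations
    machine_readable: bool  # feed consumed by applications, not display screens

BASE_INFRA_FEE = 10_000        # hypothetical annual cost of the delivery infrastructure
PER_INSTRUMENT = 12            # hypothetical per-instrument annual fee
PER_SITE = 1_500               # hypothetical per-site connectivity fee
MACHINE_READABLE_UPLIFT = 1.5  # hypothetical multiplier for non-display use

def annual_quote(profile: UsageProfile) -> float:
    """Return a hypothetical annual licence quote for a usage profile."""
    variable = profile.instruments * PER_INSTRUMENT + profile.sites * PER_SITE
    quote = BASE_INFRA_FEE + variable
    if profile.machine_readable:
        quote *= MACHINE_READABLE_UPLIFT
    return quote

# A treasury platform taking ten FX pairs vs. an agency desk taking
# full regional equity coverage: both pay for the same fixed base.
print(annual_quote(UsageProfile(instruments=10, sites=1, machine_readable=True)))
print(annual_quote(UsageProfile(instruments=5_000, sites=3, machine_readable=True)))
```

The same structure also anticipates Lock’s point later in this piece: a consumer taking the feed at 50 sites pays more than one using it in two, even though both draw on the same fixed delivery infrastructure.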

Usage reporting

The conversation about data costs isn’t just about transparency or the simplicity of pricing, but also about the viability of data providers, whose economic interests hinge on their capacity to commercialise data. According to Suzanne Lock, CEO of EOSE Data, an independent team of data specialists, this could be compromised if consumers were granted unlimited rights to how the data is used.

“I understand there’s some frustration from data consumers about possibly paying for data multiple times, especially for different uses,” she says. “But we have to consider how this affects the data providers. Their ability to sell the data is crucial, and it’s at risk if everyone has the right to redistribute or create similar products. That’s why it makes sense for there to be different prices for different uses. A large company using the data across 50 sites should pay more than a small business using it in just a couple of places.”

Lock points out that it’s not only data suppliers, but also data consumers who are less than transparent. “It’s a complex issue,” she says. “Providers are simply trying to ensure that charges reflect actual usage, while customers are often reluctant to reveal the full extent of their data consumption. This issue cuts both ways. Data consumers point out that pricing is obscure, yet their own reporting can be equally ambiguous. The FCA has recognised that having clear definitions of terms is essential. But if you demand such clarity, you must also be ready for the responsibility that comes with it, which includes accurate reporting. If the FCA is going to regulate pricing and the conditions of data sales, it stands to reason they should also oversee the purchasing conditions. Imposing obligations on data providers requires a reciprocal level of transparency and accuracy from the consumers in reporting how they use the data. How can providers manage compliance if consumers don’t report their usage accurately?”

Data licensing

Lock highlights another aspect of data usage from the vendor’s perspective, particularly where the data is being incorporated into tradable instruments.

“In the OTC markets, a lot of pricing is calculated,” she says. “For example, you might have activity in the short end and 2yr, 5yr and 10yr trading, but the curve is filled by interpolation on an indicative basis. If you’re using that data to create a tradeable instrument or an index or an ETF, you have to know where that data is coming from, to provide the necessary traceability. Which means that you have to be reporting that back to the data provider, so that they have the opportunity to make sure it’s fit and proper for that purpose. Often the data is not intended to be used that way, so if it is, it needs to be licensed accordingly, so that the necessary governance and control is in place.”
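To make the interpolation point concrete, here is a minimal sketch, assuming hypothetical swap rates at the liquid 2-, 5- and 10-year tenors and simple linear interpolation between them. The derived points are indicative rather than observed, which is precisely why their onward use in a tradable product raises the licensing and traceability questions Lock describes.

```python
# Illustrative only: filling an OTC rates curve by linear interpolation
# between observed tenors. The interpolated points are indicative values,
# not traded prices -- the distinction that matters when such data feeds
# a tradable instrument, index, or ETF.

# Observed (hypothetical) swap rates at liquid tenors, in percent
observed = {2: 4.10, 5: 3.75, 10: 3.60}

def interpolated_rate(tenor: float) -> float:
    """Linearly interpolate a rate for a tenor between observed points."""
    if tenor in observed:
        return observed[tenor]  # a genuinely observed point
    tenors = sorted(observed)
    # Find the bracketing observed tenors and weight between them
    lo = max(t for t in tenors if t < tenor)
    hi = min(t for t in tenors if t > tenor)
    w = (tenor - lo) / (hi - lo)
    return observed[lo] + w * (observed[hi] - observed[lo])

# The 7-year point is indicative: derived, not observed.
print(f"7y (interpolated): {interpolated_rate(7):.3f}%")
print(f"5y (observed):     {interpolated_rate(5):.3f}%")
```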

The issue of how wholesale data is used as a component within indices and benchmarks is a less obvious one, which the FCA hasn’t recognised or addressed, says Tobias Spröhnle, an angel investor and recognised thought leader in indices, benchmarks and data products. “Data providers and exchanges have established licensing structures that are very effective at deterring new competitors from entering the market. The pricing is set at such a high threshold that it becomes exceedingly difficult for newcomers to access the data they need to create new benchmarks and thus foster competition. To be a credible benchmark provider, you need access to the comprehensive index universe, be it FTSE, S&P, or others, and that’s the complex part. What many don’t realise is that to challenge the established index providers, you initially have to utilise their intellectual property. This catch-22 is something that has largely escaped the FCA’s attention,” he says.

ETFs and the illusion of choice

The FCA study estimates that index-based strategies now account for 85% of total UK-managed AuM, reflecting significant growth in recent years, primarily driven by ETFs. And ETF providers face their own set of unique challenges, according to Rudolf Siebel, Managing Director of BVI, the German investment fund industry association.

“In the realm of index providers, choice can often be an illusion,” he says. “While selecting a specific index might seem within everyone’s control, the reality is quite different. Once you’ve committed to an index, like the MSCI World Index, which enjoys widespread usage and brand recognition, switching to a competitor like FTSE or STOXX is no small feat. It’s technically possible, but it entails revising your prospectus and informing all your investors, a process that’s both complex and expensive. You can’t just switch indices annually to sidestep price changes. Therefore, in practice, you find yourself tied to the initial provider you’ve chosen.”

Since ETFs rely heavily on automated trading, index-related expenses constitute a significant portion of their operational costs, says Siebel. “ETF providers are likely to become more discerning in their choice of index providers as they strive to maintain their position as cost-effective investment vehicles. Ultimately though, ETF providers need to be more flexible by decoupling ETF names from specific indices, which will give them more leeway to switch index providers as needed.”

A way forward?

Things are changing, albeit very slowly, says Spröhnle. “It’s encouraging to see the industry evolving in recent years,” he says. “The established benchmark providers have pushed their luck too far – rather than creating new products, they’ve been more focused on developing new licensing models to squeeze greater revenue from the same data points. But the perception of benchmarks as immovable and the high cost associated with switching them is beginning to shift, and new alternatives are emerging. However, we have to acknowledge that this shift in mindset is happening at a glacial pace.”

Siebel notes that as an industry association, BVI is actively campaigning for regulatory changes – including RCB-like measures – that would introduce safeguards and compel the disclosure of pricing lists to promote transparency. “This step alone would foster more effective negotiations over data procurement,” he says. “Currently, the vast disparities in prices are often a reflection of one’s bargaining prowess. Having a standard base price would help mitigate these disparities, bringing some uniformity to the negotiating table. If regulated data providers were required to disclose their pricing and perhaps also offer a report—similar to what exchanges do—on how their pricing aligns with production costs, market participants would be better equipped to assess the fairness of the prices and negotiate more effectively. The introduction of price caps should be a last resort, yet we shouldn’t dismiss it entirely.”

Going forward, Carrodus believes that the FCA study is bound to influence the industry. “It’s probably altered the context of discussions between consumers and vendors,” he says. “It will no doubt act as a catalyst, because people were waiting to see what would be in the report and now they know. The providers are now aware that there is more scrutiny of them than there’s ever been, and there’s a fair amount of explanation of what’s wrong. So it does change the dynamics.”
