
Consensus is the Hardest Word

If nothing else, the reference data utility panel at last week’s Xtrakter conference again proved how difficult it is to agree on the best way to proceed with data standardisation, even though the conversation has been ongoing for several years. Francis Gross, head of the external statistics division at the European Central Bank (ECB), reiterated the points he has made many times before in favour of the introduction of a “thin” reference data utility, and industry participants voiced their concerns about the idea of regulatory-led change in the area of data standardisation. Plus ça change…

Of course, not everyone was sceptical of the idea. David Berry, executive in charge of market data sourcing and strategy at UBS and member of the Information Providers User Group (IPUG), noted that a utility might help to force vendors to bring down their costs, especially those embedded in the data acquisition process. In his role at IPUG, Berry has been a key lobbyist for putting an end to vendor pricing practices based on charging for end-user licences, and he is seemingly in favour of any kind of leverage that might help in this process.

Berry elaborated on the challenges his own institution faces in dealing with the complexities of the current lack of standardisation in the market, which means the firm has to buy in 16 different vendor feeds. He noted, for example, that vendor messages contain 58 fields but only 47 of these are usually populated: “Surely there is a simpler way to do all of this?”
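
By way of illustration, a minimal sketch of how a firm might measure that kind of field-population gap is below. Everything in it is hypothetical: the field names, the sample records and the population_rate helper are invented for the example rather than drawn from UBS or any vendor feed.

```python
# A toy check of field population across a vendor feed.
# All field names and records here are hypothetical.

def population_rate(records, fields):
    """Fraction of records in which each field carries a non-empty value."""
    total = len(records)
    return {
        f: sum(1 for r in records if r.get(f) not in (None, "")) / total
        for f in fields
    }

FIELDS = ["isin", "issuer", "coupon", "maturity"]  # stand-ins for the 58 message fields
records = [
    {"isin": "XS0000000001", "issuer": "ACME", "coupon": "4.25", "maturity": ""},
    {"isin": "XS0000000002", "issuer": "", "coupon": None, "maturity": "2030-01-15"},
]

rates = population_rate(records, FIELDS)
print(rates)                                   # {'isin': 1.0, 'issuer': 0.5, ...}
print([f for f, r in rates.items() if r < 1])  # fields not reliably populated
```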

UBS has recently been engaged in a data quality assessment exercise to determine the total cost of ownership of its data and to remove any duplication and redundancy in order to bring these costs down. Berry therefore pointed to the need for a ranking or filtering system to support exercises such as these and to find the highest data accuracy at the lowest price point. He contended that a data utility might provide a useful jumping-off point for this process. “We need a regulator to force standards through,” he added.
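
To make that ranking idea concrete, the sketch below scores hypothetical vendor feeds by measured accuracy per unit of annual cost, one plausible way to express “highest data accuracy at the lowest price point” as a single sortable figure. The Feed class, the vendor names and every number in it are assumptions for illustration only.

```python
from dataclasses import dataclass

@dataclass
class Feed:
    name: str
    annual_cost: float  # total cost of ownership: licence fees plus integration
    accuracy: float     # measured share of correct attribute values, 0..1

# Hypothetical feeds with invented costs and accuracy scores.
feeds = [
    Feed("vendor_a", annual_cost=250_000, accuracy=0.992),
    Feed("vendor_b", annual_cost=400_000, accuracy=0.995),
    Feed("vendor_c", annual_cost=120_000, accuracy=0.970),
]

# Rank by accuracy per unit of cost, a single sortable figure of merit.
ranked = sorted(feeds, key=lambda f: f.accuracy / f.annual_cost, reverse=True)
for f in ranked:
    print(f"{f.name}: accuracy {f.accuracy:.3f} at {f.annual_cost:,.0f} per year")
```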

But is this a good enough reason to impose a utility approach on the market? Many of the industry participants in the audience I spoke to weren’t sure it was, and they were concerned about the apparent progress being made with the Office of Financial Research in the US (at this stage it seems the bill may be passed without altering this section, in spite of Senator Shelby’s concerns).

One Reference Data Review reader from the securities services practitioner world noted that imposing common standards for securities reference data will certainly deliver long-term efficiencies, but the sheer extent of the IT costs involved means that payback will likely take 10 to 20 years, with very significant investment required from the industry in the short term. He felt that the creation of a new utility would add to those costs and delays, thus adding to the current barrage of regulatory requirements already on the table.

He indicated that the imposition of standards could be achieved far more quickly and effectively using existing mechanisms, by mandating issuers and vendors to carry standard fields (for example, ISO fields), if necessary in addition to those they support already. “The ECB argument seems to be that the industry is incapable of agreeing common reference data standards, so a new system must be built that imposes such standards,” he noted.
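
The reader’s suggestion is easier to picture with a small sketch. Assuming a simple record format (the field names and sample record below are hypothetical), the mandate would amount to requiring a handful of ISO-defined fields, such as an ISIN under ISO 6166 and a currency code under ISO 4217, to travel alongside whatever proprietary identifiers a vendor already carries.

```python
# Hypothetical record format: the mandated ISO fields simply sit
# alongside the vendor's existing proprietary identifiers.
REQUIRED_ISO_FIELDS = ("isin", "currency")  # e.g. ISO 6166 and ISO 4217

def carries_standard_fields(record):
    """True if the record populates every mandated standard field."""
    return all(record.get(f) for f in REQUIRED_ISO_FIELDS)

record = {
    "vendor_id": "ABC123",   # proprietary identifier, unchanged
    "isin": "US0378331005",  # ISIN per ISO 6166
    "currency": "USD",       # currency code per ISO 4217
}

print(carries_standard_fields(record))  # True
```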

As noted by other industry participants, the utility is seen by some as failing to tackle the real industry challenge of how far reference data is “hard wired” and “baked in” to the investment process. “Most of these institutions will have more than one securities master that would need to be upgraded. Each securities attribute is used to drive downstream business processes, many of which would need to be re-engineered. Adding a new utility would create yet another link in this chain,” noted our reader.

He therefore felt that there is no public evidence that the lack of securities reference data standards has caused issues with transaction reporting or has led to poor investment decisions. “It is a myth that securities data, as delivered by existing methods, is of poor quality. The ECB has not delivered any specific compelling evidence of such issues. The imposition of a new warehouse could create delays in new security set-up that would reduce trade processing STP and increase failed trades,” he contended.

Regardless of these concerns, however, the US reform process is plodding on and the concept of a data utility is coming ever closer to market. It is likely that the politicians have no clue as to the underlying can of worms they are beginning to open and the impact it could potentially have on the market (for good or bad), as well as the scale of the challenge they are taking on. I’d wager they soon will.
