The knowledge platform for the financial technology industry

A-Team Insight Blogs

Consensus is the Hardest Word


If nothing else, the reference data utility panel at last week’s Xtrakter conference again proved how difficult it is to agree on the best way to proceed with data standardisation, even though the conversation has been ongoing for several years. Francis Gross, head of the external statistics division at the European Central Bank (ECB), reiterated the points he has made many times before in favour of the introduction of a “thin” reference data utility, while industry participants voiced their concerns about the idea of regulatory-led change in the area of data standardisation. Plus ça change…

Of course, not everyone was sceptical of the idea. David Berry, executive in charge of market data sourcing and strategy at UBS and a member of the Information Providers User Group (IPUG), noted that a utility might help to force vendors to bring down their costs, especially those embedded in the data acquisition process. In his role at IPUG, Berry has been a key lobbyist for an end to vendor pricing practices based on charging for end-user licences, and he is seemingly in favour of any kind of leverage that might help in this process.

Berry elaborated on the challenges his own institution faces in dealing with the complexities of the current lack of standardisation in the market, which means the firm has to buy in 16 different vendor feeds. He noted, for example, that messages contain 58 fields, but only 47 of these are usually populated: “Surely there is a simpler way to do all of this?”

UBS has recently been engaged in a data quality assessment exercise to determine the total cost of ownership of its data and to remove any duplication and redundancy in order to bring these costs down. Berry therefore pointed to the need for a ranking or filtering system to better support exercises such as these and to find the highest data accuracy at the lowest price point. He contended that a data utility might be useful as a jumping-off point for this process. “We need a regulator to force standards through,” he added.

But is this a good enough reason to impose a utility approach on the market? Many of the industry participants I spoke to in the audience weren’t sure it was, and were concerned about the apparent progress of the Office of Financial Research in the US (at this stage it seems the bill may be passed without altering this section, in spite of Shelby’s concerns).

One Reference Data Review reader from the securities services practitioner world noted that imposing common standards for securities reference data would certainly deliver long-term efficiencies, but the sheer extent of the IT costs involved means that payback would likely take 10 to 20 years, with very significant investment required from the industry in the short term. He felt that the creation of a new utility would add to those costs and delays, compounding the current barrage of regulatory requirements on the table.

He indicated that the imposition of standards could be achieved far more quickly and effectively using existing mechanisms, by mandating issuers and vendors to carry standard fields (for example, ISO fields), if necessary in addition to those they support already. “The ECB argument seems to be that the industry is incapable of agreeing common reference data standards, so a new system must be built to impose such standards,” he noted.

As noted by other industry participants, the utility is seen by some as failing to tackle the real industry challenge of how far reference data is “hard wired” and “baked in” to the investment process. “Most of these institutions will have more than one securities master that would need to be upgraded. Each securities attribute is used to drive a downstream business process, many of which would need to be re-engineered. Adding a new utility would create yet another link in this chain,” noted our reader.

He therefore feels that there is no public evidence that the lack of securities reference data standards has caused issues with transaction reporting or has led to poor investment decisions. “It is a myth that securities data, as delivered by existing methods, is of poor quality. The ECB has not delivered any specific compelling evidence of such issues. The imposition of a new warehouse could create delays in new security setup that would reduce trade processing STP and increase failed trades,” he contended.

Regardless of these concerns, however, the US reform process is plodding on and the concept of a data utility is coming ever closer to the market. It is likely that the politicians have no clue as to the underlying can of worms they are beginning to crack open, the impact it could potentially have on the market (for good or bad), or the scale of the challenge they are taking on. I’d wager they will find out soon.
