Consensus is the Hardest Word

If nothing else, the reference data utility panel at last week’s Xtrakter conference proved yet again how difficult it is to agree on the best way to proceed with data standardisation, even though the conversation on the subject has been running for several years. Francis Gross, head of the external statistics division at the European Central Bank (ECB), reiterated the points he has made many times before in favour of introducing a “thin” reference data utility, and industry participants voiced their concerns about the idea of regulator-led change in the area of data standardisation. Plus ça change…

Of course, not everyone was sceptical of the idea. David Berry, executive in charge of market data sourcing and strategy at UBS and a member of the Information Providers User Group (IPUG), noted that a utility might help to force vendors to bring down their costs, especially those embedded in the data acquisition process. In his role at IPUG, Berry has been a key lobbyist for ending vendor pricing practices based on charging for end user licences, and he is seemingly in favour of any kind of leverage that might help in this process.

Berry elaborated on the challenges his own institution faces in dealing with the complexities of the current lack of standardisation in the market, which means the firm has to buy in 16 different vendor feeds. He noted, for example, that messages contain 58 fields, but only 47 of these are usually populated: “Surely there is a simpler way to do all of this?”

UBS has recently been engaged in a data quality assessment exercise to determine the total cost of ownership of its data and to remove duplication and redundancy in order to bring these costs down. Berry therefore pointed to the need for a ranking or filtering system to make exercises such as these easier to conduct and to find the highest data accuracy for the lowest price point. He contended that a data utility might be useful in establishing a jumping-off point for this process. “We need a regulator to force standards through,” he added.
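To make the idea of such a ranking or filtering exercise concrete, here is a minimal sketch of how vendor feeds might be scored on accuracy delivered per unit of cost once sparsely populated feeds are filtered out. All feed names, accuracy scores, costs and the field-coverage threshold below are invented for illustration; nothing here reflects UBS’s actual process or data.

```python
# Hypothetical sketch: ranking vendor feeds by accuracy delivered per
# unit of cost, in the spirit of the filtering exercise described above.
# All names and figures are invented for illustration.

from dataclasses import dataclass
from typing import List


@dataclass
class VendorFeed:
    name: str
    accuracy: float        # share of records passing quality checks (0.0-1.0)
    annual_cost: float     # annual licence cost, in arbitrary currency units
    fields_populated: int  # how many of the 58 message fields are filled


def rank_feeds(feeds: List[VendorFeed], min_coverage: int = 47) -> List[VendorFeed]:
    """Drop feeds below the field-coverage threshold, then rank the rest
    by accuracy per unit of cost (higher is better)."""
    usable = [f for f in feeds if f.fields_populated >= min_coverage]
    return sorted(usable, key=lambda f: f.accuracy / f.annual_cost, reverse=True)


if __name__ == "__main__":
    feeds = [
        VendorFeed("Feed A", accuracy=0.99, annual_cost=120_000, fields_populated=47),
        VendorFeed("Feed B", accuracy=0.97, annual_cost=60_000, fields_populated=50),
        VendorFeed("Feed C", accuracy=0.95, annual_cost=40_000, fields_populated=30),
    ]
    for feed in rank_feeds(feeds):
        print(f"{feed.name}: {feed.accuracy / feed.annual_cost:.2e} accuracy per cost unit")
```

In practice the scoring would weigh many more dimensions, such as timeliness, corporate actions coverage and licensing terms, but even a crude ratio like this makes duplication across 16 feeds visible quickly.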

But is this a good enough reason to impose a utility approach on the market? Many of the industry participants in the audience I spoke to weren’t sure it was, and they were concerned about the progress seemingly being made with the Office of Financial Research in the US (at this stage it seems the bill may be passed without altering this section, in spite of Shelby’s concerns).

One Reference Data Review reader from the securities services practitioner world noted that imposing common standards for securities reference data will certainly deliver long-term efficiencies, but the sheer scale of the IT costs involved means payback is likely to take 10 to 20 years, with very significant short-term investment for the industry. He felt that the creation of a new utility would compound those costs and delays, adding to the barrage of regulatory requirements already on the table.

He indicated that the imposition of standards could be achieved far more quickly and effectively using existing mechanisms, by mandating issuers and vendors to carry standard fields (for example, ISO fields), if necessary in addition to those they already support. “The ECB argument seems to be that the industry is incapable of agreeing common reference data standards, so a new system must be built that imposes such standards,” he noted.

As other industry participants have noted, the utility is seen by some as failing to tackle the real industry challenge: the extent to which reference data is “hard wired” and “baked in” to the investment process. “Most of these institutions will have more than one securities master that would need to be upgraded. Each securities attribute is used to drive downstream business processes, many of which would need to be re-engineered. Adding a new utility would create yet another link in this chain,” noted our reader.

He therefore feels there is no public evidence that the lack of securities reference data standards has caused issues with transaction reporting or has led to poor investment decisions. “It is a myth that securities data, as delivered by existing methods, is of poor quality. The ECB has not delivered any specific, compelling evidence of such issues. The imposition of a new warehouse could create delays in new security set-up that would reduce trade processing STP and increase failed trades,” he contended.

Regardless of these concerns, however, the US reform process is plodding on and the concept of a data utility is coming ever closer to the market. It is likely that the politicians have no clue as to the underlying can of worms they are beginning to crack open, the impact it could potentially have on the market (for good or bad), or the scale of the challenge they are taking on. I’d wager they soon will.
