Consensus is the Hardest Word

If nothing else, the reference data utility panel at last week’s Xtrakter conference again proved how difficult it is to agree on the best way to proceed with data standardisation, even though the conversation on the subject has been running for several years. Francis Gross, head of the external statistics division at the European Central Bank (ECB), reiterated the points he has made many times before in favour of the introduction of a “thin” reference data utility, and industry participants voiced their concerns about the idea of regulatory-led change in the area of data standardisation. Plus ça change…

Of course, not everyone was sceptical of the idea. David Berry, executive in charge of market data sourcing and strategy at UBS and a member of the Information Providers User Group (IPUG), noted that a utility might help to force vendors to bring down their costs, especially those embedded in the data acquisition process. In his role at IPUG, Berry has been a key lobbyist for putting an end to vendor pricing practices based on charging for end user licences, and he is seemingly in favour of any kind of leverage that might help in this process.

Berry elaborated on the challenges his own institution faces in dealing with the complexities of the current lack of standardisation in the market, which means the firm has to buy in 16 different vendor feeds. He noted, for example, that there are 58 fields within messages, but only 47 of these are usually populated: “Surely there is a simpler way to do all of this?”

UBS has recently been engaged in a data quality assessment exercise to determine the total cost of ownership of its data and to remove any duplication and redundancy in order to bring these costs down. Berry therefore indicated the need for a ranking or filtering system to better conduct exercises such as these and to find the highest data accuracy at the lowest price point. He contended that a data utility might be useful in establishing a jumping-off point for this process. “We need a regulator to force standards through,” he added.
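To make the kind of ranking exercise Berry describes a little more concrete, here is a minimal sketch of how a firm might score vendor feeds on accuracy delivered per unit of cost. The feed names, accuracy figures, costs and the scoring heuristic itself are invented for illustration; they are assumptions, not anything Berry or UBS described.

# Illustrative sketch only: rank hypothetical vendor feeds by accuracy per unit of cost.
# All feed names, accuracy scores and annual costs are invented for illustration.
from dataclasses import dataclass

@dataclass
class VendorFeed:
    name: str
    accuracy: float     # observed field-level accuracy, 0.0 to 1.0
    annual_cost: float  # total annual cost of the feed

    @property
    def value_score(self) -> float:
        # Simple heuristic: accuracy delivered per unit of spend.
        return self.accuracy / self.annual_cost

feeds = [
    VendorFeed("feed_a", accuracy=0.97, annual_cost=450_000),
    VendorFeed("feed_b", accuracy=0.92, annual_cost=200_000),
    VendorFeed("feed_c", accuracy=0.99, annual_cost=900_000),
]

# Filter out feeds below a minimum accuracy threshold, then rank the rest by value.
MIN_ACCURACY = 0.95
candidates = [f for f in feeds if f.accuracy >= MIN_ACCURACY]
for feed in sorted(candidates, key=lambda f: f.value_score, reverse=True):
    print(f"{feed.name}: accuracy={feed.accuracy:.2%}, cost={feed.annual_cost:,.0f}")

A real comparison would obviously weight many more dimensions (coverage, timeliness, licensing terms), but the point is that some consistent scoring mechanism is what makes a total-cost-of-ownership comparison across 16 feeds tractable.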

But is this a good enough reason to impose a utility approach on the market? Many of the industry participants in the audience I spoke to weren’t sure it was, and they were concerned about the progress apparently being made with the Office of Financial Research in the US (at this stage it seems the bill may be passed without altering this section, in spite of Shelby’s concerns).

One Reference Data Review reader from the securities services practitioner world noted that imposing common standards for securities reference data will certainly deliver long-term efficiencies, but the sheer extent of the IT costs involved means that payback will likely take 10 to 20 years, with very significant investment required from the industry in the short term. He felt that the creation of a new utility would add to those costs and delays, on top of the barrage of regulatory requirements already on the table.

He indicated that the imposition of standards could be achieved far more quickly and effectively using existing mechanisms, by mandating issuers and vendors to carry standard fields (for example, ISO fields), if necessary in addition to those they already support. “The ECB argument seems to be that the industry is incapable of agreeing common reference data standards, so a new system must be built that imposes such standards,” he noted.

As noted by other industry participants, the utility is seen by some as failing to tackle the real industry challenge of how far reference data is “hard wired” and “baked in” to the investment process. “Most of these institutions will have more than one securities master that would need to be upgraded. Each securities attribute is used to drive downstream business processes, many of which would need to be re-engineered. Adding a new utility would create yet another link in this chain,” noted our reader.

He therefore feels that there is no public evidence that the lack of securities reference data standards has caused issues with transaction reporting or has led to poor investment decisions. “It is a myth that securities data, as delivered by existing methods, is of poor quality. The ECB has not delivered any specific compelling evidence of such issues. The imposition of a new warehouse could create delays in the set-up of new securities that would reduce trade processing STP and increase failed trades,” he contended.

Regardless of these concerns, however, the US reform process is plodding on and the concept of a data utility is coming ever closer to the market. It is likely that the politicians have no clue about the underlying can of worms they are beginning to crack open and the impact it could have on the market (for good or bad), as well as the scale of the challenge they are taking on. I’d wager they will do soon.
