The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

Should the ECB be Using Existing Data Channels and Standards for a Reference Data Utility?


Ahead of the panel discussion on the subject at next week’s Sibos conference in Hong Kong, the European Central Bank’s (ECB) proposals have provoked yet more feedback from Reference Data Review readers. Most agree that starting from scratch in building such a utility would be a difficult endeavour and that the central bank should instead look to standards, and even vendors, already in the space to provide a solution for the standardisation of reference data.

Tim May, chairman of Euroclear UK & Ireland, for one, is sceptical that a regulatory-driven, built-from-the-ground-up approach would be worthwhile. “There is a lack of harmonisation across Europe in terms of how ISIN codes are issued, let alone the wider area of reference data, so it would likely be a difficult endeavour to set up a single utility,” he says. “The ECB would likely need new legislation in some countries to use the standards developed by a central utility.”

This approach is certainly something that would require a lot of backing from the regulatory community, which, as noted recently by Reference Data Review, is easier said than done. However, May suggests that the ECB could instead leverage the work that has been done in the market already, “such as the work around corporate actions standards and data sets, such as those maintained by Xtrakter”.

Another reader, who wishes to remain anonymous, agrees with the objective at the heart of the ECB’s proposals, the standardisation of core data, but notes that the project should use existing infrastructure in the market. “I believe this can be done most effectively through extension of existing vendor delivery channels, without any new systems being built, and this type of change will happen if end users demand it,” he explains.

“For example, if the industry could home in on, say, 20 core instrument fields that could and should be standardised, and then tell all the vendors what the required standard format is and that it must be supported, over time the vendors would have to add these new standard fields as additional attributes. Then, over a longer period of time, their proprietary versions of those data fields would become obsolete. This will take years because nobody can afford special projects for this; it would need to be phased in as part of existing change projects. So it is a very long haul,” he continues.
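The phased migration the reader describes can be sketched in code: a mapping layer that adds the agreed standard fields alongside each vendor’s proprietary ones, so both coexist until the proprietary versions can be retired. The vendor names, field names and standard schema below are purely illustrative assumptions, not any real vendor’s format.

```python
# Hypothetical sketch of the phased standardisation the reader describes:
# vendor records keep their proprietary fields while a mapping layer adds
# the agreed standard fields as additional attributes.

# Per-vendor mapping: proprietary field name -> agreed standard field name
VENDOR_FIELD_MAPS = {
    "vendor_a": {"SecID": "isin", "Ccy": "currency"},
    "vendor_b": {"ISINCode": "isin", "Curr": "currency"},
}

def add_standard_fields(record: dict, vendor: str) -> dict:
    """Return the record with standard fields added alongside proprietary ones."""
    mapping = VENDOR_FIELD_MAPS[vendor]
    enriched = dict(record)  # proprietary fields survive the transition period
    for proprietary, standard in mapping.items():
        if proprietary in record:
            enriched[standard] = record[proprietary]
    return enriched

# Two vendors describing the same instrument under different proprietary names...
rec_a = add_standard_fields({"SecID": "DE0001135275", "Ccy": "EUR"}, "vendor_a")
rec_b = add_standard_fields({"ISINCode": "DE0001135275", "Curr": "EUR"}, "vendor_b")

# ...now agree on the standard fields, whatever the proprietary names were
assert rec_a["isin"] == rec_b["isin"]
assert rec_a["currency"] == rec_b["currency"]
```

Because the standard fields are additive, no consumer breaks on day one; firms can switch over field by field within existing change projects, which is exactly why the reader expects the migration to take years.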

He reckons the recent campaigns and work of industry bodies such as the EDM Council and think tank JWG-IT are incorrect in their assumption that there is a fundamental data quality problem at source. “Over many years at the sharp end of the business (as a line manager accountable for errors) I have not observed such a problem. The real issue is that fields are recorded differently by vendors (and therefore inconsistently) rather than inaccurately. That inconsistency, as opposed to inaccuracy, is almost certainly what is contributing towards the regulatory reporting issues that are alluded to. For example, I have never seen a single example of poor instrument data from vendors that has caused a trade processing error,” he elaborates.
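The reader’s distinction between inconsistency and inaccuracy is easy to illustrate: two vendors can both record a maturity date correctly yet in different conventions, so a naive comparison in a regulatory report flags a mismatch where none exists. The vendor conventions below are hypothetical.

```python
# Illustrative sketch: the same accurate maturity date, recorded under two
# hypothetical vendor conventions, is inconsistent rather than wrong.
from datetime import datetime

vendor_a_date = "20270615"    # assumed vendor A convention: YYYYMMDD
vendor_b_date = "15/06/2027"  # assumed vendor B convention: DD/MM/YYYY

# The raw strings disagree even though both values are accurate...
assert vendor_a_date != vendor_b_date

# ...and normalising both to one standard representation resolves it
parsed_a = datetime.strptime(vendor_a_date, "%Y%m%d").date()
parsed_b = datetime.strptime(vendor_b_date, "%d/%m/%Y").date()
assert parsed_a == parsed_b
```

This is the reader’s point in miniature: neither vendor is supplying bad data, but without an agreed format the two feeds cannot be reconciled mechanically.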

He believes that this problem of inconsistency could be resolved, but is wary of the vested interests of parties that have been vocal about their work in this area thus far. “If managed carefully this approach could enable consistent and clear regulatory reporting, linkage between existing data warehouses and, most importantly, facilitate de-duplication and increased efficiency across the industry. The ECB, EDM Council and JWG-IT (or a combination thereof) are well placed to help channel industry efforts in this area. The drawback is the extent to which their resources in this area are funded by data vendors who have vested interests in keeping things nice and messy,” he says.

So, the upshot from readers seems to be that the ECB-led project should use the infrastructure and standards already in place, rather than attempting to go it alone. Those involved in development must also be wary of parties getting involved in the process that may not have the industry’s best interests at heart. Readers are certainly already wary of industry bodies bearing gifts…
