
Should the ECB be Using Existing Data Channels and Standards for a Reference Data Utility?

Ahead of the panel discussion on the subject at next week’s Sibos conference in Hong Kong, the European Central Bank’s (ECB) proposals have provoked yet more feedback from Reference Data Review readers. Most agree that building such a utility from scratch would be a difficult undertaking, and that the central bank should instead look to standards, and even vendors, already in the space to provide a solution for the standardisation of reference data.

Tim May, chairman of Euroclear UK & Ireland, for one, is sceptical that a regulatory-driven, build-from-the-ground-up approach would be worthwhile. “There is a lack of harmonisation across Europe in terms of how ISIN codes are issued, let alone the wider area of reference data, so it would likely be a difficult endeavour to set up a single utility,” he says. “The ECB would likely need new legislation in some countries to use the standards developed by a central utility.”

This approach is certainly something that would require a lot of backing from the regulatory community, which, as noted recently by Reference Data Review, is easier said than done. However, May suggests that the ECB could instead leverage the work that has been done in the market already, “such as the work around corporate actions standards and data sets, such as those maintained by Xtrakter”.

Another reader, who wishes to remain anonymous, agrees with the objective at the heart of the ECB’s proposals, namely the standardisation of core data, but notes that the project should use existing market infrastructure. “I believe this can be done most effectively through extension of existing vendor delivery channels, without any new systems being built, and this type of change will happen if end users demand it,” he explains.

“For example, if the industry could home in on, say, 20 core instrument fields that could and should be standardised, and then tell all the vendors what the required standard format is and that it must be supported, over time the vendors would have to add these new standard fields as additional attributes. Then, over a longer period of time, their proprietary versions of those data fields would become obsolete. This will take years because nobody can afford special projects for this; it would need to be phased in as part of existing change projects. So it is a very long haul,” he continues.
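The phased approach he outlines can be sketched in code. The Python snippet below is a purely illustrative sketch, assuming a hypothetical standard field set and made-up vendor field names: each vendor’s proprietary fields are left in place while the agreed standard names are added as extra attributes, so downstream consumers can migrate at their own pace.

```python
# Illustrative sketch of the phased standardisation described above.
# All field names and vendor mappings are hypothetical, not an actual
# industry standard.

# The agreed core instrument fields (the "20 core fields" idea)
STANDARD_FIELDS = {"isin", "issuer_name", "currency", "maturity_date", "coupon_rate"}

# Per-vendor mappings from proprietary field names to the standard ones
VENDOR_MAPPINGS = {
    "vendor_a": {"ISINCode": "isin", "Issuer": "issuer_name", "Ccy": "currency"},
    "vendor_b": {"isin_cd": "isin", "issuer_long_name": "issuer_name", "curr": "currency"},
}

def add_standard_attributes(record: dict, vendor: str) -> dict:
    """Return the vendor record with standard fields added alongside the
    proprietary ones, which are kept so existing consumers still work and
    can be retired later."""
    mapping = VENDOR_MAPPINGS[vendor]
    extra = {
        std: record[prop]
        for prop, std in mapping.items()
        if prop in record and std in STANDARD_FIELDS
    }
    return {**record, **extra}

# The same instrument from two vendors converges on the standard names
print(add_standard_attributes({"ISINCode": "DE0001135275", "Ccy": "EUR"}, "vendor_a"))
print(add_standard_attributes({"isin_cd": "DE0001135275", "curr": "EUR"}, "vendor_b"))
```

Both output records carry the same standard isin and currency keys regardless of the vendor’s proprietary naming, which is the gradual convergence the reader describes.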

He reckons the recent campaigns and work of industry bodies such as the EDM Council and think tank JWG-IT are incorrect in their assumption that there is a fundamental data quality problem at source. “Over many years at the sharp end of the business (as a line manager accountable for errors) I have not observed such a problem. The real issue is that fields are recorded differently by vendors (and therefore inconsistently) rather than inaccurately. That inconsistency, as opposed to inaccuracy, is almost certainly what is contributing to the regulatory reporting issues that are alluded to. For example, I have never seen a single instance of poor instrument data from vendors causing a trade processing error,” he elaborates.
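The inconsistency-versus-inaccuracy distinction is easy to demonstrate. In the hypothetical Python sketch below, two vendors report the same accurate coupon rate in different formats, so the raw values cannot be compared until they are normalised; the parsing rules are assumptions for illustration only.

```python
import math

def normalise_coupon(raw) -> float:
    """Normalise a coupon rate to a decimal fraction (0.0525 for 5.25%).
    Toy heuristic for illustration: values above 1 are treated as percentages."""
    if isinstance(raw, str):
        raw = float(raw.strip().rstrip("%"))
    return raw / 100 if raw > 1 else raw

vendor_a_coupon = "5.25%"  # percent string
vendor_b_coupon = 0.0525   # decimal fraction

# The raw values disagree even though both are accurate...
assert str(vendor_a_coupon) != str(vendor_b_coupon)
# ...but normalisation makes them directly comparable.
assert math.isclose(normalise_coupon(vendor_a_coupon), normalise_coupon(vendor_b_coupon))
```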

He believes that this problem of inconsistency could be resolved, but is wary of the vested interests of parties that have been vocal about their work in this area thus far. “If managed carefully this approach could enable consistent and clear regulatory reporting, linkage between existing data warehouses and, most importantly, facilitate de-duplication and increased efficiency across the industry. The ECB, EDM Council and JWG-IT (or a combination thereof) are well placed to help channel industry efforts in this area. The drawback is the extent to which their resources in this area are funded by data vendors who have vested interests in keeping things nice and messy,” he says.

So, the upshot from readers seems to be for the ECB-led project to use existing infrastructure and standards already in place, rather than attempting to go it alone. Those involved in development must also be wary of parties joining the process that may not have the industry’s best interests at heart. Readers are certainly already wary of industry bodies bearing gifts…
