The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

ECB’s Utility Has a Place But Not Before Standardisation, Says BarCap’s Nagle

The old adage of not putting the cart before the horse was pertinent this week, during discussions at the Marcus Evans reference data conference on the European Central Bank’s (ECB) proposed reference data utility. The concept has been a topic of significant debate for some time now and panellists including Llew Nagle, who is in charge of global reference data change and data quality management at Barclays Capital, agreed that a utility should not be established before basic reference data standards are agreed upon within the industry.

“The utility has a place in the market but not before the industry has defined the standards on which it should be based,” said Nagle. These sentiments echo some of those discussed by Reference Data Review readers back in September last year. A number of readers indicated that the ECB should look to existing standards and, potentially, data channels and vendors to provide a solution rather than building a whole new utility.

Julia Sutton, Royal Bank of Canada’s (RBC) global head of reference data, echoed Nagle’s statement: “It is difficult to rally around a flag that hasn’t been designed yet. But there is a genuine desire for more standardisation in the market and this is a huge step forward from where we were before.”

Sutton pointed to the recent work of the Customer Data Management Group (CDMG) as an example of data collaboration in action, albeit at a high level. “Collaboration is key for the future of our industry and we need to stop being myopic about focusing only on our internal data management systems. It is our problem in the end and we need to be part of the solution.”

Other panellists, however, were not so sure that the industry is capable of coming up with the standards on its own. Nagle’s colleague David Thomas, vice president in charge of client data at BarCap, expressed concern that the industry may need to follow rather than lead in the standardisation charge. Olivier Rose, head of projects and international market data management at Société Générale Securities Services (SGSS), agreed: “If we look at ISIN codes, for example, the industry has been unable to agree on their implementation globally and this can be seen as proof that the industry is unable to impose standards itself.”

Thomas suggested that it would be useful to see how the regulators themselves tackle the issue of data tracking internally to learn from their experience. “It would definitely need to be simple at the outset and deal with a basic level of data. Standards would need to be defined for the minimum number of fields, maybe two or three such as unique identifiers for clients and address formats,” he added.

Audience members suggested that, rather than a regulatory driven approach, a new standards body should be established to lead these efforts, with a strong leadership figure enlisted from the practitioner community to head it. However, when asked who would be appropriate for such a job, names were not forthcoming, although it was agreed that the role would need to become someone’s day job.

Regardless of how the industry gets there, once these standards have been decided upon, panellists and audience members alike agreed that a “regulatory stick” will be needed to provide the weight behind their implementation.

It’s a good thing then, that Francis Gross, head of the external statistics division of the ECB, was on hand to confirm that the issue has not fallen off the agenda at the European level. Gross indicated that the ECB is “climbing the stairway to action” and is hoping that the discussions in the US led by Federal Reserve governor Daniel Tarullo will help to kick start some of the momentum in Europe.

Gross referred to the inclusion of the proposals to set up a National Institute of Finance (NIF), which would establish and maintain a national repository of financial transaction and entity position data, in a recent Senate bill as proof that a utility model is well on its way. “It is certainly on the agenda in the US but I also realise it will have to fight to be acknowledged along with the long list of other items for reform in the current market,” he added.
