Reference Data Rather Than Alcohol?!

Yesterday’s panel session on a reference data utility was surprisingly well attended, given that it was the last session of the first day of the conference and drinks were being served outside on the exhibition floor. In a testament to the importance of the subject in an environment where everybody is waiting for the regulator’s hammer to fall, delegates listened attentively as the panellists debated the practicalities of establishing such a utility for instrument and entity data.

Reprising his usual argument for a “thin” utility, the European Central Bank’s head of external statistics Francis Gross championed the cause alongside EDM Council managing director Mike Atkin. Both discussed a potential role for Swift and the ISO process in the endeavour, as a market-neutral body that could represent its industry demographic. Perhaps next year, as well as having a main session on reference data, Swift will also have a speaker on the panel itself? Given its push in the legal entity identification space with the Bank Identifier Code (BIC), such representation would make sense.

Keen to distance these efforts from the failure of the International Business Entity Identifier (IBEI) several years ago, Atkin contended that a regulatory mandate would remove the need for a business case to convince the industry to adopt entity identification standards.

However, Northern Trust’s senior vice president of operations and technology Kay Vicino warned that such regulation could also pose a threat to the industry if its impacts are not carefully considered beforehand. “Financial institutions really need to get engaged in the process in order to feed back to the regulators and ensure there is some level of clarity around exactly what is required in terms of reporting to the Office of Financial Research, for example,” she said.

Vicino stressed the potential costs of mapping to a whole host of new, regulator-imposed entity and instrument identifiers. “We absolutely do not want to have lots of different reference data utilities to report to and pull data from; there should only be one. If there is more than one, the costs could be unsustainable,” she said.

But this also raises a contentious issue: where should such a utility be based? Would Europe be happy with a US-based utility? What are the legal implications of data transfer across jurisdictions?

Unsurprisingly, therefore, issues such as data privacy and liability cropped up during the debate. Meredith Gibson, Citi director and legal counsel, suggested that data sharing would open up a legislative can of worms if these issues were not dealt with at the outset.

Systemic risk itself, and the seemingly fluid definition of what it constitutes, is another point of concern for the industry, agreed the practitioner panellists. In many minds, the uncertainty surrounding regulation for monitoring systemic risk is tied up with the concept of a utility, and firms are rightly concerned about the future. This adds the structure and operation of such a utility to the long list of regulatory unknowns in the pipeline.

Vicino elaborated on a wish list for such a utility that she would like the regulators to deliver on: “It needs to provide global consistency and clarity. It needs to be pragmatic, so that the industry does not have to do a lot of duplicative work. Regulators also need to engage the industry at an early stage to understand the full impact and requirements.”

SIX Telekurs’ head of marketing Ivo Bieri added that a vendor could potentially be the facilitator of such a utility. Given that DTCC is already rumoured to be in talks with the powers that be about the Office of Financial Research, he may just be right.
