

ICFR Meeting Calls for Increased Consistency of Supervisory Data Collection, Global Coordination


The International Centre for Financial Regulation’s (ICFR) recent meeting on macroprudential data scoping highlighted a number of concerns about the collection of data for monitoring systemic risk, not least the need for global coordination among regulators. Attendees at the meeting, which took place earlier this month in London under the Chatham House Rule, noted there is a strong argument for globally consistent data reporting formats, standard data definitions and a more established frequency of reporting in order to ease the process for financial institutions in terms of “data programming and compliance”.

The ICFR meeting attendees also noted that choices on the data required for systemic risk monitoring should be made using a risk-based methodology, so that the most important data series are captured first and the cost of collection can be justified. Some participants said it might be useful to implement a common accounting or reporting language across regulators, although this would take an unprecedented degree of coordination. Others felt that a ‘one in, one out’ rule should be applied to data collection: once a new method of data collection is adopted, the old one should be abandoned. The process of data collection should be more harmonised, argued these attendees, who stressed the need for comparability among different data items and across regulatory borders. This is essential to tracking the risks posed by systemically important financial institutions that operate across the globe.
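To make the idea of a common data definition concrete, the sketch below shows one minimal way such a harmonised, comparable reporting record could be expressed in code. It is purely illustrative: the field names, types and comparability key are assumptions made for this example, not any regulator’s actual schema.

```python
# Illustrative sketch only: a minimal, harmonised data definition for
# systemic risk reporting. All field names and types are assumptions
# made for this example, not an actual regulatory schema.
from dataclasses import dataclass
from datetime import date


@dataclass(frozen=True)
class ExposureReport:
    reporting_entity: str   # standard entity identifier (e.g. an LEI-style code)
    counterparty: str       # standard identifier for the counterparty
    instrument_class: str   # standardised instrument classification
    notional: float         # size of the exposure
    currency: str           # ISO 4217 currency code
    as_of: date             # reporting date, supporting a common frequency

    def comparability_key(self) -> tuple:
        """Key on which the same exposure can be matched across regulators."""
        return (self.reporting_entity, self.counterparty,
                self.instrument_class, self.as_of)
```

Because every field is defined once and identified consistently, two supervisors receiving the same record can aggregate or reconcile it without bespoke mapping, which is the comparability the attendees called for.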

To this end, regulators should use existing and new data collection bodies, such as the newly established central counterparties (CCPs) in the OTC markets, to gather these data items. The meeting also focused on determining the correct level of data disclosure to adopt, taking into account privacy concerns and data protection considerations. It was agreed, for instance, that increased data granularity is becoming more important to the regulatory community in the face of the growing complexity of the financial markets. While some regulators advocate maximum granularity, some attendees indicated that this might create distortions in the market, since close monitoring of one financial instrument incentivises the banking industry to design a ‘new instrument’ that will not be captured by the data. On the subject of disclosure, attendees noted that a project on the optimal disclosure of information on the efficiency of price setting would be useful, as there is little academic work on this topic, for example on how much information can be put into the market before the price mechanism no longer functions.

The issue of the frequency of data collection was also addressed, with the general stance being that the time span of each data collection should be reduced to days or weeks, rather than quarters or years. In addition, it was proposed that the clearing and registration of derivatives would naturally increase standardisation and hence ease supervision. All of this feedback reflects some of the discussions going on within the regulatory community with regard to the introduction of more standard data formats and identifiers in the market, for example the US Office of Financial Research’s (OFR) proposals on legal entity identification. Hopefully, participants at the meeting will be incentivised to join the wider debate on the subject of data standards.
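As a hedged illustration of what a standard entity identifier involves, the sketch below validates the 20-character code format that was later standardised as the ISO 17442 Legal Entity Identifier (LEI), whose final two characters are ISO 7064 MOD 97-10 check digits. At the time of the meeting the scheme was still at the proposal stage, so this reflects the format as eventually adopted rather than anything agreed at the event.

```python
# Sketch: structural validation of a 20-character LEI-style identifier
# (ISO 17442 format with ISO 7064 MOD 97-10 check digits).
import string

_ALLOWED = set(string.ascii_uppercase + string.digits)


def is_valid_lei(code: str) -> bool:
    """Return True if `code` is a structurally valid 20-character LEI."""
    code = code.strip().upper()
    if len(code) != 20 or not set(code) <= _ALLOWED:
        return False
    # Map letters to two-digit numbers (A=10 ... Z=35), keep digits as-is,
    # and check that the resulting integer is congruent to 1 modulo 97.
    numeric = "".join(str(int(ch, 36)) for ch in code)
    return int(numeric) % 97 == 1


if __name__ == "__main__":
    # A genuine LEI taken from a public register should return True;
    # a malformed code fails the structural checks.
    print(is_valid_lei("NOT-A-REAL-CODE"))  # False
```

A simple check like this is part of what standard identifiers buy the regulatory community: every reporting firm and supervisor can verify and match the same entity code without bilateral agreement.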
