ICFR Meeting Calls for Increased Consistency of Supervisory Data Collection, Global Coordination

The International Centre for Financial Regulation’s (ICFR) recent meeting on macroprudential data scoping highlighted a number of concerns related to the collection of data for the monitoring of systemic risk, not least of which was the need for global coordination among regulators. Attendees at the meeting, which took place earlier this month in London under the Chatham House Rule, noted that there is a strong argument for globally consistent data reporting formats, standard data definitions and a more established frequency of reporting in order to ease the process for financial institutions in terms of “data programming and compliance”.

The ICFR meeting attendees also noted that choices about the data required for systemic risk monitoring should be made using a risk-based methodology, in order to ensure that the most important data series are captured first and that the cost of collection can be justified. Some participants said it might be useful to implement a common accounting or reporting language across regulators, although this would take an unprecedented degree of coordination. Others felt that a ‘one in, one out’ rule should be applied to data collection – that is, once a new method of data collection is adopted, the old one should be abandoned. The process of data collection should be more harmonised, argued these attendees, who stressed the need for comparability among different data items and across regulatory borders. This is essential for tracking the risks posed by systemically important financial institutions that operate across the globe.

To this end, regulators should use existing and new data collection bodies, such as the newly established central counterparties (CCPs) within the OTC markets, to gather these data items. The meeting also focused on determining the correct level of data disclosure to adopt, taking into account privacy concerns and data protection considerations. It was agreed, for instance, that increased data granularity is becoming more important to the regulatory community in the face of the growing complexity of the financial markets. Although some regulators advocate maximum granularity, some attendees indicated that this might create distortions in the market, since close monitoring of one financial instrument incentivises the banking industry to design a ‘new instrument’ that will not be captured by the data. On the subject of disclosure, attendees noted that a project on the optimal disclosure of information on the efficiency of price setting would be useful, as there is little academic work on this topic – for example, to determine how much information can be put into the market before the price mechanism no longer functions.

The issue of the frequency of data collection was also addressed, with the general stance being that the time span of each data collection should be reduced to days or weeks, rather than quarters or years. In addition, it was proposed that the clearing and registration of derivatives would naturally increase standardisation and hence ease supervision. All of this feedback reflects some of the discussions going on within the regulatory community with regard to the introduction of more standard data formats and identifiers in the market, for example the US Office of Financial Research’s (OFR) proposals on legal entity identification. Hopefully, participants at the meeting will be incentivised to join the wider debate on the subject of data standards.
