
The IMF Stress Testing Paper and Tangible Proof


A lot of rhetoric has been bandied about of late with regard to the paramount importance of data quality to the risk management function. While these discussions are important in raising data quality’s profile and keeping the issue on the agenda, they don’t provide tangible information on the impact that poor data quality actually has on risk modelling, which is where the International Monetary Fund’s (IMF) recent research paper comes into play…

Handily entitled “Into the Great Unknown: Stress Testing with Weak Data” (which meant I noticed its potential relevance to the reference data function in the first place), the paper provides examples of how data quality impacts regulatory stress testing, which it describes as the risk tool “du jour” in the post-crisis environment. The paper will, no doubt, serve as yet more ammunition for the regulatory community to use in its overall push to improve the underlying data quality of firms’ regulatory reports.

Although the authors are aiming to highlight the difficulties faced within “lower income” countries, where the data quality issue is deemed to be much more challenging, the findings are equally relevant to all other jurisdictions engaged in stress testing practices. Indeed, data quality is a serious issue that could cause “more harm than good if the flawed findings cause undue consternation or lead to inappropriate decisions and actions” in any case. The use of hypothetical, but numerical and thus tangible, examples of stress tests carried out on the basis of poor quality data highlights this serious potential for error.

The paper demonstrates the impact of having to make assumptions about various macroeconomic variables when such data is unavailable, hence the addition of “ad hoc shocks”. For example, when individual bank data are either not provided or incomplete, “back of the envelope” stress tests are performed on the system as a whole with some form of aggregated data. The findings of such tests should therefore be “interpreted with caution”, note the authors.

“The information derived from shocks to the aggregate system could mask problems among individual banks,” says the paper. “In our example, a shock representing a 400% increase in non-performing loans across the board would result in the system’s capital adequacy ratio declining by 4.7 percentage points to 10.4%.” In this particular example, the impact would be that the overall capital adequacy ratio would be deemed to be 1.6 percentage points below the required minimum of 12%, thus resulting in the system failing the test overall.

The performance of the individual banks could be obscured completely by the use of this aggregate data. In the example, one bank’s capital adequacy ratio declines by 9.3 percentage points, whereas for another the decline is a more moderate 3.4 percentage points. “Thus, focusing on the aggregate outcome alone could obscure the possibility that a particular institution may be very vulnerable with potentially systemic consequences,” note the authors.
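To make that masking effect concrete, here is a minimal sketch rather than the IMF paper’s actual methodology: the balance-sheet figures below are invented, and the shock is crudely modelled by assuming the entire increase in non-performing loans is written off against capital.

```python
# Hypothetical illustration only (not the IMF paper's model): two stylised
# banks, a shock in which non-performing loans (NPLs) rise by 400% and the
# increase is provisioned entirely out of capital, and a comparison of the
# aggregate "back of the envelope" result with the bank-level results.

MIN_CAR = 0.12  # the 12% minimum capital adequacy ratio used in the example

# Invented balance-sheet figures: capital, risk-weighted assets, NPLs
banks = {
    "Bank A": {"capital": 14.0, "rwa": 100.0, "npl": 1.0},
    "Bank B": {"capital": 20.0, "rwa": 100.0, "npl": 0.5},
}

def car(capital, rwa):
    # Capital adequacy ratio: capital as a share of risk-weighted assets
    return capital / rwa

def shocked_car(b, npl_increase=4.0):
    # CAR after NPLs rise by 400%, with the increase written off against capital
    extra_losses = b["npl"] * npl_increase
    return car(b["capital"] - extra_losses, b["rwa"])

# Bank-level outcomes
for name, b in banks.items():
    before, after = car(b["capital"], b["rwa"]), shocked_car(b)
    print(f"{name}:  {before:.1%} -> {after:.1%}  "
          f"{'FAIL' if after < MIN_CAR else 'pass'}")

# The same shock applied to the summed (aggregate) balance sheet
agg = {k: sum(b[k] for b in banks.values()) for k in ("capital", "rwa", "npl")}
before, after = car(agg["capital"], agg["rwa"]), shocked_car(agg)
print(f"System:  {before:.1%} -> {after:.1%}  "
      f"{'FAIL' if after < MIN_CAR else 'pass'}")
```

With these made-up figures the aggregate system still clears the 12% hurdle after the shock, even though one bank falls well below it, which is precisely the distributional detail the authors warn can be lost.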

The paper goes on to suggest alternative methods to the traditional stress test for regulators to use when they are unable to ensure a sufficient level of data quality within financial institutions’ reports. Regardless of these proposals, however, the authors do a good job of highlighting the real dangers of forcing regulators to use workarounds due to a lack of data quality. This holds true outside of developing economies and lends weight to the regulatory community’s recent crusade to improve standardisation across the markets.

Moreover, if this logic holds water on an industry-wide basis, it should also prove illustrative of the dangers for an individual firm’s own stress testing practices. Making assumptions on aggregate or inaccurate data could result in fairly serious errors in judgement.

