About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

The IMF Stress Testing Paper and Tangible Proof

A lot of rhetoric has been bandied about of late with regard to the paramount importance of data quality to the risk management function. While these discussions are important in raising data quality’s profile and keeping the issue on the agenda, they don’t provide tangible information on the impact that poor data quality actually has on risk modelling, which is where the International Monetary Fund’s (IMF) recent research paper comes into play…

Handily entitled “Into the Great Unknown: Stress Testing with Weak Data” (which meant I noticed its potential relevance to the reference data function in the first place), the paper provides examples of how data quality impacts regulatory stress testing, which it describes as the risk tool “du jour” in the post-crisis environment. The paper will, no doubt, serve as yet more ammunition for the regulatory community to use in its overall push to improve the underlying data quality of firms’ regulatory reports.

Although the authors are aiming to highlight the difficulties faced within “lower income” countries, where the data quality issue is deemed to be much more challenging, the findings are equally relevant to all other jurisdictions engaged in stress testing practices. Indeed, data quality is a serious issue that could cause “more harm than good if the flawed findings cause undue consternation or lead to inappropriate decisions and actions” in any case. The use of hypothetical, but numerical and thus tangible, examples of stress tests carried out on the basis of poor quality data highlights this serious potential for error.

The paper demonstrates the impact of having to make assumptions around various macroeconomic variables in the case of such data being unavailable, hence the addition of “ad hoc shocks”. For example, when individual bank data are either not provided or incomplete, “back of the envelope” stress tests are performed on the system as a whole with some form of aggregated data. The findings of such tests should therefore be “interpreted with caution”, note the authors.

“The information derived from shocks to the aggregate system could mask problems among individual banks,” says the paper. “In our example, a shock representing a 400% increase in non-performing loans across the board would result in the system’s capital adequacy ratio declining by 4.7 percentage points to 10.4%.” In this particular example, the impact would be that the overall capital adequacy ratio would be deemed to be 1.6 percentage points below the required minimum of 12%, thus resulting in the system failing the test overall.

The performance of the individual banks could be obscured completely by the use of this aggregate data. In the example, one bank’s capital adequacy ratio declines by 9.3 percentage points, whereas for another the decline is a more moderate 3.4 percentage points. “Thus, focusing on the aggregate outcome alone could obscure the possibility that a particular institution may be very vulnerable with potentially systemic consequences,” note the authors.
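This masking effect is easy to reproduce. The sketch below (in Python, using entirely invented balance-sheet figures rather than the IMF’s own numbers) shows how a “back of the envelope” test on system-wide aggregates can pass while an individual bank falls well short of the minimum:

```python
# A minimal sketch of how aggregation can mask a vulnerable bank.
# All balance-sheet figures are invented for illustration; only the
# shock size (a 400% NPL increase) and the 12% minimum come from the
# paper's example. Capital and asset figures are in arbitrary units.

def car(capital, rwa):
    """Capital adequacy ratio, as a percentage of risk-weighted assets."""
    return 100.0 * capital / rwa

# Hypothetical banks: capital, risk-weighted assets, non-performing loans.
banks = {
    "Bank A": {"capital": 150, "rwa": 1000, "npl": 15},
    "Bank B": {"capital": 200, "rwa": 1000, "npl": 5},
}

NPL_INCREASE = 4.0   # the "400% increase" shock
MINIMUM_CAR = 12.0   # required minimum, as in the paper's example

def shocked_car(bank):
    # Crude assumption: the new NPLs are written off entirely against
    # capital, with no offsetting provisions or risk-weight changes.
    loss = bank["npl"] * NPL_INCREASE
    return car(bank["capital"] - loss, bank["rwa"])

# Aggregate ("back of the envelope") test on system-wide totals.
total_capital = sum(b["capital"] for b in banks.values())
total_rwa = sum(b["rwa"] for b in banks.values())
total_loss = sum(b["npl"] * NPL_INCREASE for b in banks.values())
system_car = car(total_capital - total_loss, total_rwa)

print(f"System: {system_car:.1f}% "
      f"-> {'pass' if system_car >= MINIMUM_CAR else 'fail'}")
for name, bank in banks.items():
    c = shocked_car(bank)
    print(f"{name}: {c:.1f}% -> {'pass' if c >= MINIMUM_CAR else 'fail'}")
```

On these invented numbers the system-wide test passes comfortably at 13.5%, while Bank A ends up at 9.0%, well below the minimum — precisely the masking the authors warn about.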

The paper goes on to suggest alternative methods to the traditional stress test for regulators to use when they are unable to ensure a sufficient level of data quality within financial institutions’ reports. Regardless of these proposals, however, the authors do a good job of highlighting the real dangers of forcing regulators to use workarounds due to a lack of data quality. This holds true outside of developing economies and lends weight to the regulatory community’s recent crusade to improve standardisation across the markets.

Moreover, if this logic holds water on an industry-wide basis, it should also prove illustrative of the dangers for an individual firm’s own stress testing practices. Making assumptions on aggregate or inaccurate data could result in fairly serious errors in judgement.
