

The IMF Stress Testing Paper and Tangible Proof


A lot of rhetoric has been bandied about of late with regard to the paramount importance of data quality to the risk management function. While these discussions are important in raising data quality’s profile and keeping the issue on the agenda, they don’t provide tangible information on the impact that poor data quality actually has on risk modelling, which is where the International Monetary Fund’s (IMF) recent research paper comes into play…

Handily entitled “Into the Great Unknown: Stress Testing with Weak Data” (which meant I noticed its potential relevance to the reference data function in the first place), the paper provides examples of how data quality impacts regulatory stress testing, which it describes as the risk tool “du jour” in the post-crisis environment. The paper will, no doubt, serve as yet more ammunition for the regulatory community to use in its overall push to improve the underlying data quality of firms’ regulatory reports.

Although the authors are aiming to highlight the difficulties faced within “lower income” countries, where the data quality issue is deemed to be much more challenging, the findings are equally relevant to all other jurisdictions engaged in stress testing practices. Indeed, data quality is a serious issue that could cause “more harm than good if the flawed findings cause undue consternation or lead to inappropriate decisions and actions” in any case. The use of hypothetical, but numerical and thus tangible, examples of stress tests carried out on the basis of poor quality data highlights this serious potential for error.

The paper demonstrates the impact of having to make assumptions about various macroeconomic variables when such data are unavailable, hence the addition of “ad hoc shocks”. For example, when individual bank data are either not provided or incomplete, “back of the envelope” stress tests are performed on the system as a whole using some form of aggregated data. The findings of such tests should therefore be “interpreted with caution”, note the authors.

“The information derived from shocks to the aggregate system could mask problems among individual banks,” says the paper. “In our example, a shock representing a 400% increase in non-performing loans across the board would result in the system’s capital adequacy ratio declining by 4.7 percentage points to 10.4%.” In this particular example, the impact would be that the overall capital adequacy ratio would be deemed to be 1.6 percentage points below the required minimum of 12%, thus resulting in the system failing the test overall.

The performance of the individual banks could be obscured completely by the use of this aggregate data. In the example, one bank’s capital adequacy ratio declines by 9.3 percentage points, whereas for another the decline is a more moderate 3.4 percentage points. “Thus, focusing on the aggregate outcome alone could obscure the possibility that a particular institution may be very vulnerable with potentially systemic consequences,” note the authors.
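To make the aggregation effect concrete, here is a minimal Python sketch of the kind of “back of the envelope” calculation the paper describes. The bank names, balance-sheet figures, full-provisioning assumption and 12% minimum ratio below are hypothetical and are not taken from the IMF paper; they are chosen only to show how a system-wide capital adequacy ratio can stay above the minimum while an individual bank falls well below it.

```python
# Illustrative-only sketch of the aggregation effect described in the text.
# All figures are hypothetical and NOT taken from the IMF working paper.

from dataclasses import dataclass


@dataclass
class Bank:
    name: str
    capital: float   # regulatory capital
    rwa: float       # risk-weighted assets
    npl: float       # current non-performing loans


def car(capital: float, rwa: float) -> float:
    """Capital adequacy ratio, in percent."""
    return 100.0 * capital / rwa


def shocked_car(bank: Bank, npl_increase: float, provisioning: float) -> float:
    """CAR after NPLs rise by `npl_increase` (4.0 = +400%) and the new
    NPLs are provisioned against capital at rate `provisioning`."""
    extra_provisions = bank.npl * npl_increase * provisioning
    return car(bank.capital - extra_provisions, bank.rwa)


banks = [
    Bank("Bank A", capital=140.0, rwa=1000.0, npl=15.0),  # thinly capitalised
    Bank("Bank B", capital=170.0, rwa=1000.0, npl=5.0),
    Bank("Bank C", capital=180.0, rwa=1000.0, npl=6.0),
]

SHOCK = 4.0          # +400% increase in non-performing loans
PROVISIONING = 1.0   # assume new NPLs are fully provisioned
MINIMUM = 12.0       # hypothetical regulatory minimum CAR, in percent

# System-wide "back of the envelope" view, using aggregated data only
agg_capital = sum(b.capital for b in banks)
agg_rwa = sum(b.rwa for b in banks)
agg_extra = sum(b.npl for b in banks) * SHOCK * PROVISIONING
agg_post = car(agg_capital - agg_extra, agg_rwa)
print(f"System-wide CAR after shock: {agg_post:.1f}% "
      f"({'FAIL' if agg_post < MINIMUM else 'pass'} vs {MINIMUM:.0f}% minimum)")

# Bank-by-bank view, which the aggregate figure can mask
for b in banks:
    post = shocked_car(b, SHOCK, PROVISIONING)
    flag = "FAIL" if post < MINIMUM else "pass"
    print(f"{b.name}: {car(b.capital, b.rwa):.1f}% -> {post:.1f}% ({flag})")
```

Run as written, the aggregate ratio stays just above the assumed 12% floor after the shock even though the weakest bank drops to 8%, which is precisely the masking effect the authors warn about.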

The paper goes on to suggest alternative methods to the traditional stress test for regulators to use when they are unable to ensure a sufficient level of data quality in financial institutions’ reports. Regardless of these proposals, however, the authors do a good job of highlighting the real dangers of forcing regulators to rely on workarounds due to a lack of data quality. This holds true outside of developing economies and helps to explain the regulatory community’s recent crusade to improve standardisation across the markets.

Moreover, if this logic holds water on an industry-wide basis, it should also prove illustrative of the dangers for an individual firm’s own stress testing practices. Making assumptions based on aggregate or inaccurate data could result in fairly serious errors of judgement.


Related content

WEBINAR

Recorded Webinar: Best practices for eComms and multi-channel surveillance

Surveillance of multi-channel communications is a moving target as financial institutions continue to add conversational streams, in many cases mobile applications previously banned from the trading environment. With more eComms channels comes more data that must be managed, retained, and ready for regulatory compliance. For many firms, the changing shape of surveillance is a complex...

BLOG

Xceptor’s Integration with Taskize Seeks to Aid Firms as They Grapple with T+1

With the US markets’ adoption of T+1 settlement looming, trade automation platform Xceptor has teamed up with corporate collaboration specialist Taskize to streamline the trade affirmation process. The integration of the Xceptor Confirmations solution with Taskize’s inter-company workflow platform is aimed at helping firms to orchestrate affirm, confirm, and dispute resolution with their counterparties as...

EVENT

AI in Capital Markets Summit London

The AI in Capital Markets Summit will explore current and emerging trends in AI, the potential of Generative AI and LLMs and how AI can be applied for efficiencies and business value across a number of use cases, in the front and back office of financial institutions. The agenda will explore the risks and challenges of adopting AI and the foundational technologies and data management capabilities that underpin successful deployment.

GUIDE

Regulatory Data Handbook 2023 – Eleventh Edition

Welcome to the eleventh edition of A-Team Group’s Regulatory Data Handbook, a popular publication that covers new regulations in capital markets, tracks regulatory change, and provides advice on the data, data management and implementation requirements of more than 30 regulations across UK, European, US and Asia-Pacific capital markets. This edition of the handbook includes new...