
A-Team Insight Blogs

The IMF Stress Testing Paper and Tangible Proof

A lot of rhetoric has been bandied about of late with regard to the paramount importance of data quality to the risk management function. While these discussions are important in raising data quality’s profile and keeping the issue on the agenda, they don’t provide tangible evidence of the impact that poor data quality actually has on risk modelling, which is where the International Monetary Fund’s (IMF) recent research paper comes into play…

Handily entitled “Into the Great Unknown: Stress Testing with Weak Data” (which meant I noticed its potential relevance to the reference data function in the first place), the paper provides examples of how data quality impacts regulatory stress testing, which it describes as the risk tool “du jour” in the post-crisis environment. The paper will, no doubt, serve as yet more ammunition for the regulatory community to use in its overall push to improve the underlying data quality of firms’ regulatory reports.

Although the authors are aiming to highlight the difficulties faced within “lower income” countries, where the data quality issue is deemed to be much more challenging, the findings are equally relevant to all other jurisdictions engaged in stress testing practices. Indeed, in any jurisdiction, stress testing on the basis of weak data could do “more harm than good if the flawed findings cause undue consternation or lead to inappropriate decisions and actions”. The use of hypothetical, but numerical and thus tangible, examples of stress tests carried out on poor quality data highlights this serious potential for error.

The paper demonstrates the impact of having to make assumptions about various macroeconomic variables when the underlying data are unavailable, hence the addition of “ad hoc shocks”. For example, when individual bank data are either not provided or incomplete, “back of the envelope” stress tests are performed on the system as a whole using some form of aggregated data. The findings of such tests should therefore be “interpreted with caution”, note the authors.

“The information derived from shocks to the aggregate system could mask problems among individual banks,” says the paper. “In our example, a shock representing a 400% increase in non-performing loans across the board would result in the system’s capital adequacy ratio declining by 4.7 percentage points to 10.4%.” In this particular example, the overall capital adequacy ratio would then sit 1.6 percentage points below the required minimum of 12%, resulting in the system failing the test overall.

The performance of the individual banks could be obscured completely by the use of this aggregate data. In the example, one bank’s capital adequacy ratio declines by 9.3 percentage points, whereas for another the decline is a more moderate 3.4 percentage points. “Thus, focusing on the aggregate outcome alone could obscure the possibility that a particular institution may be very vulnerable with potentially systemic consequences,” note the authors.
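The arithmetic behind this masking effect is easy to sketch. The snippet below is a minimal illustration, not the IMF’s methodology: the bank names, balance sheet figures and shock mechanics are invented purely to show how an aggregate capital adequacy ratio can pass a hurdle while an individual bank breaches it badly.

```python
# Hypothetical sketch: aggregate vs bank-level stress test on an NPL shock.
# All figures are invented; the shock only loosely mirrors the IMF paper's
# "400% increase in non-performing loans" example.

HURDLE = 0.12      # required minimum capital adequacy ratio (12%)
NPL_SHOCK = 4.0    # non-performing loans increase by 400%

# capital, risk-weighted assets (rwa) and current non-performing loans (npl)
banks = {
    "Bank A": {"capital": 50.0,  "rwa": 400.0,  "npl": 8.0},
    "Bank B": {"capital": 320.0, "rwa": 2000.0, "npl": 10.0},
}

def stressed_car(capital, rwa, npl):
    """CAR after fully provisioning the new NPLs against capital.

    RWA is held constant for simplicity; a fuller model would also
    adjust risk weights.
    """
    loss = npl * NPL_SHOCK
    return (capital - loss) / rwa

# Bank-by-bank view: does any individual institution breach the hurdle?
for name, b in banks.items():
    car = stressed_car(**b)
    print(f"{name}: post-shock CAR {car:.1%} -> {'FAIL' if car < HURDLE else 'pass'}")

# "Back of the envelope" view on summed balance sheets only
total = {k: sum(b[k] for b in banks.values()) for k in ("capital", "rwa", "npl")}
agg = stressed_car(**total)
print(f"System: post-shock CAR {agg:.1%} -> {'FAIL' if agg < HURDLE else 'pass'}")
```

With these made-up numbers the aggregated system clears the 12% hurdle (around 12.4%) even though Bank A falls to roughly 4.5%, which is exactly the kind of bank-level vulnerability the authors warn an aggregate-only test can hide.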

The paper goes on to suggest alternative methods to the traditional stress test for regulators to use where they are unable to ensure a sufficient level of data quality within financial institutions’ reports. Regardless of these proposals, however, the authors do a good job of highlighting the real dangers of forcing regulators to use workarounds due to a lack of data quality. This holds true outside developing economies and helps explain the regulatory community’s recent crusade to improve standardisation across the markets.

Moreover, if this logic holds water on an industry-wide basis, it should also illustrate the dangers for an individual firm’s own stress testing practices. Making assumptions based on aggregate or inaccurate data could result in fairly serious errors of judgement.
