The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

IFRS 9 Methodology and Model Management Post Pandemic

By Mahim Mehra, Senior Risk Advisor, and Yoon Sik Ma, VP, Product Manager, AxiomSL.

Complying with the credit-risk analytics and credit-impairment requirements under International Financial Reporting Standard (IFRS) 9 has forced financial institutions to change how they organize, calculate, and examine their credit data. Organizations have had to reconcile data across risk, finance, and accounting functions. Furthermore, reconciling the CFO’s profit/loss perspective with the CRO’s risk/portfolios perspective has demanded that firms articulate and reconcile their data differently.

From a risk perspective, IFRS 9 can make provisioning assessments more sensitive than before because it includes multiple instances under which expected credit losses (ECL) must be assessed over a lifetime horizon or under volatile forward-looking scenarios. ECL relies upon point-in-time (PIT) estimates for default probabilities. This contrasts with the through-the-cycle (TTC) default probabilities used for Basel-driven regulatory capital calculations under the internal ratings-based (IRB) approach.
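The mechanics make the sensitivity concrete: a lifetime ECL sums discounted expected losses over every remaining period, each driven by that period’s PIT default probability, whereas a 12-month ECL uses only the first year’s. A minimal sketch of that structure, where all figures, parameter names, and the flat discount rate are hypothetical illustrations rather than any particular bank’s methodology:

```python
# Minimal ECL sketch: ECL = sum over t of PD_t * LGD * EAD_t, discounted.
# All inputs below are illustrative assumptions, not real portfolio data.

def expected_credit_loss(marginal_pds, lgd, eads, rate):
    """Sum discounted expected losses over the remaining periods."""
    ecl = 0.0
    for t, (pd_t, ead_t) in enumerate(zip(marginal_pds, eads), start=1):
        discount = 1.0 / (1.0 + rate) ** t
        ecl += pd_t * lgd * ead_t * discount
    return ecl

# Stage 1 style (12-month ECL): only the first year's PIT PD counts.
one_year = expected_credit_loss([0.02], lgd=0.45, eads=[1_000_000], rate=0.03)

# Stage 2 style (lifetime ECL): marginal PIT PDs over the full remaining life.
lifetime = expected_credit_loss(
    [0.02, 0.03, 0.025], lgd=0.45, eads=[1_000_000, 900_000, 800_000], rate=0.03
)
```

Because every period’s marginal PD is a PIT estimate, a shift in the forward-looking scenario moves the entire lifetime sum, which is precisely why lifetime assessments react more sharply than 12-month ones.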

Operational Challenges: Fragmented Data Architectures, Credit-Risk Modelling, and Stress Testing Processes

Banks must be able to rely on their data and credit risk models to better inform ECL and risk decision-making and for overall IFRS 9 compliance. However, to implement IFRS 9 effectively, they face a major operational challenge: sourcing and integrating disparate data and models from risk and finance functions that have historically operated in silos. In a typically fragmented data architecture that lacks vertical integration:

  • Data emanates from multiple source systems in different formats and structures, and with different degrees of granularity.
  • Different systems in different silos often address different parts of the same problem differently.

This fragmentation is also evident regarding the array of credit risk models a financial institution uses to generate probability of default (PD), loss given default (LGD), and exposure at default (EAD) curves. These outputs are typically generated using black-box-like modelling systems (home-grown, Excel-based, third-party) that draw upon data from across the fragmented architecture. The various modelling system outputs then feed a calculation system where:

  • Exposures and the PIT PD, LGD, and EAD model outputs are used to calculate ECL, and
  • Scenario analysis/simulations are performed.
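Under IFRS 9 the reported figure is typically a probability-weighted average of scenario-level ECLs, which is the aggregation step those calculation systems perform. A sketch of that final step, where the scenario names, ECL figures, and weights are all hypothetical:

```python
# Probability-weighted ECL across forward-looking macroeconomic scenarios.
# Scenario ECLs and weights below are illustrative assumptions.

scenario_ecls = {"base": 120_000.0, "upside": 80_000.0, "downside": 260_000.0}
scenario_weights = {"base": 0.5, "upside": 0.2, "downside": 0.3}

# Weights must form a probability distribution over the scenarios.
assert abs(sum(scenario_weights.values()) - 1.0) < 1e-9

weighted_ecl = sum(scenario_ecls[s] * scenario_weights[s] for s in scenario_ecls)
```

Even in this toy form, the structure shows why fragmentation hurts: each scenario ECL is itself the product of a full model-and-calculation run, so every reweighting or added scenario multiplies the data traffic between systems.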

Typically, banks maintain multitudes of models. Risk managers in various parts of a firm manage models on a sector basis as they deal with portfolio idiosyncrasies. This leads to model proliferation – creating a model ‘muddle’ that heightens operational risks around model execution.

When an organization relies upon fragmented data and modelling processes, performing scenario analysis is yet another challenge. Changes to scenario parameters in the modelling system generate new default probabilities as inputs to ECL. This means that for each scenario, there must be a huge data movement between at least two different systems.

As a further complication, IFRS 9 processes include feeding ECL outputs to general ledger (GL) postings. Yet different systems may perform dashboarding and reporting for internal and other regulatory requirements.

From an operational point of view, this lack of coherence severely taxes dispersed systems and strains the IT function as it struggles to meet its business-as-usual (BAU) service-level agreements (SLAs) for end-to-end IFRS 9 compliance.

Interpreting results

The huge datasets involved, coupled with the need to replicate data to execute multiple scenarios, model outputs, and ECL calculations, mean that IFRS 9 results can be difficult to interpret and defend. This reveals another risk point for financial institutions: the need to attribute the variance of results across scenarios to an underlying root cause.
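One common way to perform such attribution is a step-through (waterfall) decomposition: swap in each new driver one at a time and credit the incremental ECL movement to that driver. A simplified single-exposure sketch, where the driver values are hypothetical and a real portfolio run would iterate over many exposures:

```python
# Waterfall attribution of an ECL movement between two runs: swap in each
# updated driver in turn and attribute the incremental change to that driver.
# All figures are illustrative assumptions.

def ecl(pd, lgd, ead):
    return pd * lgd * ead

old = {"pd": 0.02, "lgd": 0.45, "ead": 1_000_000}
new = {"pd": 0.05, "lgd": 0.50, "ead": 900_000}

attribution = {}
current = dict(old)
for driver in ("pd", "lgd", "ead"):
    before = ecl(**current)
    current[driver] = new[driver]      # apply this driver's update
    attribution[driver] = ecl(**current) - before

# The per-driver effects sum exactly to the total ECL movement.
total_move = ecl(**new) - ecl(**old)
```

Note that this decomposition is order-dependent (applying LGD before PD shifts the split), which is one reason attribution needs stored intermediate results rather than ad hoc reconstruction.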

The difficulty arises from the opacity of the many black-box-like modelling systems and calculation applications often present in large organizations. The work a bank’s financial analysts must do to analyze results and perform attribution is often manually driven, resource intensive, and requires more technical knowledge than might be expected. To obtain even the simplest results, for example, analysts might first need to delve into the system’s data schema and table structures and be capable of writing complex queries. In addition, since systems usually do not store intermediate calculation results, analysts are further hindered in effectively addressing questions from business owners and internal or external auditors.

If banks cannot easily understand how outputs were generated, they may be in a difficult position to withstand audits. Clearly, organizations operating with fragmented architectures and lacking a holistic solution for IFRS 9 data and model management will find compliance challenging even during the best of times. Therefore, it is mission-critical that the IFRS 9 solution enables diverse functional areas to share risk data, models, and stress testing processes so that the bank can reconcile the CFO’s profit/loss perspective with the CRO’s risk/portfolios perspective – and efficiently deliver consistent, transparent results.

Current conditions

With the coronavirus scourge, the plot has thickened. Everything around IFRS 9 data and models has become much more challenging. Normal conditions have ceased. Shocking lockdowns persist. Regulators and governmental bodies globally have responded with various crisis interventions, including stimulus programs, special credit provisions, changes in regulatory reporting frequencies, and extensions of various Basel-related risk reporting deadlines.

Caught in a black-swan vortex, banks have found themselves between the proverbial rock and a hard place. They have had to act with urgency to bolster economic activity, extending credit even as regulatory and PIT-driven ECL models were flummoxed by volatile crisis impacts. They have had to manage the urgent situation while still complying with IFRS 9 and Basel-driven requirements and attempting to drive down costs during a profit crunch.

A tactical approach

As they manage through the current situation, banks are handling the impact of their misfiring PIT credit models and ECL outputs in various ways. As model outputs have become less reliable, banks are interrogating them harder and working to determine the best ways to modify or replace them, asking questions like:

  • How far can plain vanilla risk analytics carry us? When faced with uncertainty of a Covid-19 magnitude, some banks are practicing a bit of model avoidance. This may tide them over in the short term, but it will serve neither ECL management nor IFRS 9 compliance in the longer term.
  • What are the implications of using the expert judgement approach? Instead of modifying a model (to avoid the possibility of increased model risk), using the expert judgement approach to manage what is happening immediately is another tactical option. However, it carries risk that the bank may inadvertently perpetuate a short-term perspective.
  • How best should we modify models to reflect new risk scenarios? Tactical fixes may be necessary, while recognizing they may actually increase model risk. Would a hybrid tactical/strategic approach serve banks better?
  • How can we better calibrate our models in a fast-moving environment? When the environment changes rapidly, assumptions based on macro data are by definition already behind the curve. PIT credit model inputs, such as inherently lagging GDP data, have proved useless during this crisis.
  • Could equity prices serve as a new source of high-frequency data inputs to PIT models? A high-frequency data source like equity prices could deliver much better point-in-time information. However, to validate such an input would require significant back testing.
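The equity-price idea echoes Merton-style structural models, where a firm’s asset value and volatility (proxied from its equity) determine a distance-to-default that maps to a PIT default probability. A heavily simplified sketch of that mapping, where all inputs are hypothetical and a production model would solve for asset value and volatility from observed equity rather than taking them as given:

```python
import math

def merton_pd(asset_value, debt, asset_vol, drift, horizon=1.0):
    """Distance-to-default and PD under a simplified Merton structural model.

    Inputs are illustrative: in practice asset value and volatility are
    backed out from equity prices and equity volatility, not observed.
    """
    dd = (
        math.log(asset_value / debt)
        + (drift - 0.5 * asset_vol ** 2) * horizon
    ) / (asset_vol * math.sqrt(horizon))
    # Standard normal CDF via erf; PD is the probability assets end below debt.
    pd = 0.5 * (1.0 + math.erf(-dd / math.sqrt(2.0)))
    return dd, pd

dd, pd = merton_pd(asset_value=120.0, debt=100.0, asset_vol=0.25, drift=0.05)
```

Because equity prices update intraday, such a model reacts far faster than GDP-driven inputs, which is exactly the appeal; the back-testing burden mentioned above is the price of that responsiveness.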

Paying the piper

For the time being, tactical moves perhaps are keeping at least some of the challenges at bay. But this is a strategic juncture and firms need to make a change. To successfully manage through the ongoing crisis situation, and enter the post-pandemic period on solid footing, banks should take a three-pronged approach:

  • Continue with tactical and strategic approaches for recalibrating models to account for the prevailing economic climate
  • Implement an industrial-strength IFRS 9 ECL calculation and reporting solution
  • Adopt and embed a strategic framework to manage data and credit risk models that will serve long-term objectives for IFRS 9 and Basel-related regulatory requirements.

