The knowledge platform for the financial technology industry

A-Team Insight Blogs

US Regulatory Agencies Publish Op Risk Guidance Highlighting Importance of Data Inputs, Governance and Validation


The Office of the Comptroller of the Currency (OCC), the Board of Governors of the Federal Reserve System, the Federal Deposit Insurance Corporation (FDIC) and the Office of Thrift Supervision (OTS) have published their interagency guidance on operational risk as part of the US adoption of the advanced measurement approach (AMA) for risk-based capital requirements. The guidance includes comprehensive detail on data inputs, along with recommendations on data governance and validation. The agencies indicate that they have focused on the operational risk data input aspects of the guidelines because of the need for “a credible, transparent, systematic, and verifiable approach for weighting” these inputs.

The interagency guidance therefore discusses common implementation issues, challenges and key considerations for addressing the new AMA framework. To this end, it focuses on the combination and use of the four required AMA data elements: internal operational loss event data, external operational loss event data, business environment and internal control factors, and scenario analysis. The guidance places particular stress on the data elements of this mix (in keeping with the current regulatory focus on data quality), and the regulators provide key metrics for data governance and validation.

As noted in the guidance paper: “The advanced approaches rule requires that a bank establish an operational risk management function (ORMF) that is responsible for the design, implementation, and oversight of the bank’s AMA framework, including operational risk data and assessment systems and operational risk quantification systems and related processes. The ORMF must be independent of business line management. The ORMF also should have an organisational stature commensurate with the bank’s operational risk profile.”

The internal and external data required under the guidance includes gross operational loss amounts, dates, recoveries and relevant causal information for operational loss events occurring at the bank and, for external data, outside the bank; all of this data must be available for a period of at least five years. Given the granular nature of some of this data, the upshot of the guidance is that firms need to strengthen the risk-related data management systems underlying their operations in order to store and report this data in a consumable format. The paper also warns that firms that fail to provide this data will receive special attention: “The agencies will scrutinise cases in which a bank excludes internal data from the estimation of operational risk severity, particularly the exclusion of tail events.”
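As an illustration of the record-keeping these requirements imply, the sketch below models a minimal loss event record covering the fields named in the guidance (gross loss, date, recoveries, causal information) and a simple five-year retention check. The `LossEvent` structure and field names are hypothetical, for illustration only, and do not come from the guidance itself.

```python
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class LossEvent:
    """Hypothetical minimal record for an operational loss event."""
    event_date: date
    gross_loss: float      # gross operational loss amount
    recoveries: float      # amounts recovered after the event
    causal_info: str       # causal description of the event
    internal: bool = True  # False for external (e.g. consortium) data

def within_retention(event: LossEvent, as_of: date, years: int = 5) -> bool:
    """Check whether an event falls inside the minimum retention window.

    Uses an approximate 365-day year; calendar-exact logic is omitted
    for brevity.
    """
    cutoff = as_of - timedelta(days=365 * years)
    return event.event_date >= cutoff
```

A firm's actual reporting systems would add event identifiers, business-line mappings and the causal taxonomies needed to make this data consumable for the regulators.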

For external data, the regulators are looking for documentation of why particular data sources have been used and how they are incorporated into the internal risk system as a whole; filtering, validation and scrubbing of this data is required. An audit trail for this external data is therefore particularly important in proving that minimum data governance requirements have been met.
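The filtering-plus-audit-trail pattern described above can be sketched as follows. This is a simplified illustration, not the agencies' method: the record layout (`source`, `gross_loss`, `event_date` keys) and the two validation rules are assumptions, and a real implementation would apply much richer validation and scaling logic.

```python
from datetime import datetime, timezone

def scrub_external_losses(records, min_loss=0.0):
    """Filter external loss records, recording every decision in an audit trail.

    Each record is assumed to be a dict with 'source', 'gross_loss' and
    'event_date' keys. Returns (accepted records, full audit trail).
    """
    accepted, audit_trail = [], []
    for rec in records:
        reasons = []
        if rec.get("gross_loss") is None or rec["gross_loss"] < min_loss:
            reasons.append("missing or below-threshold gross loss")
        if not rec.get("source"):
            reasons.append("unknown source")
        # Every record, accepted or not, gets a timestamped audit entry
        # so that exclusions can be justified later.
        audit_trail.append({
            "record": rec,
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "accepted": not reasons,
            "reasons": reasons,
        })
        if not reasons:
            accepted.append(rec)
    return accepted, audit_trail
```

Keeping the rejected records alongside the accepted ones in the trail is the key design choice here: it is what lets a firm demonstrate to an auditor why each external data point was, or was not, incorporated.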

In terms of benchmarking, validation and testing of risk systems, the guidance indicates that records of these processes should also be kept in good order for reporting if and when required. The agencies stress internal audit as a key tool in this process.
