A-Team Insight Blogs

US Regulatory Agencies Publish Op Risk Guidance Highlighting Importance of Data Inputs, Governance and Validation

The Office of the Comptroller of the Currency (OCC), the Board of Governors of the Federal Reserve System, the Federal Deposit Insurance Corporation (FDIC) and the Office of Thrift Supervision (OTS) have published interagency guidance on operational risk as part of the US adoption of the advanced measurement approach (AMA) for risk-based capital requirements. The guidance includes comprehensive detail on data inputs, along with recommendations on data governance and validation. The agencies indicate that they have focused on the operational risk data input aspects of the guidelines because of the need for “a credible, transparent, systematic, and verifiable approach for weighting” these inputs.

The interagency guidance therefore discusses common implementation issues, challenges and key considerations for addressing the new AMA framework. To this end, it focuses on the combination and use of the four required AMA data elements: internal operational loss event data, external operational loss event data, business environment and internal control factors, and scenario analysis. Particular stress is placed on the data elements of this mix, in keeping with the current regulatory focus on data quality, and the regulators provide key metrics for data governance and validation.
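
As a purely illustrative aid, the sketch below enumerates the four AMA data elements named in the guidance and shows how a bank might tag a weighting scheme against them. The element names come from the guidance; the weights, class name and the idea of a simple dictionary are assumptions for illustration only, since the guidance requires the weighting approach to be credible, transparent, systematic and verifiable but does not prescribe specific weights.

```python
from enum import Enum


class AMADataElement(Enum):
    """The four AMA data elements named in the interagency guidance."""
    INTERNAL_LOSS_DATA = "internal operational loss event data"
    EXTERNAL_LOSS_DATA = "external operational loss event data"
    BEICF = "business environment and internal control factors"
    SCENARIO_ANALYSIS = "scenario analysis"


# Hypothetical weights for combining the elements in a quantification system;
# the actual approach and values are left to each bank to justify and document.
element_weights = {
    AMADataElement.INTERNAL_LOSS_DATA: 0.5,
    AMADataElement.EXTERNAL_LOSS_DATA: 0.2,
    AMADataElement.BEICF: 0.15,
    AMADataElement.SCENARIO_ANALYSIS: 0.15,
}
assert abs(sum(element_weights.values()) - 1.0) < 1e-9
```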

As noted in the guidance paper: “The advanced approaches rule requires that a bank establish an operational risk management function (ORMF) that is responsible for the design, implementation, and oversight of the bank’s AMA framework, including operational risk data and assessment systems and operational risk quantification systems and related processes. The ORMF must be independent of business line management. The ORMF also should have an organisational stature commensurate with the bank’s operational risk profile.”

The internal and external data required under the guidance include gross operational loss amounts, dates, recoveries and relevant causal information for operational loss events occurring at the bank and, in the case of external data, outside it; all of this data must be available for a period of at least five years. Given the granular nature of some of this data, the upshot of the guidance is that firms need to strengthen the underlying data management systems supporting their risk functions so that this data can be stored and reported in a consumable format. The paper also warns that firms failing to provide this data will receive special attention: “The agencies will scrutinise cases in which a bank excludes internal data from the estimation of operational risk severity, particularly the exclusion of tail events.”
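
To make the storage implication concrete, here is a minimal sketch of a single loss event record carrying the fields the guidance names (gross loss amount, date, recoveries, causal information) plus a simple check that the stored history spans the five-year minimum. The class, field and function names are hypothetical and not taken from the guidance.

```python
from dataclasses import dataclass
from datetime import date, timedelta


@dataclass
class OperationalLossEvent:
    """One internal or external operational loss event record (illustrative)."""
    event_id: str
    occurrence_date: date
    gross_loss_amount: float      # gross loss in the reporting currency
    recoveries: float = 0.0       # amounts subsequently recovered
    causal_information: str = ""  # relevant causal narrative or category
    source: str = "internal"      # "internal" or "external"

    @property
    def net_loss(self) -> float:
        return self.gross_loss_amount - self.recoveries


def covers_minimum_history(events: list[OperationalLossEvent],
                           as_of: date,
                           years: int = 5) -> bool:
    """Check that the stored loss history spans at least the required period."""
    if not events:
        return False
    earliest = min(e.occurrence_date for e in events)
    return earliest <= as_of - timedelta(days=365 * years)
```

In practice such records would sit in a persistent loss event database rather than in memory; the point of the sketch is simply the breadth of fields and the retention horizon that the systems must support.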

In terms of external data, the regulators are looking for documentation of why certain sources of data have been used and how they are incorporated into the internal risk system as a whole; filtering, validation and scrubbing of this data are required. An audit trail for this external data is therefore particularly important in order to demonstrate that minimum data governance requirements have been met.
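
A minimal sketch, assuming a simple in-memory audit log and an external feed of dictionary records, of how filtering and scrubbing might be recorded so that every exclusion can later be evidenced. The function name, record layout and rejection rules are illustrative assumptions, not requirements set out in the guidance.

```python
from datetime import datetime, timezone


def scrub_external_events(events, source_name, audit_log):
    """Filter an external loss data feed and record every decision.

    `events` is an iterable of dicts with at least 'event_id' and
    'gross_loss_amount'; `audit_log` is any list-like sink. Both the
    record layout and the rejection rules here are assumptions.
    """
    accepted = []
    for record in events:
        reason = None
        if record.get("gross_loss_amount") is None:
            reason = "missing gross loss amount"
        elif record["gross_loss_amount"] < 0:
            reason = "negative gross loss amount"

        # Every record, accepted or excluded, leaves an audit trail entry.
        audit_log.append({
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "source": source_name,
            "event_id": record.get("event_id"),
            "action": "excluded" if reason else "accepted",
            "reason": reason,
        })
        if reason is None:
            accepted.append(record)
    return accepted
```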

In terms of risk systems benchmarking, validation and testing, the guidance indicates that records of these processes should also be kept in good order so that they can be reported if and when required. The agencies stress internal audits as a key tool in this process.
