About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

US Regulatory Agencies Publish Op Risk Guidance Highlighting Importance of Data Inputs, Governance and Validation


The Office of the Comptroller of the Currency (OCC), along with the Board of Governors of the Federal Reserve System, the Federal Deposit Insurance Corporation (FDIC) and the Office of Thrift Supervision (OTS), has published interagency guidance on operational risk as part of the US adoption of the advanced measurement approach (AMA) for risk-based capital requirements. The guidance includes comprehensive detail on data inputs, together with recommendations on data governance and validation. The agencies indicate that they have focused on the operational risk data input aspects of the guidelines because of the need for “a credible, transparent, systematic, and verifiable approach for weighting” these inputs.

The interagency guidance therefore discusses common implementation issues, challenges and key considerations in addressing the new AMA framework. To this end, it focuses on the combination and use of the four required AMA data elements: internal operational loss event data, external operational loss event data, business environment and internal control factors, and scenario analysis. In keeping with the current regulatory focus on data quality, particular stress is placed on the data elements of this mix, and the regulators provide key metrics for data governance and validation.
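To make the idea of combining the four data elements concrete, the sketch below shows one possible “credible, transparent, systematic, and verifiable” weighting scheme. This is purely illustrative: the element names, weights and the simple weighted-average combination are assumptions of this example, not anything prescribed by the guidance.

```python
# Illustrative sketch only: the weighting scheme and function names here are
# assumptions, not prescribed by the interagency guidance. It shows the kind
# of transparent, verifiable weighting of the four AMA data elements that
# the agencies call for.

AMA_ELEMENTS = (
    "internal_loss_data",
    "external_loss_data",
    "business_environment_internal_controls",
    "scenario_analysis",
)

def combine_elements(estimates: dict, weights: dict) -> float:
    """Weighted combination of per-element exposure estimates.

    All four AMA data elements must be supplied, and the weights must
    sum to 1, so that the combination is transparent and verifiable.
    """
    if set(estimates) != set(AMA_ELEMENTS) or set(weights) != set(AMA_ELEMENTS):
        raise ValueError("all four AMA data elements must be supplied")
    if abs(sum(weights.values()) - 1.0) > 1e-9:
        raise ValueError("weights must sum to 1")
    return sum(estimates[k] * weights[k] for k in AMA_ELEMENTS)
```

In practice a bank's actual combination method would be far richer (and subject to supervisory review); the point of the sketch is only that the weights and inputs are explicit and auditable.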

As noted in the guidance paper: “The advanced approaches rule requires that a bank establish an operational risk management function (ORMF) that is responsible for the design, implementation, and oversight of the bank’s AMA framework, including operational risk data and assessment systems and operational risk quantification systems and related processes. The ORMF must be independent of business line management. The ORMF also should have an organisational stature commensurate with the bank’s operational risk profile.”

The internal and external data required under the guidance include gross operational loss amounts, dates, recoveries and relevant causal information for operational loss events occurring at the bank and, in the case of external data, outside the bank, all of which must be available for a period of at least five years. Given the granular nature of some of this data, the upshot of the guidance is that firms need to bolster their risk-related underlying data management systems in order to store and report this data in a consumable format. The paper also warns that firms failing to provide this data will receive special attention: “The agencies will scrutinise cases in which a bank excludes internal data from the estimation of operational risk severity, particularly the exclusion of tail events.”
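The loss event data elements listed above can be pictured as a simple record structure. The sketch below is an assumption for illustration only: the field and function names are invented here, not taken from the guidance, which does not prescribe a schema.

```python
from dataclasses import dataclass
from datetime import date

# Illustrative sketch only: field names are assumptions, not taken from the
# guidance. It models the data elements the guidance requires banks to
# retain: gross loss amounts, dates, recoveries and causal information.
@dataclass
class OperationalLossEvent:
    event_id: str
    occurrence_date: date      # date the loss event occurred
    gross_loss_amount: float   # gross operational loss amount
    recovery_amount: float     # recoveries against the loss
    causal_description: str    # relevant causal information
    source: str                # "internal" or "external"

    def net_loss(self) -> float:
        """Gross loss net of recoveries."""
        return self.gross_loss_amount - self.recovery_amount

def events_in_window(events: list, cutoff: date) -> list:
    """Return events on or after a retention cutoff date (the guidance
    requires at least five years of history to be available)."""
    return [e for e in events if e.occurrence_date >= cutoff]
```

A firm's actual loss database would carry many more attributes (business line, event type, and so on); the sketch only illustrates that each required element maps naturally onto a stored, queryable field.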

In terms of external data, the regulators are looking for documentation of why certain sources have been used and how they are incorporated into the internal risk system as a whole; filtering, validation and scrubbing of this data is required. An audit trail for this external data is therefore particularly important in proving that minimum data governance requirements have been met.
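One way to see how filtering, validation and scrubbing steps can leave the audit trail the regulators expect is the sketch below. The class and function names are assumptions of this example, not anything the guidance specifies: each processing step on the external data records what it did and how many records it affected.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Callable

# Illustrative sketch only: names and structure are assumptions. It shows how
# each filtering/validation/scrubbing step applied to external loss data can
# be logged to an audit trail, supporting the documentation the regulators
# look for.
@dataclass
class AuditedDataset:
    records: list
    audit_trail: list = field(default_factory=list)

    def apply(self, step_name: str, fn: Callable[[list], list]) -> "AuditedDataset":
        """Apply a processing step and record it in the audit trail."""
        before = len(self.records)
        self.records = fn(self.records)
        self.audit_trail.append({
            "step": step_name,
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "records_in": before,
            "records_out": len(self.records),
        })
        return self
```

For example, a step named `filter_negative_losses` that drops malformed records would appear in the trail with its before/after record counts, giving auditors a verifiable account of how the external data was scrubbed.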

In terms of risk systems benchmarking, validation and testing, the guidance indicates that records of these processes should also be kept in good order for reporting if and when required. The agencies stress internal audits as a key tool in this process.

