
US Regulatory Agencies Publish Op Risk Guidance Highlighting Importance of Data Inputs, Governance and Validation

The Office of the Comptroller of the Currency (OCC), the Board of Governors of the Federal Reserve System, the Federal Deposit Insurance Corporation (FDIC) and the Office of Thrift Supervision (OTS) have published their interagency guidance for operational risk as part of the US adoption of the advanced measurement approach (AMA) for risk-based capital requirements. The guidance includes comprehensive detail on data inputs along with data governance and validation recommendations. The agencies indicate that they have focused on the operational risk data input aspects of the guidelines due to the need for “a credible, transparent, systematic, and verifiable approach for weighting” these inputs.

The interagency guidance therefore discusses common implementation issues, challenges and key considerations for addressing the new AMA framework. To this end, it focuses on the combination and use of the four required AMA data elements: internal operational loss event data, external operational loss event data, business environment and internal control factors, and scenario analysis. Particular emphasis is placed on the data elements of this mix (in keeping with the current regulatory focus on data quality), and the regulators provide key metrics for data governance and validation.
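
By way of illustration only, the short Python sketch below shows how a bank might document a weighting scheme across the four AMA data elements in a transparent and verifiable way. The class, field names and weight values are assumptions made for the example rather than anything mandated by the agencies.

```python
from dataclasses import dataclass

# Illustrative only: the agencies ask for "a credible, transparent, systematic,
# and verifiable approach for weighting" the four AMA data elements. The class,
# field names and weights below are assumptions, not values set by the guidance.

@dataclass(frozen=True)
class ElementWeight:
    element: str     # one of the four AMA data elements
    weight: float    # relative weight used in the quantification system
    rationale: str   # documented justification, supporting transparency

AMA_ELEMENT_WEIGHTS = [
    ElementWeight("internal operational loss event data", 0.40,
                  "primary basis for frequency and severity estimates"),
    ElementWeight("external operational loss event data", 0.20,
                  "supplements limited internal tail-event experience"),
    ElementWeight("business environment and internal control factors", 0.15,
                  "forward-looking adjustment for the control environment"),
    ElementWeight("scenario analysis", 0.25,
                  "expert assessment of severe but plausible losses"),
]

# A systematic, verifiable scheme should be complete and documented.
assert abs(sum(w.weight for w in AMA_ELEMENT_WEIGHTS) - 1.0) < 1e-9
```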

As noted in the guidance paper: “The advanced approaches rule requires that a bank establish an operational risk management function (ORMF) that is responsible for the design, implementation, and oversight of the bank’s AMA framework, including operational risk data and assessment systems and operational risk quantification systems and related processes. The ORMF must be independent of business line management. The ORMF also should have an organisational stature commensurate with the bank’s operational risk profile.”

The internal and external data required under the guidance includes gross operational loss amounts, dates, recoveries and relevant causal information for operational loss events occurring at the bank and, in the case of external data, outside the bank, all of which must be available for a period of at least five years. Given the granular nature of some of this data, the upshot of the guidance is that firms need to strengthen the underlying data management systems supporting their risk functions in order to store and report this data in a consumable format. The paper also warns that firms that fail to provide this data will receive special attention: “The agencies will scrutinise cases in which a bank excludes internal data from the estimation of operational risk severity, particularly the exclusion of tail events.”
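
As a rough sketch of the data points involved, the Python example below models a single loss event record with the fields named in the guidance and checks that the retained history covers at least five years. The field names and record layout are assumptions for illustration, not a schema prescribed by the agencies.

```python
from dataclasses import dataclass
from datetime import date, timedelta

# Rough sketch of a loss event record with the fields named in the guidance
# (gross loss amount, dates, recoveries, causal information) plus a check that
# the retained history covers at least five years. Field names and layout are
# assumptions, not a schema prescribed by the agencies.

@dataclass
class OperationalLossEvent:
    event_id: str
    occurrence_date: date        # when the loss event occurred
    gross_loss_amount: float     # gross operational loss, in reporting currency
    recoveries: float            # amounts subsequently recovered
    causal_information: str      # cause of the event, categorised or narrative
    source: str                  # "internal" or "external"

MINIMUM_HISTORY = timedelta(days=5 * 365)

def covers_minimum_period(events: list[OperationalLossEvent], as_of: date) -> bool:
    """True if the stored loss history spans at least the required five years."""
    if not events:
        return False
    earliest = min(event.occurrence_date for event in events)
    return as_of - earliest >= MINIMUM_HISTORY
```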

In terms of external data, the regulators are looking for documentation of why certain data sources have been used and how they are incorporated into the internal risk system as a whole; filtering, validation and scrubbing of this data is required. An audit trail for this external data is therefore particularly important in proving that minimum data governance requirements have been met.
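
The sketch below shows, purely as an assumed example, how filtering, validation and scrubbing of external loss data might be logged to an audit trail so that each governance step can be evidenced. The step names, rules and record fields are illustrative and are not drawn from the guidance itself.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Purely illustrative pipeline: filtering, validation and scrubbing of external
# loss data, with each governance step written to an audit trail. Step names,
# rules and record fields are assumptions, not requirements from the guidance.

@dataclass
class AuditEntry:
    timestamp: datetime
    step: str      # e.g. "filter", "validate", "scrub"
    detail: str    # what was done and why

@dataclass
class ExternalLossRecord:
    source: str                   # documented external data source
    gross_loss_amount: float
    causal_information: str
    audit_trail: list[AuditEntry] = field(default_factory=list)

def ingest(record: ExternalLossRecord) -> ExternalLossRecord | None:
    """Filter, validate and scrub one external record, logging each step."""
    now = datetime.now(timezone.utc)
    # Filtering: drop records outside the bank's documented relevance criteria.
    if record.gross_loss_amount <= 0:
        record.audit_trail.append(AuditEntry(now, "filter", "rejected: non-positive gross loss"))
        return None
    # Validation: required causal information must be present.
    if not record.causal_information.strip():
        record.audit_trail.append(AuditEntry(now, "validate", "rejected: missing causal information"))
        return None
    # Scrubbing: normalise free text before loading into the internal risk system.
    record.causal_information = record.causal_information.strip()
    record.audit_trail.append(AuditEntry(now, "scrub", "whitespace normalised"))
    return record
```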

In terms of risk system benchmarking, validation and testing, the guidance indicates that records of these processes should also be kept in good order for reporting if and when required. The agencies stress internal audit as a key tool in this process.
