About A-Team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

US Regulatory Agencies Publish Op Risk Guidance Highlighting Importance of Data Inputs, Governance and Validation

The Office of the Comptroller of the Currency (OCC), together with the Board of Governors of the Federal Reserve System, the Federal Deposit Insurance Corporation (FDIC) and the Office of Thrift Supervision (OTS), has published interagency guidance for operational risk as part of the US adoption of the advanced measurement approach (AMA) for risk-based capital requirements. The guidance includes comprehensive data input details along with data governance and validation recommendations. The agencies indicate that they have focused on the operational risk data input aspects of the guidelines due to the need for “a credible, transparent, systematic, and verifiable approach for weighting” these inputs.

The interagency guidance therefore discusses common implementation issues, challenges and key considerations for addressing the new AMA framework. To this end, it focuses on the combination and use of the four required AMA data elements: internal operational loss event data, external operational loss event data, business environment and internal control factors, and scenario analysis. The guidance places particular stress on the loss data elements of this mix (in keeping with the current regulatory focus on data quality), and the regulators provide key metrics for data governance and validation.

As noted in the guidance paper: “The advanced approaches rule requires that a bank establish an operational risk management function (ORMF) that is responsible for the design, implementation, and oversight of the bank’s AMA framework, including operational risk data and assessment systems and operational risk quantification systems and related processes. The ORMF must be independent of business line management. The ORMF also should have an organisational stature commensurate with the bank’s operational risk profile.”

The internal and external data required under the guidance include gross operational loss amounts, dates, recoveries and relevant causal information for operational loss events occurring at the bank and, in the case of external data, outside the bank, all of which must be available for a period of at least five years. Given the granular nature of some of this data, the upshot is that firms need to strengthen the underlying data management systems supporting their risk functions so that this data can be stored and reported in a consumable format. The paper also warns that firms that fail to provide this data will receive special attention: “The agencies will scrutinise cases in which a bank excludes internal data from the estimation of operational risk severity, particularly the exclusion of tail events.”
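To make the data requirement concrete, the fields listed above can be sketched as a simple loss event record with a check on the five-year minimum observation period. This is an illustrative sketch only; the field names, the `source` flag and the retention helper are assumptions for the example, not structures prescribed by the guidance.

```python
from dataclasses import dataclass
from datetime import date, timedelta

# Hypothetical record capturing the data elements the guidance requires per
# operational loss event: gross loss amount, dates, recoveries, causal
# information, and whether the event was internal or external to the bank.
@dataclass
class OperationalLossEvent:
    event_id: str
    occurrence_date: date
    gross_loss_amount: float
    recoveries: float
    causal_description: str
    source: str  # "internal" or "external"

    @property
    def net_loss(self) -> float:
        # Net loss after recoveries, as commonly used in severity estimation.
        return self.gross_loss_amount - self.recoveries


def meets_retention_window(events: list, as_of: date, years: int = 5) -> bool:
    """Check that stored history spans at least `years` years, reflecting
    the guidance's minimum five-year observation period."""
    if not events:
        return False
    earliest = min(e.occurrence_date for e in events)
    return (as_of - earliest) >= timedelta(days=365 * years)
```

A bank's actual loss database would carry far more attribution (business line, event type, legal entity), but even this minimal shape shows why granular capture and long retention drive the data management upgrades discussed above.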

In terms of external data, the regulators are looking for documentation of why particular data sources have been used and how they are incorporated into the internal risk system as a whole; filtering, validation and scrubbing of this data are required. An audit trail for this external data is therefore particularly important as proof that minimum data governance requirements have been met.

In terms of risk systems benchmarking, validation and testing, the guidance indicates that records of these processes should also be kept in good order so they can be reported on if and when required. The agencies stress internal audit as a key tool in this process.
