
A-Team Insight Blogs

Senior Supervisors Group Report Highlights Need for Better Management of Risk Data


The aggregation of risk data remains a challenge within most financial institutions, according to a report released last month by the Senior Supervisors Group (SSG), which comprises 13 regulatory bodies from 10 countries. The report, Observations on Developments in Risk Appetite Frameworks and IT Infrastructure, indicates that although many firms have kicked off projects to tackle risk IT challenges, underlying data management issues continue to hamper firms’ ability to make “forward looking and well informed strategic decisions”.

The report summarises the efforts of two SSG working groups to assess the progress that financial institutions have made in developing risk appetite frameworks and building robust information technology infrastructures. It follows on from a similar exercise in the previous year, which highlighted critical areas of risk management practice that the SSG felt warranted improvement across the financial services industry, including firms’ fragmented risk IT infrastructures, poorly integrated data and inadequate liquidity risk management. The 2010 results indicate that although some progress has been made in the more general risk technology space, data management challenges persist.

Observations in the report are drawn from the SSG’s collective supervisory work undertaken in 2010, which included formal examinations conducted by individual supervisory agencies, meetings with firms’ management and detailed reviews of firms’ remediation plans.

The goal of the 2010 SSG report is to provide guidance and support to systemically important financial institutions around their investment in new technology architectures. To this end, it supports the Financial Stability Board’s (FSB) November report, which urged regulators to ensure these firms “develop and maintain state of the art risk appetite and data aggregation capabilities,” and chimes with the general trend within the regulatory and practitioner communities over recent months to emphasise the importance of data quality. For example, at the September A-Team Insight Exchange conference in London, Mizuho International’s risk management chief operating officer Simon Tweddle explained the paramount importance of risk data quality from both a regulatory reporting and a business perspective.

The report notes that many “multi-year projects to improve IT infrastructure” have been kicked off across the industry and that firms have made some progress in “conceptualising, articulating, and implementing a risk appetite framework,” with some firms requiring much more work to this end than others. However, it remains “unclear” whether enough progress has been made in enhancing the aggregation of underlying risk data so that it can be performed “accurately, comprehensively and quickly” enough to adequately support business decisions.

“Some firms still require days or weeks to accurately and completely aggregate risk exposures; few firms can aggregate data within a single business day,” states the report.
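To make the aggregation problem concrete: what supervisors are asking for boils down to pulling exposure records out of siloed source systems and rolling them up to a firm-wide view on demand. A minimal Python sketch of that roll-up follows; the system names, record layout and figures are invented for illustration and are not drawn from the report.

```python
from collections import defaultdict

# Hypothetical exposure records as they might land from siloed source
# systems; field names and values are illustrative only.
exposures = [
    {"source": "equities_sys", "counterparty": "CPTY-001", "exposure_usd": 12_500_000},
    {"source": "rates_sys",    "counterparty": "CPTY-001", "exposure_usd": 4_200_000},
    {"source": "credit_sys",   "counterparty": "CPTY-002", "exposure_usd": 7_800_000},
]

def aggregate_exposures(records):
    """Roll exposures up per counterparty across all source systems."""
    totals = defaultdict(float)
    for rec in records:
        totals[rec["counterparty"]] += rec["exposure_usd"]
    return dict(totals)

print(aggregate_exposures(exposures))
# {'CPTY-001': 16700000.0, 'CPTY-002': 7800000.0}
```

The roll-up itself is trivial; the days-or-weeks delay the SSG describes comes from getting consistent, reconciled records into a form like this in the first place.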

The firms that have made the most progress in tackling these underlying challenges are (unsurprisingly) those with active C-level engagement in the project and an empowered chief risk officer (CRO). The SSG stresses the vital importance of communication, strong governance, accountability and incentives in ensuring data quality; issues that data managers have long been aware of in this context. The terms used within the report will also be familiar to the data management community: much is made of the need for a “common language” and “metrics” against which to measure these data items, and a “partnership” approach to the challenge of building a comprehensive risk data infrastructure.
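The “common language” and “metrics” the report calls for can be thought of as an agreed data dictionary plus quality measures computed against it. A hedged sketch, again in Python; the element definitions and validation rules here are hypothetical examples, not taken from the SSG’s text.

```python
# An agreed dictionary of risk data elements, each with the validation
# rule the firm has signed up to; both are invented for illustration.
DATA_DICTIONARY = {
    "counterparty": lambda v: isinstance(v, str) and v != "",
    "exposure_usd": lambda v: isinstance(v, (int, float)) and v >= 0,
}

def quality_metrics(records):
    """Per-element data quality: share of records passing the agreed rule."""
    return {
        field: sum(1 for r in records if rule(r.get(field))) / len(records)
        for field, rule in DATA_DICTIONARY.items()
    }

records = [
    {"counterparty": "CPTY-001", "exposure_usd": 1_000_000},
    {"counterparty": "", "exposure_usd": -5},  # fails both rules
]
print(quality_metrics(records))  # {'counterparty': 0.5, 'exposure_usd': 0.5}
```

Metrics like these give the “partnership” between business and IT something measurable to be accountable for.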

In fact, the report appears largely to be a basic ‘how to’ in terms of kicking off a sensible enterprise data management project. It highlights: the need for automation and a move away from manual processes; a strong governance process; and a move from disparate (siloed) IT systems to integrated firm-wide infrastructure. Sound familiar?

So what is the regulatory community looking for? “Firms with highly developed IT infrastructures are able to clearly articulate, document, and communicate internal risk reporting requirements, including specific metrics, data accuracy expectations, element definitions, and timeframes. These requirements also incorporate supervisory expectations and regulatory reporting requirements, such as segmenting financial and risk data on a legal entity basis. The technology planning process has to align both business and IT strategies to ensure that a productive partnership exists and that it values the investments made in financial and human resources to complete the project. We have observed that strategic business expansion at most firms occurs before they have fully incorporated IT requirements, often putting IT implementation plans far behind the business plans and creating volume and data capacity issues when the business or product grows,” says the report.
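The legal entity requirement quoted above is worth unpacking: in practice it means every financial and risk record carries a legal entity tag, so aggregates can be cut along that dimension for supervisors. A brief illustrative sketch follows; the entity codes and risk types are invented, not taken from the report.

```python
from collections import defaultdict

# Hypothetical records tagged with the legal entity that booked them.
records = [
    {"legal_entity": "BANK-UK-LTD", "risk_type": "credit", "exposure_usd": 3_000_000},
    {"legal_entity": "BANK-US-INC", "risk_type": "credit", "exposure_usd": 5_500_000},
    {"legal_entity": "BANK-UK-LTD", "risk_type": "market", "exposure_usd": 1_200_000},
]

def segment_by_entity(recs):
    """Aggregate exposures per legal entity and risk type."""
    out = defaultdict(lambda: defaultdict(float))
    for r in recs:
        out[r["legal_entity"]][r["risk_type"]] += r["exposure_usd"]
    return {entity: dict(by_type) for entity, by_type in out.items()}

print(segment_by_entity(records))
# {'BANK-UK-LTD': {'credit': 3000000.0, 'market': 1200000.0},
#  'BANK-US-INC': {'credit': 5500000.0}}
```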

Granted, the findings aren’t rocket science, but the fact that the regulatory community is pushing this approach should help data managers to continue to ask for investment in these projects. The SSG is aware of the benefits of data management and keen to ensure the C-level is taking all of this advice on board.

Contributors to the report include the Canadian Office of the Superintendent of Financial Institutions, the French Prudential Control Authority, the German Federal Financial Supervisory Authority, the Bank of Italy, the Japanese Financial Services Agency, the Netherlands Bank, the Bank of Spain, the Swiss Financial Market Supervisory Authority, the UK Financial Services Authority (FSA), and four US regulators: the Office of the Comptroller of the Currency, the Securities and Exchange Commission (SEC), the Board of Governors of the Federal Reserve System and the Federal Reserve Bank of New York.
