The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

Senior Supervisors Group Report Highlights Need for Better Management of Risk Data


The aggregation of risk data remains a challenge within most financial institutions, according to a report released last month by the Senior Supervisors Group (SSG), which comprises 13 supervisory agencies from 10 countries. The report, Observations on Developments in Risk Appetite Frameworks and IT Infrastructure, indicates that although many firms have kicked off projects to tackle risk IT challenges, underlying data management issues continue to hamper firms’ ability to make “forward looking and well informed strategic decisions”.

The report summarises the efforts of two SSG working groups to assess the progress that financial institutions have made in developing risk appetite frameworks and building robust information technology infrastructures. It follows on from a similar exercise the previous year, which highlighted critical areas of risk management practice that the SSG felt warranted improvement across the financial services industry, including firms’ fragmented risk IT infrastructures, poorly integrated data and inadequate liquidity risk management. The 2010 results indicate that although some progress has been made in the more general risk technology space, data management challenges persist.

Observations in the report are drawn from the SSG’s collective supervisory work undertaken in 2010, which included formal examinations conducted by individual supervisory agencies, meetings with firms’ management and detailed reviews of firms’ remediation plans.

The goal of the 2010 SSG report is to provide guidance and support to systemically important financial institutions around their investment in new technology architectures. To this end, it supports the Financial Stability Board’s (FSB) November report, which urged regulators to ensure these firms “develop and maintain state of the art risk appetite and data aggregation capabilities,” and the general trend within the regulator and practitioner communities over recent months to emphasise the importance of data quality. For example, at the September A-Team Insight Exchange conference in London, Mizuho International’s risk management chief operating officer Simon Tweddle explained the paramount importance of risk data quality from both a regulatory reporting and business perspective.

The report notes that many “multi-year projects to improve IT infrastructure” have been kicked off across the industry and firms have made some progress in “conceptualising, articulating, and implementing a risk appetite framework,” with some firms requiring much more work to this end than others. However, it notes that it is still “unclear” whether enough progress has been made in enhancing the aggregation of underlying risk data, in order for this to be carried out “accurately, comprehensively and quickly” enough to adequately support business decisions.

“Some firms still require days or weeks to accurately and completely aggregate risk exposures; few firms can aggregate data within a single business day,” states the report.

The firms that have made the most progress in tackling these underlying challenges are (unsurprisingly) those with the active engagement of C-level executives in the project and the presence of an empowered chief risk officer (CRO). The SSG stresses the vital importance of communication, strong governance, accountability and incentives in ensuring data quality; issues that data managers have long been aware of in this context. The terms used within the report will also be familiar to the data management community: much is made of the need for a “common language” and “metrics” against which to measure these data items, and a “partnership” approach to the challenge of building a comprehensive risk data infrastructure.

In fact, the report appears largely to be a basic ‘how to’ in terms of kicking off a sensible enterprise data management project. It highlights: the need for automation and a move away from manual processes; a strong governance process; and a move from disparate (siloed) IT systems to integrated firm-wide infrastructure. Sound familiar?

So what is the regulatory community looking for? “Firms with highly developed IT infrastructures are able to clearly articulate, document, and communicate internal risk reporting requirements, including specific metrics, data accuracy expectations, element definitions, and timeframes. These requirements also incorporate supervisory expectations and regulatory reporting requirements, such as segmenting financial and risk data on a legal entity basis. The technology planning process has to align both business and IT strategies to ensure that a productive partnership exists and that it values the investments made in financial and human resources to complete the project. We have observed that strategic business expansion at most firms occurs before they have fully incorporated IT requirements, often putting IT implementation plans far behind the business plans and creating volume and data capacity issues when the business or product grows,” says the report.
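To make the “common language” and legal-entity segmentation ideas above concrete, here is a deliberately toy sketch (all feed names, fields and figures are hypothetical, not drawn from the report): two siloed source systems report exposures in their own formats, and a thin normalisation layer maps both onto one agreed schema before aggregating by legal entity.

```python
from collections import defaultdict

# Hypothetical silo feeds: each system reports exposures in its own format.
credit_system = [
    {"entity": "Bank plc", "exposure_usd": 1_200_000},
    {"entity": "Bank AG", "exposure_usd": 450_000},
]
trading_system = [
    # This silo reports notionals in EUR; a shared FX table is part of
    # the "common language" agreed across the firm.
    {"legal_entity": "Bank plc", "notional_eur": 800_000},
]

EUR_USD = 1.35  # illustrative rate only


def normalise(record):
    """Map each silo's fields onto one agreed schema: (legal entity, USD exposure)."""
    if "exposure_usd" in record:
        return record["entity"], record["exposure_usd"]
    return record["legal_entity"], record["notional_eur"] * EUR_USD


# Aggregate firm-wide exposures on a legal entity basis.
totals = defaultdict(float)
for record in credit_system + trading_system:
    entity, usd = normalise(record)
    totals[entity] += usd

for entity, usd in sorted(totals.items()):
    print(f"{entity}: {usd:,.0f} USD")
```

The point of the sketch is the design choice, not the arithmetic: once every silo is forced through one normalisation step with agreed element definitions, firm-wide aggregation becomes a mechanical fold rather than a days-long manual reconciliation exercise.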

Granted, the findings aren’t rocket science, but the fact that the regulatory community is pushing this approach should help data managers to continue to ask for investment in these projects. The SSG is aware of the benefits of data management and keen to ensure the C-level is taking all of this advice on board.

Contributors to the report include the Canadian Office of the Superintendent of Financial Institutions, the French Prudential Control Authority, the German Federal Financial Supervisory Authority, the Bank of Italy, the Japanese Financial Services Agency, the Netherlands Bank, the Bank of Spain, the Swiss Financial Market Supervisory Authority, the UK Financial Services Authority (FSA), and four US regulators: the Office of the Comptroller of the Currency, the Securities and Exchange Commission (SEC), the Board of Governors of the Federal Reserve System and the Federal Reserve Bank of New York.
