
A-Team Insight Blogs

Systemic Risk Regulator Should Resemble the CRO Function and Data Utility is Likely to be Too Costly


The 22 respondents to the recent survey by the Securities Industry and Financial Markets Association (Sifma) believe that, regardless of the exact structure of the systemic risk regulator, its roles and responsibilities for the overall markets should reflect those of a firm’s chief risk officer (CRO). The Sifma member firms, regulators, central counterparties (CCPs) and exchanges involved in the survey also said the data challenges involved in monitoring systemic risk could prove costly, might not require the building of new infrastructure and should not focus on too granular a level of data.

Sifma and the survey respondents are generally in favour of the introduction of a systemic risk regulator, such as the European Systemic Risk Board (ESRB), and willing to engage in more frequent reporting of potentially systemically risky data to such a body during times of stress. Tim Ryan, Sifma president and CEO, explains: “Sifma strongly supports the creation of a tough, competent systemic risk regulator to oversee systemically important firms so that the activities of one or a few firms will not threaten the stability of the entire financial system.” However, how this function is performed is not so easy to determine.

The overall goal of the Sifma paper is therefore to identify the full remit of potential systemic risk information requirements for such a body to function, and to examine the various methods of gathering this data. To this end, the survey highlights the serious data issues at the heart of any attempt to monitor systemic risk at a global level, but rather than directly recommending a utility approach, as mooted by the European Central Bank (ECB) and included in the US reform bill, it suggests that existing infrastructure could be adapted for data collection purposes.

“Where possible, systemic risk regulation can be more effective by drawing on resources which already exist in the system, either in current regulatory filings or in firms’ own risk and information systems and leveraging infrastructure and repositories of data,” states the paper. Rather than building from scratch, the regulator should therefore aim to improve “data standards, consistency and accuracy” of the information it aggregates by focusing on metrics such as “quality and timeliness”.

The survey covers eight potential systemic risk information gathering approaches, all of which are described as “costly” by respondents. These comprise: an enterprise-wide stress test-based approach; reverse stress test-based approach; aggregated risk reporting templates; risk sensitivities; trade repositories; repositories and key industry utilities; concentration reporting analysis; and the establishment of a regulator data warehouse. The latter represents something like the US Office of Financial Research or the ECB’s proposed thin utility.

Sifma notes that there is no “single ideal approach” and that a mix and match strategy for gathering this data would probably be the best option. “Some are better at understanding certain drivers of systemic risks, while others are easier to aggregate across firms and products,” it suggests.

The Bank for International Settlements (BIS) has also recently examined the issue of systemic risk data collection and analysis. Its findings are echoed by the Sifma survey, which likewise elaborates on the need to collect more data in order to better monitor risk across the industry and fill the “information gaps” currently in existence. However, the Sifma survey indicates that such an endeavour should not focus on too granular a level of data detail. “Emphasis on processing too much information at too granular of a level may ultimately hinder the ability of a systemic risk regulator to focus on the relevant build up of systemic risks (hot spots); and result in missing the forest for the trees,” it states.

Survey respondents were concerned that a focus on granular position level data may cause the systemic risk regulator to be looking at the wrong “altitude” of information, and thus miss the bigger picture. They believe the current regulatory focus is “micro-prudential” rather than “macro-prudential” in this respect and this is what a systemically focused regulator must address.

“Current reporting metrics focus on the soundness of individual firms and do not effectively capture all of the drivers of broader sources of risk, but they do capture parts of the information needed to monitor systemic risk,” notes the paper. According to respondents, major areas of information that need to be improved include: firm size, interconnectedness, liquidity, concentration, correlation, tight coupling, herding behaviour, crowded trades and leverage.

Overall, the eight approaches suggested vary significantly in the resources they would require to operate and in their overall structure, whether bottom up or top down. The enterprise-wide stress test-based approach, for example, relies on firms’ internal models to produce a consistent set of stress tests and would require the regulator to develop specific scenarios for firms to follow. The trade repositories-based approach, by contrast, requires the establishment of third party repository organisations, such as the DTCC’s Trade Information Warehouse, which has been the subject of much debate recently, and the reporting of additional data sets by firms to these bodies.

In terms of the utility approach, the two most relevant approaches are the repositories and key industry utility firms-based approach and the data warehouse-based approach. The former recommends the use of existing infrastructures to gather together industry data and maintain it, whereas the latter involves the establishment of a whole new infrastructure.

Both of these involve costs for the industry as a whole and for individual firms, although the former is likely to involve significantly less effort than the latter. Connections would need to be established between the existing utilities and infrastructures and firms’ own systems, but nothing would need to be built from the ground up.

The data warehouse’s technology and infrastructure challenges are identified by the report as: “Significant infrastructure development for financial institutions, utilities, and regulators. Significant costs and resource requirements for regulator(s) to maintain the data warehouse. License costs to access market vendor data. Data privacy concerns exist for sharing detailed customer specific information internationally.”

Moreover, it indicates that the Sifma roundtable participants were also not keen on the idea of a new warehouse overall: “The regulator data warehouse based approach received the most negative feedback as member firms believe that the approach is impractical for a systemic risk regulator to use once implemented. Member firms also cautioned that relying on an approach focused on massive quantities of granular information may provide a false comfort to a systemic risk regulator, who should consider a more holistic view, such as of overall trends, major concentrations and imbalances, and significant interconnections between firms.”

This supports a lot of the feedback that Reference Data Review has been hearing from the industry. The utility is certainly a contentious issue in light of the work required and the potentially high costs involved.

