About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Systemic Risk Regulator Should Resemble the CRO Function and Data Utility is Likely to be Too Costly

The 22 respondents to the Securities Industry and Financial Markets Association’s (Sifma) recent survey believe that regardless of the exact structure of the systemic risk regulator, its roles and responsibilities for the overall markets should reflect those of a firm’s chief risk officer (CRO). The Sifma member firms, regulators, central counterparties (CCPs) and exchanges involved in the survey also said they feel that the data challenges involved in monitoring systemic risk could prove costly, might not require the building of a new infrastructure, and should not be focused on too granular a level of data.

Sifma and the survey respondents are generally in favour of the introduction of a systemic risk regulator, such as the European Systemic Risk Board (ESRB), and willing to engage in more frequent reporting of potentially systemically risky data to such a body during times of stress. Tim Ryan, Sifma president and CEO, explains: “Sifma strongly supports the creation of a tough, competent systemic risk regulator to oversee systemically important firms so that the activities of one or a few firms will not threaten the stability of the entire financial system.” However, how this function is performed is not so easy to determine.

The overall goal of the Sifma paper is therefore to identify the full remit of potential systemic risk information requirements for such a body to function and examine the various methods of gathering this data together. To this end, the survey highlights the serious data issues at the heart of any attempt to monitor systemic risk at a global level, but instead of directly recommending a utility approach, as mooted by the European Central Bank (ECB) and included in the US reform bill, it suggests that existing infrastructure could be adapted for data collection purposes.

“Where possible, systemic risk regulation can be more effective by drawing on resources which already exist in the system, either in current regulatory filings or in firms’ own risk and information systems and leveraging infrastructure and repositories of data,” states the paper. Rather than building from scratch, the regulator should therefore aim to improve “data standards, consistency and accuracy” of the information it aggregates by focusing on metrics such as “quality and timeliness”.

The survey covers eight potential systemic risk information gathering approaches, all of which are described as “costly” by respondents. These comprise: an enterprise-wide stress test-based approach; reverse stress test-based approach; aggregated risk reporting templates; risk sensitivities; trade repositories; repositories and key industry utilities; concentration reporting analysis; and the establishment of a regulator data warehouse. The latter represents something like the US Office of Financial Research or the ECB’s proposed thin utility.

Sifma notes that there is no “single ideal approach” and that a mix and match approach to gathering this data would probably be the best option. “Some are better at understanding certain drivers of systemic risks, while others are easier to aggregate across firms and products,” it suggests.

The Bank for International Settlements (BIS) has also recently examined the issue of systemic risk data collection and analysis. The BIS paper is echoed by the Sifma survey, which also elaborates on the need for the collection of more data in order to better monitor risk across the industry and fill the “information gaps” currently in existence. However, the Sifma survey indicates that such an endeavour should not focus on too granular a level of data detail. “Emphasis on processing too much information at too granular of a level may ultimately hinder the ability of a systemic risk regulator to focus on the relevant build up of systemic risks (hot spots); and result in missing the forest for the trees,” it states.

Survey respondents were concerned that a focus on granular position level data may cause the systemic risk regulator to be looking at the wrong “altitude” of information, and thus miss the bigger picture. They believe the current regulatory focus is “micro-prudential” rather than “macro-prudential” in this respect and this is what a systemically focused regulator must address.

“Current reporting metrics focus on the soundness of individual firms and do not effectively capture all of the drivers of broader sources of risk, but they do capture parts of the information needed to monitor systemic risk,” notes the paper. According to respondents, major areas of information that need to be improved include: firm size, interconnectedness, liquidity, concentration, correlation, tight coupling, herding behaviour, crowded trades and leverage.

Overall, the eight suggested approaches vary significantly in the resources they would require to operate and in their overall structure, whether bottom up or top down. The enterprise-wide stress test-based approach, for example, relies on firms’ internal models to produce a consistent set of stress tests and would require the development of specific scenarios by the regulator for firms to follow. The trade repositories-based approach, by contrast, requires the establishment of third party repository organisations such as the DTCC’s Trade Information Warehouse, which has been the subject of much debate recently, and the reporting of additional data sets by firms to these bodies.

In terms of the utility approach, the two most relevant approaches are the repositories and key industry utility firms-based approach and the data warehouse-based approach. The former recommends the use of existing infrastructures to gather together industry data and maintain it, whereas the latter involves the establishment of a whole new infrastructure.

Both of these involve costs for the industry as a whole and for individual firms, although the former is likely to require significantly less effort than the latter. Connections would need to be established between the existing utilities and infrastructures and firms’ own systems, but nothing would need to be built from the ground up.

The data warehouse’s technology and infrastructure challenges are identified by the report as: “Significant infrastructure development for financial institutions, utilities, and regulators. Significant costs and resource requirements for regulator(s) to maintain the data warehouse. License costs to access market vendor data. Data privacy concerns exist for sharing detailed customer specific information internationally.”

Moreover, it indicates that the Sifma roundtable participants were also not keen on the idea of a new warehouse overall: “The regulator data warehouse based approach received the most negative feedback as member firms believe that the approach is impractical for a systemic risk regulator to use once implemented. Member firms also cautioned that relying on an approach focused on massive quantities of granular information may provide a false comfort to a systemic risk regulator, who should consider a more holistic view, such as of overall trends, major concentrations and imbalances, and significant interconnections between firms.”

This supports a lot of the feedback that Reference Data Review has been hearing from the industry. The utility is certainly a contentious issue in light of the work required and the potentially high costs involved.
