The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

Systemic Risk Regulator Should Resemble the CRO Function and Data Utility is Likely to be Too Costly

The 22 respondents to the Securities Industry and Financial Markets Association's (Sifma) recent survey believe that regardless of the exact structure of the systemic risk regulator, its roles and responsibilities for the overall markets should reflect those of a firm's chief risk officer (CRO). The Sifma member firms, regulators, central counterparties (CCPs) and exchanges involved in the survey also said they feel that the data challenges involved in monitoring systemic risk could prove costly, might not require the building of a new infrastructure and should not be focused on too granular a level of data.

Sifma and the survey respondents are generally in favour of the introduction of a systemic risk regulator, such as the European Systemic Risk Board (ESRB), and willing to engage in more frequent reporting of potentially systemically risky data to such a body during times of stress. Tim Ryan, Sifma president and CEO, explains: “Sifma strongly supports the creation of a tough, competent systemic risk regulator to oversee systemically important firms so that the activities of one or a few firms will not threaten the stability of the entire financial system.” However, how this function is performed is not so easy to determine.

The overall goal of the Sifma paper is therefore to identify the full remit of potential systemic risk information requirements for such a body to function and examine the various methods of gathering this data together. To this end, the survey highlights the serious data issues at the heart of any attempt to monitor systemic risk at a global level, but instead of directly recommending a utility approach, as mooted by the European Central Bank (ECB) and included in the US reform bill, it suggests that existing infrastructure could be adapted for data collection purposes.

“Where possible, systemic risk regulation can be more effective by drawing on resources which already exist in the system, either in current regulatory filings or in firms’ own risk and information systems and leveraging infrastructure and repositories of data,” states the paper. Rather than building from scratch, the regulator should therefore aim to improve “data standards, consistency and accuracy” of the information it aggregates by focusing on metrics such as “quality and timeliness”.

The survey covers eight potential systemic risk information gathering approaches, all of which are described as “costly” by respondents. These comprise: an enterprise-wide stress test-based approach; reverse stress test-based approach; aggregated risk reporting templates; risk sensitivities; trade repositories; repositories and key industry utilities; concentration reporting analysis; and the establishment of a regulator data warehouse. The latter represents something like the US Office of Financial Research or the ECB’s proposed thin utility.

Sifma notes that there is no "single ideal approach" and that a mix and match approach to gathering this data would probably be the best option. "Some are better at understanding certain drivers of systemic risks, while others are easier to aggregate across firms and products," it suggests.

The Bank for International Settlements (BIS) has also recently examined the issue of systemic risk data collection and analysis. The BIS paper is echoed by the Sifma survey, which also elaborates on the need for the collection of more data in order to better monitor risk across the industry and fill the “information gaps” currently in existence. However, the Sifma survey indicates that such an endeavour should not focus on too granular a level of data detail. “Emphasis on processing too much information at too granular of a level may ultimately hinder the ability of a systemic risk regulator to focus on the relevant build up of systemic risks (hot spots); and result in missing the forest for the trees,” it states.

Survey respondents were concerned that a focus on granular position level data may cause the systemic risk regulator to be looking at the wrong “altitude” of information, and thus miss the bigger picture. They believe the current regulatory focus is “micro-prudential” rather than “macro-prudential” in this respect and this is what a systemically focused regulator must address.

“Current reporting metrics focus on the soundness of individual firms and do not effectively capture all of the drivers of broader sources of risk, but they do capture parts of the information needed to monitor systemic risk,” notes the paper. According to respondents, major areas of information that need to be improved include: firm size, interconnectedness, liquidity, concentration, correlation, tight coupling, herding behaviour, crowded trades and leverage.

Overall, the eight approaches suggested vary significantly in the resources that they would require to operate and in their overall structure, whether bottom up or top down. The enterprise-wide stress test-based approach, for example, relies on firms' internal models to produce a consistent set of stress tests and would require the development of specific scenarios by the regulator for firms to follow. The trade repositories-based approach, by contrast, requires the establishment of third party repository organisations such as the DTCC's Trade Information Warehouse, which has been the subject of much debate recently, and the reporting of additional data sets by firms to these bodies.

In terms of the utility approach, the two most relevant approaches are the repositories and key industry utility firms-based approach and the data warehouse-based approach. The former recommends the use of existing infrastructures to gather together industry data and maintain it, whereas the latter involves the establishment of a whole new infrastructure.

Both of these involve costs to the industry as a whole and for individual firms, although the former is likely to involve significantly less effort than the latter. Connections would need to be established between the respective existing utilities and infrastructures and firms' own systems, but nothing would need to be built from the ground up.

The data warehouse’s technology and infrastructure challenges are identified by the report as: “Significant infrastructure development for financial institutions, utilities, and regulators. Significant costs and resource requirements for regulator(s) to maintain the data warehouse. License costs to access market vendor data. Data privacy concerns exist for sharing detailed customer specific information internationally.”

Moreover, it indicates that the Sifma roundtable participants were also not keen on the idea of a new warehouse overall: “The regulator data warehouse based approach received the most negative feedback as member firms believe that the approach is impractical for a systemic risk regulator to use once implemented. Member firms also cautioned that relying on an approach focused on massive quantities of granular information may provide a false comfort to a systemic risk regulator, who should consider a more holistic view, such as of overall trends, major concentrations and imbalances, and significant interconnections between firms.”

This supports a lot of the feedback that Reference Data Review has been hearing from the industry. The utility is certainly a contentious issue in light of the work required and the potentially high costs involved.
