Systemic Risk Regulator Should Resemble the CRO Function and Data Utility is Likely to be Too Costly

The 22 respondents to the Securities Industry and Financial Markets Association’s (Sifma) recent survey believe that, regardless of the exact structure of the systemic risk regulator, its roles and responsibilities for the overall markets should reflect those of a firm’s chief risk officer (CRO). The Sifma member firms, regulators, central counterparties (CCPs) and exchanges involved in the survey also said that the data challenges involved in monitoring systemic risk could prove costly, that meeting them need not require building new infrastructure, and that monitoring should not focus on too granular a level of data.

Sifma and the survey respondents are generally in favour of the introduction of a systemic risk regulator, such as the European Systemic Risk Board (ESRB), and are willing to engage in more frequent reporting of potentially systemically risky data to such a body during times of stress. Tim Ryan, Sifma president and CEO, explains: “Sifma strongly supports the creation of a tough, competent systemic risk regulator to oversee systemically important firms so that the activities of one or a few firms will not threaten the stability of the entire financial system.” However, how this function should be performed is not so easy to determine.

The overall goal of the Sifma paper is therefore to identify the full remit of systemic risk information requirements such a body would need in order to function, and to examine the various methods of gathering this data. To this end, the survey highlights the serious data issues at the heart of any attempt to monitor systemic risk at a global level but, rather than directly recommending a utility approach, as mooted by the European Central Bank (ECB) and included in the US reform bill, it suggests that existing infrastructure could be adapted for data collection purposes.

“Where possible, systemic risk regulation can be more effective by drawing on resources which already exist in the system, either in current regulatory filings or in firms’ own risk and information systems and leveraging infrastructure and repositories of data,” states the paper. Rather than building from scratch, the regulator should therefore aim to improve “data standards, consistency and accuracy” of the information it aggregates by focusing on metrics such as “quality and timeliness”.

The survey covers eight potential systemic risk information-gathering approaches, all of which are described as “costly” by respondents. These comprise: an enterprise-wide stress test-based approach; a reverse stress test-based approach; aggregated risk reporting templates; risk sensitivities; trade repositories; repositories and key industry utilities; concentration reporting analysis; and the establishment of a regulator data warehouse. The last of these represents something like the US Office of Financial Research or the ECB’s proposed thin utility.

Sifma notes that there is no “single ideal approach” and that a mix-and-match approach to gathering this data would probably be the best option. “Some are better at understanding certain drivers of systemic risks, while others are easier to aggregate across firms and products,” it suggests.

The Bank for International Settlements (BIS) has also recently examined the issue of systemic risk data collection and analysis. The BIS paper is echoed by the Sifma survey, which also elaborates on the need to collect more data in order to better monitor risk across the industry and fill the existing “information gaps”. However, the Sifma survey indicates that such an endeavour should not focus on too granular a level of detail. “Emphasis on processing too much information at too granular of a level may ultimately hinder the ability of a systemic risk regulator to focus on the relevant build up of systemic risks (hot spots); and result in missing the forest for the trees,” it states.

Survey respondents were concerned that a focus on granular position-level data may cause the systemic risk regulator to look at the wrong “altitude” of information, and thus miss the bigger picture. They believe the current regulatory focus is “micro-prudential” rather than “macro-prudential” in this respect, and this is what a systemically focused regulator must address.

“Current reporting metrics focus on the soundness of individual firms and do not effectively capture all of the drivers of broader sources of risk, but they do capture parts of the information needed to monitor systemic risk,” notes the paper. According to respondents, major areas of information that need to be improved include: firm size, interconnectedness, liquidity, concentration, correlation, tight coupling, herding behaviour, crowded trades and leverage.
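
By way of illustration only, one of the macro-level measures respondents have in mind, concentration, could in principle be tracked over exposure data aggregated across reporting firms. The sketch below is a hypothetical Python example, not anything specified in the Sifma paper; the asset classes and figures are invented.

```python
# Illustrative sketch only: a simple concentration measure of the kind a
# macro-prudential regulator might track over aggregated exposure data.
# Asset classes and figures are hypothetical, not from the Sifma survey.

def herfindahl_index(exposures):
    """Herfindahl-Hirschman index of a set of exposures.

    Values near 1 indicate exposure concentrated in a few asset classes;
    values near 1/n indicate an even spread across n classes.
    """
    total = sum(exposures.values())
    shares = (value / total for value in exposures.values())
    return sum(share ** 2 for share in shares)

# Hypothetical system-wide exposures aggregated across reporting firms ($bn)
aggregate_exposures = {
    "sovereign_debt": 420.0,
    "corporate_credit": 310.0,
    "structured_products": 180.0,
    "equities": 250.0,
}

print(f"System-wide concentration (HHI): {herfindahl_index(aggregate_exposures):.3f}")
```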

Overall, the eight approaches suggested vary significantly in the resources they would require and in their overall structure, whether bottom-up or top-down. The enterprise-wide stress test-based approach, for example, relies on firms’ internal models to produce a consistent set of stress tests and would require the regulator to develop specific scenarios for firms to follow. The trade repositories-based approach, by contrast, requires the establishment of third party repository organisations such as the DTCC’s Trade Information Warehouse, which has been the subject of much debate recently, and the reporting of additional data sets by firms to these bodies.
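
To make the bottom-up idea more concrete, a minimal sketch is shown below: the regulator defines a common scenario, each firm applies it to its own positions (approximated here by reported linear sensitivities) and the results are rolled up to a system-wide figure. The firm names, risk factors, sensitivities and shock sizes are all invented for illustration and are not taken from the Sifma paper.

```python
# Minimal sketch, assuming firms report linear P&L sensitivities to a set of
# common risk factors. This illustrates the bottom-up aggregation idea only,
# not Sifma's or any regulator's actual specification.

# Regulator-defined scenario: shocks to common risk factors (hypothetical)
scenario = {
    "equity_prices": -0.30,      # 30% fall in equity prices
    "credit_spreads_bps": 150,   # 150bp widening of credit spreads
    "rates_bps": -100,           # 100bp fall in interest rates
}

# Hypothetical per-firm P&L sensitivities ($m per unit shock in each factor)
firm_sensitivities = {
    "Firm A": {"equity_prices": 1200.0, "credit_spreads_bps": -4.0, "rates_bps": 2.0},
    "Firm B": {"equity_prices": 850.0, "credit_spreads_bps": -6.0, "rates_bps": 1.0},
}

def stressed_pnl(sensitivities, scenario):
    """Approximate P&L impact of the scenario for one firm (linear approximation)."""
    return sum(sensitivities[factor] * shock for factor, shock in scenario.items())

# Bottom-up aggregation: firm-level results rolled up to a system-wide figure
firm_results = {firm: stressed_pnl(s, scenario) for firm, s in firm_sensitivities.items()}
system_total = sum(firm_results.values())

for firm, pnl in firm_results.items():
    print(f"{firm}: stressed P&L {pnl:+,.0f}m")
print(f"System-wide aggregate: {system_total:+,.0f}m")
```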

In terms of the utility approach, the two most relevant options are the repositories and key industry utilities-based approach and the regulator data warehouse-based approach. The former recommends using existing infrastructures to gather together and maintain industry data, whereas the latter involves the establishment of an entirely new infrastructure.

Both involve costs for the industry as a whole and for individual firms, although the former is likely to involve significantly less effort than the latter. Connections would need to be established between the existing utilities and infrastructures and firms’ own systems, but nothing would have to be built from the ground up.

The data warehouse’s technology and infrastructure challenges are identified by the report as: “Significant infrastructure development for financial institutions, utilities, and regulators. Significant costs and resource requirements for regulator(s) to maintain the data warehouse. License costs to access market vendor data. Data privacy concerns exist for sharing detailed customer specific information internationally.”

Moreover, it indicates that the Sifma roundtable participants were also not keen on the idea of a new warehouse overall: “The regulator data warehouse based approach received the most negative feedback as member firms believe that the approach is impractical for a systemic risk regulator to use once implemented. Member firms also cautioned that relying on an approach focused on massive quantities of granular information may provide a false comfort to a systemic risk regulator, who should consider a more holistic view, such as of overall trends, major concentrations and imbalances, and significant interconnections between firms.”

This supports a lot of the feedback that Reference Data Review has been hearing from the industry. The utility is certainly a contentious issue in light of the work required and the potentially high costs involved.
