A Rethink, Redesign and Retool is Needed for Systemic Risk Monitoring, Contends JWG’s Di Giammarino

To monitor systemic risk properly, the industry first needs to understand the gaps in the current data framework and then rethink, redesign and retool the method of controlling “the system”, according to PJ Di Giammarino, CEO of JWG. Speaking at the FS Club event on systemic risk last week, Di Giammarino noted that before the regulatory community jumps into mandating data standards for systemic risk monitoring purposes, it first needs to consult with the industry to define a specific roadmap for change.

Following on from the JWG event on funds transfer pricing earlier in the day, which also featured discussions on risk data challenges, the FS Club panel was charged with debating the tools needed to monitor systemic risk across the industry, including reference data standards for instrument and entity identification. Di Giammarino contended that the industry is only beginning to understand what is required to monitor systemic risk but may be getting ahead of itself without first considering the practicalities.
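
To make the identification challenge concrete: even at the instrument level, an established standard such as the ISO 6166 ISIN carries a built-in Luhn check digit that lets a monitoring system reject malformed identifiers before they pollute aggregated views. The Python sketch below validates that check digit; it is offered purely as an illustration of what identifier standardisation buys, not as part of any regulatory tooling discussed at the event.

    def isin_is_valid(isin: str) -> bool:
        """Check an ISO 6166 ISIN against its Luhn check digit."""
        isin = isin.upper()
        if len(isin) != 12 or not isin[:2].isalpha() or not isin.isalnum():
            return False
        # Expand characters to digits: '0'-'9' stay as-is, 'A'-'Z' become 10-35.
        expanded = "".join(str(int(c, 36)) for c in isin)
        # Luhn sum: double every second digit from the right, add the digit sums.
        total = 0
        for i, ch in enumerate(reversed(expanded)):
            d = int(ch)
            if i % 2 == 1:
                d *= 2
            total += d // 10 + d % 10
        return total % 10 == 0

    # A well-formed ISIN passes; a single corrupted character fails.
    print(isin_is_valid("US0378331005"))  # True
    print(isin_is_valid("US0378331006"))  # False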

“There has been a lot of intellectual debate and policy points, but very little in the way of discussion about the practicalities of monitoring systemic risk. There are also significant disconnects across the G20 with regards to approaches to information management that could cause further regulatory silos,” he explained. “There needs to be a complete rethink in the way we approach systemic risk.”

Di Giammarino believes three steps need to be taken before a roadmap for change can be drawn up: “rethink, redesign and retool”. A rethink is needed to align the macro and micro views, although this will be a difficult and costly process, he explained. Regulatory information silos are difficult to locate and to link today (transaction reporting across Europe is one example), so only a partial view of system data is available; hence a “redesign” is required. A “retool” would then involve a global approach to aggregating and interpreting the correct data, he added.
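
A minimal sketch of that “retool” step, assuming (hypothetically) that each regulatory silo’s records have already been mapped to one common entity identifier: the point is that aggregation into a system-wide view is trivial once that mapping exists, and impossible before it. The feed names and figures are illustrative only.

    from collections import defaultdict

    # Hypothetical per-silo exposure feeds. In practice each regulator's silo
    # uses different identifiers; here we assume records have already been
    # mapped to one common entity identifier (the hard part of the redesign).
    trade_repository = {"ENTITY-001": 120.0, "ENTITY-002": 45.0}
    transaction_reports = {"ENTITY-001": 80.0, "ENTITY-003": 30.0}

    def aggregate(*silos):
        """Sum exposures per entity across silos into a system-wide view."""
        total = defaultdict(float)
        for silo in silos:
            for entity, exposure in silo.items():
                total[entity] += exposure
        return dict(total)

    print(aggregate(trade_repository, transaction_reports))
    # {'ENTITY-001': 200.0, 'ENTITY-002': 45.0, 'ENTITY-003': 30.0}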

The current assumptions about how to manage risk data need to be fundamentally challenged, according to Di Giammarino, starting with an appreciation of the consequences of getting it wrong. “Decisions based on garbage in, gospel out could be catastrophic and result in a high cost,” he warned. “It should also not be assumed that we have the right data at hand: existing data sets need to be supplemented.”

This, of course, will not come cheap. To this end, Di Giammarino underlined the need for investment in infrastructure by both supervisors and the industry, and suggested that tariff models are needed to fund the establishment of new data structures. Part of this task will also involve ensuring the data is of good quality via standardisation and metrics, which could then form part of an “agile and adaptive monitoring system”.
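
How might such quality metrics look in practice? The sketch below computes a simple field-level completeness score over hypothetical entity records; the field names and record layout are illustrative assumptions, not anything specified at the event.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class EntityRecord:
        # Hypothetical reference data fields, for illustration only.
        entity_id: Optional[str]
        name: Optional[str]
        jurisdiction: Optional[str]

    def completeness(records: list, field: str) -> float:
        """Share of records with a populated value for the given field."""
        if not records:
            return 0.0
        filled = sum(1 for r in records if getattr(r, field) not in (None, ""))
        return filled / len(records)

    records = [
        EntityRecord("ENTITY-001", "Acme Bank", "GB"),
        EntityRecord(None, "Beta Capital", ""),
    ]
    # A monitoring system could alert when completeness falls below a threshold.
    for field in ("entity_id", "name", "jurisdiction"):
        print(field, completeness(records, field))  # 0.5, 1.0, 0.5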

The barriers to achieving such a system are numerous, and Di Giammarino grouped them into five main categories: disclosure; governance; cost and commercial issues; operating model objectives; and harmonisation. These concerns could also be applied to the push for a centralised and mandated reference data utility for the industry. Data privacy and the lack of agreed standards across the industry, for example, could pose a significant problem for both systemic risk monitoring and the establishment of such a utility.

Overall, he urged the industry and the regulatory community to discuss and agree the end-to-end design requirements for a systemic risk monitoring system. He noted that appropriate bodies need to be identified and made accountable for delivering the roadmap. The questions that need to be asked during this process are: “For what purpose should a control framework exist? Who will be accountable for its operation? What functionality is required? What metrics and linkages are needed to operate it? What data requirements need to be met in terms of quantity and quality of data (including how much historical data should be retained)? And what are the people, processes and technologies required to make it all happen?”
