
A Rethink, Redesign and Retool is Needed for Systemic Risk Monitoring, Contends JWG’s Di Giammarino


To monitor systemic risk properly, the industry first needs to understand the gaps in the current data framework and then rethink, redesign and retool the method of controlling “the system”, according to PJ Di Giammarino, CEO of JWG. Speaking at the FS Club event on systemic risk last week, Di Giammarino noted that before the regulatory community jumps into mandating data standards for systemic risk monitoring purposes, for example, it first needs to consult with the industry to define a specific roadmap for change.

Following on from the JWG event on funds transfer pricing earlier in the day, which also featured discussions on risk data challenges, the FS Club panel was charged with debating the tools needed to monitor systemic risk across the industry, including reference data standards for instrument and entity identification. Di Giammarino contended that the industry is only beginning to understand what is required to monitor systemic risk but may be getting ahead of itself without first considering the practicalities.
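The identification question is concrete enough to illustrate. As a sketch only (the panel did not endorse any particular scheme), the entity identifier that eventually emerged from this debate, the ISO 17442 Legal Entity Identifier, carries ISO 7064 MOD 97-10 check digits that any consumer of the data can verify mechanically:

```python
def lei_check(lei: str) -> bool:
    """Verify the ISO 7064 MOD 97-10 check digits of a 20-character LEI."""
    if len(lei) != 20 or not lei.isalnum():
        return False
    # Map letters to two-digit numbers (A=10 ... Z=35) and digits to
    # themselves; the whole string, read as one integer, must leave a
    # remainder of 1 when divided by 97.
    return int("".join(str(int(c, 36)) for c in lei.upper())) % 97 == 1
```

Whatever identifier standard is eventually mandated, this kind of mechanical verifiability is a prerequisite for trustworthy aggregation across the silos the panel described.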

“There has been a lot of intellectual debate and policy points, but very little in the way of discussion about the practicalities of monitoring systemic risk. There are also significant disconnects across the G20 with regards to approaches to information management that could cause further regulatory silos,” he explained. “There needs to be a complete rethink in the way we approach systemic risk.”

Di Giammarino believes three steps need to be taken before a roadmap for change can be drawn up: “rethink, redesign and retool”. A rethink is needed to align macro and micro views, although this will be a difficult and costly process, he explained. A redesign is required because regulatory information silos are difficult to locate and to link today (just look at transaction reporting across Europe for an example), meaning only a partial view of system data is available. A retool would then involve a global approach to aggregating and interpreting the correct data, he added.

The current assumptions about how to manage risk data need to be fundamentally challenged, according to Di Giammarino, starting with an appreciation of the consequences of getting it wrong. “Decisions based on garbage in, gospel out could be catastrophic and result in a high cost,” he warned. “It should also not be assumed that we have the right data at hand: existing data sets need to be supplemented.”

This, of course, will not come cheap. To this end, Di Giammarino underlined the need for investment in infrastructure by both supervisors and the industry, and suggested that tariff models will be needed to fund the establishment of new data structures. Part of this task will also involve ensuring that the data is of good quality by putting standardisation and metrics in place, which could then form part of an “agile and adaptive monitoring system”.
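Di Giammarino did not spell out what such metrics would look like. As a minimal sketch under assumed field names (EntityRecord and its attributes are illustrative, not drawn from any standard), a per-field completeness score over an entity reference data set is the sort of measure that could feed such a monitoring system:

```python
# Minimal sketch of a per-field completeness metric; the record fields
# (entity_id, legal_name, country) are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class EntityRecord:
    entity_id: str
    legal_name: str
    country: str

def completeness(records: list[EntityRecord]) -> dict[str, float]:
    """Fraction of records with a non-empty value for each field."""
    total = max(len(records), 1)
    return {
        field: sum(1 for r in records if getattr(r, field).strip()) / total
        for field in ("entity_id", "legal_name", "country")
    }

# Example: two records, one missing its country.
sample = [
    EntityRecord("ID001", "Alpha Bank", "GB"),
    EntityRecord("ID002", "Beta Fund", ""),
]
print(completeness(sample))  # {'entity_id': 1.0, 'legal_name': 1.0, 'country': 0.5}
```

Tracked over time and across data sources, scores like these are one way a supervisor could tell whether the aggregated picture of “the system” is improving or degrading.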

The barriers to achieving such a system are numerous, and Di Giammarino grouped them into five main categories: disclosure; governance; cost and commercial issues; operating model objectives; and harmonisation. These concerns could also be applied to the push for a centralised and mandated reference data utility for the industry. Data privacy and the lack of agreed standards across the industry, for example, could pose a significant problem for both systemic risk monitoring and the establishment of a utility.

Overall, he promoted discussion and agreement by the industry and the regulatory community of the end-to-end design requirements for a systemic risk monitoring system, noting that appropriate bodies need to be identified and made accountable for delivering the roadmap. The questions that need to be asked during this process are: “For what purpose should a control framework exist? Who will be accountable for its operation? What functionality is required? What metrics and linkages are needed to operate? What data requirements need to be met in terms of quantity and quality of data (including how much historical data should be retained)? And what are the people, processes and technologies required to make it all happen?”

