
A-Team Insight Blogs

A Rethink, Redesign and Retool is Needed for Systemic Risk Monitoring, Contends JWG’s Di Giammarino


To monitor systemic risk properly, the industry first needs to understand the gaps in the current data framework and then rethink, redesign and retool the method of controlling “the system”, according to PJ Di Giammarino, CEO of JWG. Speaking at the FS Club event on systemic risk last week, Di Giammarino noted that before the regulatory community jumps into mandating data standards for systemic risk monitoring purposes, for example, it first needs to consult with the industry to define a specific roadmap for change.

Following on from the JWG event on funds transfer pricing earlier in the day, which also featured discussions on risk data challenges, the FS Club panel was charged with debating the tools needed to monitor systemic risk across the industry, including reference data standards for instrument and entity identification. Di Giammarino contended that the industry is only beginning to understand what is required to monitor systemic risk but may be getting ahead of itself without first considering the practicalities.
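To make the entity identification point concrete, the sketch below shows one way a reporting system might validate a Legal Entity Identifier (ISO 17442), which carries ISO 7064 MOD 97-10 check digits. This is purely an illustration of the kind of standard under discussion, not something presented at the event, and the example identifier is hypothetical.

```python
# Illustrative sketch only: validating a Legal Entity Identifier (LEI, ISO 17442),
# one example of the entity identification standards debated by the panel.
# The last two characters are ISO 7064 MOD 97-10 check digits: map letters to
# 10-35, read the result as a number, and the remainder modulo 97 must be 1.

def is_valid_lei(lei: str) -> bool:
    """Return True if a 20-character LEI passes the MOD 97-10 check."""
    lei = lei.strip().upper()
    if len(lei) != 20 or not lei.isalnum():
        return False
    numeric = "".join(str(int(ch, 36)) for ch in lei)  # 'A'->10 ... 'Z'->35
    return int(numeric) % 97 == 1

# Hypothetical identifier, used only to show the call; it is not asserted
# to belong to any real entity.
print(is_valid_lei("ABCDEFGHIJKLMNOPQR12"))
```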

“There has been a lot of intellectual debate and policy points, but very little in the way of discussion about the practicalities of monitoring systemic risk. There are also significant disconnects across the G20 with regards to approaches to information management that could cause further regulatory silos,” he explained. “There needs to be a complete rethink in the way we approach systemic risk.”

Di Giammarino believes three steps need to be taken before a roadmap for change can be drawn up: “rethink, redesign and retool”. A rethink is needed to align macro and micro views, although this will be a difficult and costly process, he explained. Regulatory information silos are difficult to locate and to link today (transaction reporting across Europe is one example), so only a partial view of system data is available; hence a “redesign” is required. A retool would then involve a global approach to aggregating and interpreting the correct data, he added.
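As a rough illustration of what the “retool” step implies in practice, the minimal sketch below aggregates exposures reported into two separate silos once the records carry a shared entity identifier. The repository names, fields and figures are all hypothetical.

```python
# Minimal sketch, not a real system: aggregating exposure records held in
# separate regulatory silos once they carry a shared entity identifier.
# Repository names, field names and figures are hypothetical.
from collections import defaultdict

uk_trade_repository = [
    {"entity_id": "LEI-AAA", "notional": 120.0},
    {"entity_id": "LEI-BBB", "notional": 40.0},
]
eu_trade_repository = [
    {"entity_id": "LEI-AAA", "notional": 75.0},
    {"entity_id": "LEI-CCC", "notional": 10.0},
]

def aggregate_exposures(*silos):
    """Sum notional exposure per entity across all reporting silos."""
    totals = defaultdict(float)
    for silo in silos:
        for record in silo:
            totals[record["entity_id"]] += record["notional"]
    return dict(totals)

# Without a shared identifier, LEI-AAA would be counted as two unrelated
# counterparties and only a partial view of system-wide exposure would emerge.
print(aggregate_exposures(uk_trade_repository, eu_trade_repository))
```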

The current assumptions about how to manage risk data need to be fundamentally challenged, according to Di Giammarino, not least because of the consequences of getting it wrong. “Decisions based on garbage in, gospel out could be catastrophic and result in a high cost,” he warned. “It should also not be assumed that we have the right data at hand: existing data sets need to be supplemented.”

This, of course, will not come cheap. To this end, Di Giammarino underlined the need for investment in infrastructure by supervisors and suggested that tariff models are needed to fund the establishment of new data structures. Part of this task will also involve ensuring that the data is of good quality by putting standardisation and metrics in place, which could then form part of an “agile and adaptive monitoring system”.
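By way of illustration, the sketch below computes two of the simplest quality metrics, completeness and validity, that such a standardisation effort might track. The field names, records and validation rule are hypothetical, not drawn from the event.

```python
# Illustrative sketch only: two simple data quality metrics (completeness and
# validity) of the kind a standardised monitoring framework might track.
# Field names, records and the validation rule are hypothetical.

def completeness(records, field):
    """Share of records in which the given field is populated."""
    if not records:
        return 0.0
    populated = sum(1 for r in records if r.get(field) not in (None, ""))
    return populated / len(records)

def validity(records, field, is_valid):
    """Share of populated values that pass the supplied validation rule."""
    values = [r[field] for r in records if r.get(field) not in (None, "")]
    if not values:
        return 0.0
    return sum(1 for v in values if is_valid(v)) / len(values)

sample = [
    {"entity_id": "ABCDEFGHIJKLMNOPQR12", "country": "GB"},
    {"entity_id": "", "country": "DE"},
    {"entity_id": "SHORT-ID", "country": None},
]
print(completeness(sample, "entity_id"))                      # 2 of 3 populated
print(validity(sample, "entity_id", lambda v: len(v) == 20))  # 1 of 2 passes
```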

The barriers to achieving such a system are numerous and Di Giammarino grouped them into five main categories related to: disclosure; governance; cost and commercial issues; operating model objectives; and harmonisation. These concerns could also be applied to the push for a centralised and mandated reference data utility for the industry. Data privacy and the lack of agreed standards across the industry, for example, could potentially pose a significant problem for both systemic risk monitoring and the establishment of a utility.

Overall, he called for the industry and the regulatory community to discuss and agree the end-to-end design requirements for a systemic risk monitoring system, noting that appropriate bodies need to be identified and made accountable for delivering the roadmap. The questions that need to be asked during this process are: “For what purpose should a control framework exist? Who will be accountable for its operation? What functionality is required? What metrics and linkages are needed to operate? What data requirements need to be met in terms of quantity and quality of data (including how much historical data should be retained)? And what are the people, processes and technologies required to make it all happen?”
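Purely as a sketch, those closing questions can be restated as a structured requirements record against which any proposed roadmap could be checked. The field names below are hypothetical and simply mirror the questions quoted above.

```python
# Sketch only: the closing questions restated as a structured requirements
# record that a proposed roadmap could be checked against. Field names are
# hypothetical and simply mirror the questions quoted above.
from dataclasses import dataclass, field

@dataclass
class MonitoringFrameworkRequirements:
    purpose: str                 # for what purpose should the control framework exist?
    accountable_body: str        # who will be accountable for its operation?
    functionality: list = field(default_factory=list)         # required functionality
    metrics_and_linkages: list = field(default_factory=list)  # metrics and linkages needed to operate
    data_quality_rules: list = field(default_factory=list)    # quantity and quality requirements
    history_retained_years: int = 0                           # how much historical data to keep
    people_process_technology: list = field(default_factory=list)

requirements = MonitoringFrameworkRequirements(
    purpose="Monitor the build-up of systemic risk across jurisdictions",
    accountable_body="To be agreed between industry and regulators",
    history_retained_years=7,  # placeholder value, not taken from the article
)
print(requirements)
```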

