
A Rethink, Redesign and Retool is Needed for Systemic Risk Monitoring, Contends JWG’s Di Giammarino


To monitor systemic risk properly, the industry first needs to understand the gaps in the current data framework and then rethink, redesign and retool the method of controlling “the system”, according to PJ Di Giammarino, CEO of JWG. Speaking at the FS Club event on systemic risk last week, Di Giammarino noted that before the regulatory community jumps into mandating data standards for systemic risk monitoring, for example, it first needs to consult with the industry to define a specific roadmap for change.

Following on from the JWG event on funds transfer pricing earlier in the day, which also featured discussions on risk data challenges, the FS Club panel was charged with debating the tools needed to monitor systemic risk across the industry, including reference data standards for instrument and entity identification. Di Giammarino contended that the industry is only beginning to understand what is required to monitor systemic risk, but may be getting ahead of itself by not first considering the practicalities.

“There has been a lot of intellectual debate and policy points, but very little in the way of discussion about the practicalities of monitoring systemic risk. There are also significant disconnects across the G20 with regard to approaches to information management that could cause further regulatory silos,” he explained. “There needs to be a complete rethink in the way we approach systemic risk.”

Di Giammarino believes three steps need to be taken before a roadmap for change can be drawn up: “rethink, redesign and retool”. A rethink is needed to align macro and micro views, although this will be a difficult and costly process, he explained. Regulatory information silos are difficult to locate and to link today (transaction reporting across Europe is one example), and therefore only a partial view of system data is available; hence a “redesign” is required. A retool would then involve a global approach to aggregating and interpreting the correct data, he added.

The current assumptions about how to manage risk data need to be fundamentally challenged, according to Di Giammarino, starting with an appreciation of the consequences of getting it wrong. “Decisions based on garbage in, gospel out could be catastrophic and result in a high cost,” he warned. “It should also not be assumed that we have the right data at hand: existing data sets need to be supplemented.”

This, of course, will not come cheap. To this end, Di Giammarino underlined the need for investment in infrastructure by both supervisors and the industry, and suggested that tariff models are needed to fund the establishment of new data structures. Part of this task will also involve ensuring that the data is of good quality by putting standardisation and metrics in place, which could then form part of an “agile and adaptive monitoring system”.

The barriers to achieving such a system are numerous, and Di Giammarino grouped them into five main categories: disclosure; governance; cost and commercial issues; operating model objectives; and harmonisation. These concerns could also be applied to the push for a centralised and mandated reference data utility for the industry. Data privacy and the lack of agreed standards across the industry, for example, could pose a significant problem both for systemic risk monitoring and for the establishment of a utility.

Overall, he called for the industry and the regulatory community to discuss and agree the end-to-end design requirements for a systemic risk monitoring system. He noted that appropriate bodies need to be identified and made accountable for delivering the roadmap. The questions that need to be asked during this process are: “For what purpose should a control framework exist? Who will be accountable for its operation? What functionality is required? What metrics and linkages are needed to operate it? What data requirements need to be met in terms of quantity and quality of data (including how much historical data should be retained)? And what are the people, processes and technologies required to make it all happen?”

