About A-Team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Regulation is Driving Force Behind Data Management Change, Agree EDM Council’s Atkin and JWG’s Di Giammarino


They may differ on how to go about effecting change, but EDM Council managing director Mike Atkin and JWG CEO PJ Di Giammarino agreed that the regulatory environment is forcing firms to re-evaluate their data management systems. Both told delegates at TSAM in London earlier this week that the regulatory community is delegating responsibility for improving data quality to the industry, so that regulators can better track systemic risk.

Atkin, who is a proponent of the move to establish the National Institute of Finance (NIF) in the US in order to achieve greater data standardisation, highlighted three big regulatory drivers from the last 10 years that he believes have focused the industry’s attention on data management. “The first was terrorism and its impact on anti-money laundering and know your customer legislation. The second was events such as the fall of Long Term Capital Management (LTCM) and Enron, which led to a focus on entity relationships within the financial supply chain. The most recent was the credit crisis and the fall of Lehman Brothers, which have led to a strong focus on systemic risk and analysis,” he explained.

Di Giammarino, who has remained sceptical about the achievability of the NIF, agreed that the regulatory community has been handed accountability for monitoring risk at a systemic level, and added that this community currently lacks the ability to control its own data quality, let alone police the industry's.

Atkin seconded this notion: “They own systemic risk but at the moment they seem to have no clear idea what that means, as their own data systems are held together by wire and chewing gum. This is why they need to export the problem onto the industry, so that the industry provides the regulators with the comparable data required to monitor systemic risk.”

He then elaborated on some of the work the EDM Council has recently been engaged in to get regulators talking constructively about data management. "Three weeks ago we had a meeting with a group of US regulators; we invited six people but 24 turned up," he noted. Atkin believes this turnout, along with the recent NIF Act submitted for consideration by Senator Jack Reed, demonstrates how seriously the regulatory community is treating the issue of data.

“The logical extension to this is that transparency is driving the data quality argument forward,” said Di Giammarino. “This is not about a mythical data fairy coming down to fix the industry’s problems, it is rather about improving risk management and gaining competitive advantage.”

Di Giammarino is a proponent of articulating a better business case around data management, and told delegates that a very specific mandatory role needs to be defined for data at a business level if firms are to cope with the "regulatory tsunami" and its data management impacts. The fear factor is driving investment in data management on the "staying out of jail" principle, he added.

It was at this point that the two speakers’ views diverged, however, as Atkin suggested that Di Giammarino’s focus was on the impacts of individual regulations rather than the other “side to the story” that he feels is more important: “the data foundations”. Atkin suggested that the industry needs to start thinking about data as infrastructure and focusing on getting “the basics” of data such as entity and instrument data right.

Di Giammarino, on the other hand, discussed the “first mile problem” of making the business understand the requirements and the benefits of fixing the data issues.

However, both concluded that greater communication, and getting the industry engaged in discussing data issues, would be important going forward.
