
Regulation is Driving Force Behind Data Management Change, Agree EDM Council’s Atkin and JWG’s Di Giammarino

They may differ in their opinions on how to go about effecting change, but EDM Council's managing director Mike Atkin and JWG's CEO PJ Di Giammarino agreed that the regulatory environment is forcing firms to re-evaluate their data management systems. Both Atkin and Di Giammarino told delegates at TSAM in London earlier this week that the regulatory community is delegating responsibility for improving data quality to the industry in order to enable regulators to better track systemic risk.

Atkin, who is a proponent of the move to establish the National Institute of Finance (NIF) in the US in order to achieve greater data standardisation, highlighted three big regulatory drivers from the last 10 years that he believes have focused the industry’s attention on data management. “The first was terrorism and its impact on anti-money laundering and know your customer legislation. The second was events such as the fall of Long Term Capital Management (LTCM) and Enron, which led to a focus on entity relationships within the financial supply chain. The most recent was the credit crisis and the fall of Lehman Brothers, which have led to a strong focus on systemic risk and analysis,” he explained.

Di Giammarino, who has remained sceptical about the achievability of the NIF, agreed that the regulatory community has been handed accountability for monitoring risk at a systemic level, and added that this community currently lacks the ability to control its own data quality, let alone police the industry's.

Atkin seconded this notion: “They own systemic risk but at the moment they seem to have no clear idea what that means, as their own data systems are held together by wire and chewing gum. This is why they need to export the problem onto the industry, so that the industry provides the regulators with the comparable data required to monitor systemic risk.”

He then elaborated on some of the work the EDM Council has recently been engaged in to get regulators talking constructively about data management. “Three weeks ago we had a meeting with a group of US regulators and we invited six people but 24 turned up,” he noted. Atkin believes this, along with the recent NIF Act submitted for consideration by Senator Jack Reed, demonstrates how seriously the regulatory community is treating the issue of data.

“The logical extension to this is that transparency is driving the data quality argument forward,” said Di Giammarino. “This is not about a mythical data fairy coming down to fix the industry’s problems, it is rather about improving risk management and gaining competitive advantage.”

Di Giammarino is a proponent of articulating a better business case for data management and told delegates that a very specific, mandatory role needs to be defined for data at a business level if firms are to cope with the “regulatory tsunami” and its data management impacts. The fear factor is driving investment in data management on the “staying out of jail” principle, he added.

It was at this point that the two speakers’ views diverged, however, as Atkin suggested that Di Giammarino’s focus was on the impacts of individual regulations rather than the other “side to the story” that he feels is more important: “the data foundations”. Atkin suggested that the industry needs to start thinking about data as infrastructure and focusing on getting “the basics” of data such as entity and instrument data right.

Di Giammarino, on the other hand, discussed the “first mile problem” of making the business understand the requirements and the benefits of fixing the data issues.

However, both concluded that greater communication and getting the industry engaged in discussing the data issues would be important going forward.
