They may differ in their opinions on how to go about effecting change, but EDM Council’s managing director Mike Atkin and JWG’s CEO PJ Di Giammarino agreed that the regulatory environment is forcing firms to re-evaluate their data management systems. Both Atkin and Di Giammarino told delegates at TSAM in London earlier this week that the regulatory community is delegating the responsibility for improving data quality to the industry, in order to enable regulators to better track systemic risk.
Atkin, who is a proponent of the move to establish the National Institute of Finance (NIF) in the US in order to achieve greater data standardisation, highlighted three big regulatory drivers from the last 10 years that he believes have focused the industry’s attention on data management. “The first was terrorism and its impact on anti-money laundering and know your customer legislation. The second was events such as the fall of Long Term Capital Management (LTCM) and Enron, which led to a focus on entity relationships within the financial supply chain. The most recent was the credit crisis and the fall of Lehman Brothers, which have led to a strong focus on systemic risk and analysis,” he explained.
Di Giammarino, who has remained sceptical about the achievability of the NIF, agreed that the regulatory community has been handed accountability for monitoring risk at a systemic level, and added that this community currently lacks the ability to control data quality itself, let alone police the industry.
Atkin seconded this notion: “They own systemic risk but at the moment they seem to have no clear idea what that means, as their own data systems are held together by wire and chewing gum. This is why they need to export the problem onto the industry, so that the industry provides the regulators with the comparable data required to monitor systemic risk.”
He then elaborated on some of the work the EDM Council has recently been engaged in to get regulators talking constructively about data management. “Three weeks ago we had a meeting with a group of US regulators and we invited six people but 24 turned up,” he noted. Atkin believes this demonstrates the seriousness with which the regulatory community is treating the issue of data, as does the recent NIF Act submitted for consideration by Senator Jack Reed.
“The logical extension to this is that transparency is driving the data quality argument forward,” said Di Giammarino. “This is not about a mythical data fairy coming down to fix the industry’s problems; it is rather about improving risk management and gaining competitive advantage.”
Di Giammarino is a proponent of articulating a better business case for data management, and told delegates that a very specific mandatory role needs to be defined for data at a business level if firms are to cope with the “regulatory tsunami” and its data management impacts. The fear factor is driving investment in data management on the “staying out of jail” principle, he added.
It was at this point that the two speakers’ views diverged, however, as Atkin suggested that Di Giammarino’s focus was on the impacts of individual regulations rather than the other “side to the story” that he feels is more important: “the data foundations”. Atkin suggested that the industry needs to start thinking about data as infrastructure and focusing on getting “the basics” of data such as entity and instrument data right.
Di Giammarino, on the other hand, discussed the “first mile problem” of making the business understand the requirements and the benefits of fixing the data issues.
However, both concluded that greater communication and getting the industry engaged in discussing the data issues were important going forward.