
The seven deadly sins versus the “truth”

Unusually for a financial services conference, this year's FIMA focused on the negative aspects of capitalism; the seven deadly sins kept cropping up throughout the sessions – greed, sloth and gluttony among them (well, at the buffet table at the very least…).

However, these aspects of the industry were not seen in a wholly negative light, from the data management budgeting perspective at any rate. For example, Keith Hale, executive vice president and co-founder of Netik, opened the second day of FIMA 2008 by quoting Warren Buffett and highlighting that greed and fear have driven a rise in the priority given to data at financial institutions. “Financial markets are either driven by greed or fear, according to Warren Buffett, and it seems that recent times have flipped the driver from greed to fear. This fear is good for data management projects as they are more likely to get funding due to regulatory pressure,” Hale told delegates.

In the current economic climate, the industry will see a greater focus on reducing operational costs, on automation and on getting reference data right, Hale continued. “Risk management and regulatory requirements are adding pressure on institutions to deal with their data issues. Market consolidation at the pace it is happening today is also a contributing factor as firms need to reduce duplication of processes to reduce costs,” he explained. Greed and sloth cropped up again the next day in the session given by Graeme Austin, product management director at Xtrakter.

He said these two deadly sins are what have got the industry into the position it is in today and that, as a result, firms are looking for a more integrated approach to market and reference data to get a better handle on risk. The only way to tackle this risk and fear is to achieve a version of “truth” via better data management, Peter Serenita, chief data officer at JPMorgan Securities Services, told delegates. Serenita philosophised about the meaning of the word ‘truth’ in a data management context and concluded that although it is not realistic to hope for 100% accuracy of data, 100% of the time, this should not stop data management teams from striving for a high level of data quality across an institution.

The success of a data management project should not be measured by whether it achieves an idealistic version of the truth; it should focus on data integration and downstream impacts, he explained. This should be approached with some degree of pragmatism, according to Serenita, which essentially means transforming data into a format that can be integrated into downstream systems. “Legacy systems make up around 90% of financial institutions’ back offices and this is likely to continue indefinitely as today’s new systems are tomorrow’s legacy systems,” he added.

JPMorgan maintains a golden copy and each downstream system maintains its own interpretation of that data, he told delegates. The central data management team therefore produces the golden copy and keeps that data, while the downstream systems take that golden copy and translate it for their own purposes. Serenita concluded by urging delegates to view technology as an enabler rather than the end game: “The most important thing is to understand your data and how it is used.”
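For illustration only, here is a minimal sketch of the model Serenita describes: one validated golden copy is published centrally, and each downstream system translates that record into its own format rather than maintaining a competing master. The class, function and field names are hypothetical and are not drawn from JPMorgan’s actual systems.

```python
from dataclasses import dataclass

# Hypothetical golden-copy record for a security; field names are illustrative only.
@dataclass(frozen=True)
class GoldenCopyRecord:
    isin: str
    issuer: str
    currency: str
    coupon: float

class GoldenCopyStore:
    """Central store owned by the data management team: one validated record per key."""
    def __init__(self):
        self._records = {}

    def publish(self, record: GoldenCopyRecord) -> None:
        self._records[record.isin] = record

    def get(self, isin: str) -> GoldenCopyRecord:
        return self._records[isin]

def to_risk_system(record: GoldenCopyRecord) -> dict:
    # Downstream translation: the risk system keeps its own field names and formats.
    return {"security_id": record.isin, "ccy": record.currency, "cpn_rate": record.coupon / 100}

def to_settlement_system(record: GoldenCopyRecord) -> dict:
    # Another downstream interpretation of the same golden copy.
    return {"ISIN": record.isin, "ISSUER_NAME": record.issuer.upper(), "SETTLE_CCY": record.currency}

if __name__ == "__main__":
    store = GoldenCopyStore()
    store.publish(GoldenCopyRecord("XS0123456789", "Example Issuer Plc", "EUR", 4.25))
    golden = store.get("XS0123456789")
    # Each consumer translates the golden copy for its own purposes; none overwrites it.
    print(to_risk_system(golden))
    print(to_settlement_system(golden))
```

The point of the sketch is the direction of flow: data quality is fixed once, centrally, and downstream systems only reformat, never re-source, the reference data.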

It seems that Reference Data Review readers are also quite keen on the centralisation of reference data, but on a wider scale. According to our recent poll, 74% of readers are keen for the introduction of a centralised utility for reference data, although only 22% think this can be achieved next year. This should be good news for DClear, the wholly owned subsidiary of the investment arm of the Dubai International Financial Centre (DIFC), which released plans for its reference data utility in May this year. We’ll certainly be monitoring its progress over 2009 very carefully to see whether the 22% are correct in their assumptions.
