

The seven deadly sins versus the “truth”


Unusually for a financial services conference, this year’s FIMA focused on the negative aspects of capitalism; the seven deadly sins kept cropping up throughout the sessions – greed, sloth and gluttony among them (the latter at the buffet table, at the very least…).

However, these aspects of the industry were not seen in a wholly negative light, from the data management budgeting perspective at any rate. For example, Keith Hale, executive vice president and co-founder of Netik, opened the second day of FIMA 2008 by quoting Warren Buffett and highlighting that greed and fear have raised the priority of data for financial institutions. “Financial markets are either driven by greed or fear, according to Warren Buffett, and it seems that recent times have flipped the driver from greed to fear. This fear is good for data management projects as they are more likely to get funding due to regulatory pressure,” Hale told delegates.

In the current economic climate, the industry will witness a greater focus on reducing operational costs, on automation and on getting reference data right, Hale continued. “Risk management and regulatory requirements are adding pressure on institutions to deal with their data issues. Market consolidation at the pace it is happening today is also a contributing factor as firms need to reduce duplication of processes to reduce costs,” he explained. Greed and sloth cropped up again the next day in a session led by Graeme Austin, product management director at Xtrakter.

Austin said these two deadly sins are what got the industry into its current position and that, as a result, firms are looking for a more integrated approach to market and reference data to get a better handle on risk. The only way to tackle this risk and fear is to achieve a version of “truth” via better data management, Peter Serenita, chief data officer at JPMorgan Securities Services, told delegates. Serenita philosophised about the meaning of the word ‘truth’ in a data management context and concluded that although it is not realistic to hope for 100% accuracy of data, 100% of the time, this should not stop data management teams from striving for a high level of data quality across an institution.

The success of a data management project should not be measured against achieving an idealistic version of the truth; it should instead be focused on data integration and downstream impacts, he explained. This should be attempted with some degree of pragmatism, according to Serenita, which essentially means transforming data into a format that can be integrated into downstream systems. “Legacy systems make up around 90% of financial institutions’ back offices and this is likely to continue indefinitely as today’s new systems are tomorrow’s legacy systems,” he added.

JPMorgan maintains a golden copy and each downstream system maintains its own interpretation of that data, he told delegates. The central data management team therefore produces the golden copy and keeps that data, while the downstream systems take that golden copy and translate it for their own purposes. Serenita concluded by urging delegates to view technology as an enabler rather than the end game: “The most important thing is to understand your data and how it is used.”
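To make the pattern Serenita describes a little more concrete, the minimal sketch below shows a central golden copy being published once and then translated by each downstream consumer into its own representation. It is an illustration only, not JPMorgan’s actual data model: the record fields, store and downstream formats are hypothetical assumptions.

# Sketch of the "golden copy plus downstream translation" pattern.
# All field names and downstream formats are hypothetical illustrations.

from dataclasses import dataclass


@dataclass(frozen=True)
class GoldenSecurityRecord:
    """Single centrally mastered version of a security's reference data."""
    isin: str
    name: str
    currency: str
    country: str


class GoldenCopyStore:
    """The central data management team's store: one record per security."""

    def __init__(self) -> None:
        self._records: dict[str, GoldenSecurityRecord] = {}

    def publish(self, record: GoldenSecurityRecord) -> None:
        self._records[record.isin] = record

    def get(self, isin: str) -> GoldenSecurityRecord:
        return self._records[isin]


def to_settlement_format(record: GoldenSecurityRecord) -> dict:
    """A downstream settlement system keeps its own interpretation of the data."""
    return {"SEC_ID": record.isin, "CCY": record.currency, "CTRY": record.country}


def to_risk_format(record: GoldenSecurityRecord) -> dict:
    """A downstream risk system translates the same golden record differently."""
    return {"instrument": record.name, "domicile": record.country}


if __name__ == "__main__":
    store = GoldenCopyStore()
    store.publish(GoldenSecurityRecord("GB0002634946", "BAE Systems", "GBP", "GB"))

    golden = store.get("GB0002634946")
    print(to_settlement_format(golden))  # settlement system's view of the golden copy
    print(to_risk_format(golden))        # risk system's view of the same golden copy

The point of the sketch is simply that the golden copy is mastered once, centrally, while each consuming system applies its own translation rather than maintaining its own independent source of the data.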

It seems that Reference Data Review readers are also quite keen on the centralisation of reference data, but on a wider scale. According to our recent poll, 74% of readers are keen for the introduction of a centralised utility for reference data, although only 22% think this can be achieved next year. This should be good news for DClear, the wholly owned subsidiary of the investment arm of the Dubai International Financial Centre (DIFC), which released plans for its reference data utility in May this year. We’ll certainly be monitoring its progress over 2009 very carefully to see whether the 22% are correct in their assumptions.
