
The seven deadly sins versus the “truth”


Unusually for a financial services conference, this year’s FIMA focused on the negative aspects of capitalism; the topic of the seven deadly sins kept cropping up throughout the sessions – including greed, sloth and gluttony (well, at the buffet table at the very least…).

However, these aspects of the industry were not seen in a wholly negative light, from the data management budgeting perspective at any rate. For example, Keith Hale, executive vice president and co-founder of Netik, opened the second day of FIMA 2008 by quoting Warren Buffett and highlighting that greed and fear have raised the priority of data at financial institutions. “Financial markets are either driven by greed or fear, according to Warren Buffett, and it seems that recent times have flipped the driver from greed to fear. This fear is good for data management projects as they are more likely to get funding due to regulatory pressure,” Hale told delegates.

In the current economic climate, the industry will see a greater focus on reducing operational costs, on automation and on getting reference data right, Hale continued. “Risk management and regulatory requirements are adding pressure on institutions to deal with their data issues. Market consolidation at the pace it is happening today is also a contributing factor as firms need to reduce duplication of processes to reduce costs,” he explained.

Greed and sloth cropped up the next day during the session given by Graeme Austin, product management director at Xtrakter.

He said these two deadly sins are what got the industry into the position it is in today and that, as a result, firms are looking for a more integrated approach to market and reference data to get a better handle on risk.

The only way to tackle this risk and fear is to achieve a version of “truth” via better data management, Peter Serenita, chief data officer at JPMorgan Securities Services, told delegates. Serenita philosophised about the meaning of the word ‘truth’ in a data management context and concluded that although it is not realistic to hope for 100% accuracy of data, 100% of the time, this should not stop data management teams from striving for a high level of data quality across an institution.

The success of a data management project should not be measured by whether it achieves an idealistic version of the truth; rather, it should be focused on data integration and downstream impacts, he explained. This should be attempted with some degree of pragmatism, according to Serenita, which essentially means transforming data into a format that can be integrated into downstream systems. “Legacy systems make up around 90% of financial institutions’ back offices and this is likely to continue indefinitely as today’s new systems are tomorrow’s legacy systems,” he added.

JPMorgan maintains a golden copy and each downstream system maintains its own interpretation of that data, he told delegates. The central data management team therefore produces the golden copy and keeps that data, while the downstream systems take that golden copy and translate it for their own purposes. Serenita concluded by urging delegates to view technology as an enabler rather than the end game: “The most important thing is to understand your data and how it is used.”
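As a rough illustration of the pattern Serenita describes, the sketch below shows a single authoritative “golden copy” record and two downstream consumers deriving their own interpretations of it. The class and field names here are hypothetical, chosen purely for illustration, and do not reflect JPMorgan’s actual systems or data model.

```python
# Minimal sketch of the golden-copy pattern: one central record,
# multiple downstream translations. All names are hypothetical.
from dataclasses import dataclass


@dataclass(frozen=True)
class GoldenCopy:
    """Authoritative record owned by the central data management team."""
    isin: str
    issuer: str
    currency: str
    coupon: float


def to_risk_view(record: GoldenCopy) -> dict:
    """A downstream risk system translates the golden copy into its own shape."""
    return {"id": record.isin, "ccy": record.currency, "cpn_pct": record.coupon * 100}


def to_settlement_view(record: GoldenCopy) -> dict:
    """A settlement system keeps a different interpretation of the same record."""
    return {"security": record.isin, "issuer_name": record.issuer.upper()}


if __name__ == "__main__":
    golden = GoldenCopy(isin="XS0000000000", issuer="Example Issuer",
                        currency="EUR", coupon=0.0425)
    # The central team owns the golden copy; each consumer derives,
    # but never mutates, its own view.
    print(to_risk_view(golden))
    print(to_settlement_view(golden))
```

The point of the sketch is the direction of flow: translation always happens downstream of the single maintained copy, never the other way around.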

It seems that Reference Data Review readers are also keen on the centralisation of reference data, but on a wider scale. According to our recent poll, 74% of readers favour the introduction of a centralised utility for reference data, although only 22% think this can be achieved next year. This should be good news for DClear, the wholly owned subsidiary of the investment arm of the Dubai International Financial Centre (DIFC), which released plans for its reference data utility in May this year. We’ll certainly be monitoring its progress over 2009 to see whether the 22% are correct in their assumptions.

