JPMorgan’s Serenita Debates the Issue of Truth in the Data Management World

The success of a data management project should not be measured by whether it achieves an idealistic version of the “truth”; it should be focused on data integration and downstream impacts, said Peter Serenita, chief data officer at JPMorgan Securities Services.

“Although the integrity and accuracy of the data are important, institutions must focus on distributing and integrating that data into downstream systems,” he told FIMA 2008 delegates this morning. “Legacy systems tend to cause problems, and centralised teams need to work closely with downstream teams to avoid misinterpretation of data.”

Serenita explained that it is not realistic to hope for 100% accuracy of data, 100% of the time, but this should not stop data management teams from striving to achieve a high level of data quality across an institution. “It should be looked at as a business problem rather than a data problem – we need to look at how the data that we have spent so much time cleansing and protecting is being used,” he said.

This should be attempted with a degree of pragmatism, according to Serenita, which in practice means transforming data into formats that downstream systems can integrate. “Legacy systems make up around 90% of financial institutions’ back offices and this is likely to continue indefinitely, as today’s new systems are tomorrow’s legacy systems,” he added.

JPMorgan maintains a golden copy of its data, and each downstream system maintains its own interpretation of that data, he told delegates. The central data management team produces and owns the golden copy, while the downstream systems take it and translate it for their own purposes. “This has a range of pros and cons; the major downside is the lack of an end-to-end view of data,” he explained.
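As a rough illustration of the pattern Serenita describes (a minimal sketch only, not a description of JPMorgan’s actual systems, and with all class and field names hypothetical), a centrally mastered golden-copy record can be translated independently by each downstream consumer:

```python
# Hypothetical sketch of the golden-copy pattern: one central record,
# each downstream system translating it into its own legacy format.
from dataclasses import dataclass
from datetime import date


@dataclass(frozen=True)
class GoldenCopyRecord:
    """Centrally mastered security reference data (illustrative fields)."""
    isin: str
    issuer: str
    currency: str
    maturity: date


class SettlementAdapter:
    """Downstream consumer: a legacy settlement system with its own format."""

    def translate(self, record: GoldenCopyRecord) -> dict:
        # Legacy system expects terse field names and YYYYMMDD date strings.
        return {
            "SEC_ID": record.isin,
            "CCY": record.currency,
            "MAT_DT": record.maturity.strftime("%Y%m%d"),
        }


class RiskAdapter:
    """Another downstream consumer with a different interpretation of the data."""

    def translate(self, record: GoldenCopyRecord) -> dict:
        # Risk engine keys instruments by issuer and wants ISO dates.
        return {
            "instrument": record.isin,
            "issuer": record.issuer,
            "maturity": record.maturity.isoformat(),
        }


if __name__ == "__main__":
    golden = GoldenCopyRecord(
        isin="US0000000001",
        issuer="Example Corp",
        currency="USD",
        maturity=date(2030, 6, 30),
    )
    # The central team publishes one golden copy; each consumer translates it.
    for adapter in (SettlementAdapter(), RiskAdapter()):
        print(type(adapter).__name__, adapter.translate(golden))
```

The sketch also makes the trade-off visible: because each adapter owns its own translation, no single component retains the end-to-end view of how the data is ultimately used, the downside Serenita notes.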

Serenita concluded by urging delegates to view technology as an enabler rather than the end game: “The most important thing is to understand your data and how it is used.”
