
A-Team Insight Blogs

JPMorgan’s Serenita Debates the Issue of Truth in the Data Management World


The success of a data management project should not be measured by whether it achieves an idealistic version of the “truth”; it should instead focus on data integration and downstream impacts, said Peter Serenita, chief data officer at JPMorgan Securities Services.

“Although the integrity and accuracy of the data is important, institutions must focus on distributing and integrating that data into downstream systems,” he told FIMA 2008 delegates this morning. “Legacy systems tend to cause problems and centralised teams need to work closely with downstream teams to avoid misinterpretation of data.”

Serenita explained that it is not realistic to hope for 100% accuracy of data, 100% of the time, but this should not stop data management teams from striving to achieve a high level of data quality across an institution. “It should be looked at as a business problem rather than a data problem – we need to look at how the data that we have spent so much time cleansing and protecting is being used,” he said.

This should be approached with a degree of pragmatism, according to Serenita, which in practice means transforming data into a format that can be integrated into downstream systems. “Legacy systems make up around 90% of financial institutions’ back offices and this is likely to continue indefinitely as today’s new systems are tomorrow’s legacy systems,” he added.
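For illustration only, a minimal Python sketch of the kind of transformation Serenita describes: a centrally cleansed record rendered into the fixed-width layout an older back-office system might expect. The record fields and the legacy format are assumptions for the example, not a description of JPMorgan’s systems.

```python
from datetime import date

# Hypothetical golden-copy security record of the kind a central
# data management team might maintain (illustrative field names only).
golden_record = {
    "internal_id": "SEC-000123",
    "isin": "US0378331005",
    "issuer_name": "Apple Inc.",
    "maturity_date": None,          # equities carry no maturity date
    "last_updated": date(2008, 11, 18),
}

def to_legacy_fixed_width(record: dict) -> str:
    """Render a golden-copy record as the fixed-width row a legacy
    back-office feed might expect: padded, upper-cased, date as YYYYMMDD."""
    maturity = record["maturity_date"]
    return (
        record["isin"].ljust(12)
        + record["issuer_name"].upper().ljust(30)
        + (maturity.strftime("%Y%m%d") if maturity else "00000000")
    )

print(to_legacy_fixed_width(golden_record))
```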

JPMorgan maintains a golden copy and each downstream system maintains its own interpretation of that data, he told delegates. The central data management team produces and owns the golden copy, while the downstream systems take it and translate it for their own purposes. “This has a range of pros and cons; the major downside is a lack of an end-to-end view of data,” he explained.
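To make that arrangement concrete, here is a small, purely illustrative sketch of the pattern described above: a central store publishes the golden copy, and a downstream system translates it into its own local view. The class and field names are assumptions for the example, not JPMorgan’s implementation; the comment in the translation step marks where the end-to-end view is lost.

```python
class GoldenCopyStore:
    """Central store owned by the data management team (hypothetical)."""
    def __init__(self):
        self._records = {}

    def publish(self, entity_id: str, record: dict) -> None:
        self._records[entity_id] = record

    def get(self, entity_id: str) -> dict:
        # Hand out a copy so downstream systems cannot mutate the golden copy.
        return dict(self._records[entity_id])


class SettlementSystem:
    """One downstream consumer; it keeps its own translation of the golden copy."""
    def __init__(self, store: GoldenCopyStore):
        self._store = store
        self.local_view = {}

    def refresh(self, entity_id: str) -> None:
        golden = self._store.get(entity_id)
        # Translate into only the fields this system cares about; the
        # end-to-end lineage is lost here, which is the downside noted above.
        self.local_view[entity_id] = {
            "SETTLE_ISIN": golden["isin"],
            "SETTLE_NAME": golden["issuer_name"][:20],
        }


store = GoldenCopyStore()
store.publish("SEC-000123", {"isin": "US0378331005", "issuer_name": "Apple Inc."})
settlement = SettlementSystem(store)
settlement.refresh("SEC-000123")
print(settlement.local_view["SEC-000123"])
```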

Serenita concluded by urging delegates to view technology as an enabler rather than the end game: “The most important thing is to understand your data and how it is used.”
