About A-Team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

JPMorgan’s Serenita Debates the Issue of Truth in the Data Management World

The success of a data management project should not be measured by whether it achieves an idealistic version of the “truth”; it should be measured by data integration and downstream impacts, said Peter Serenita, chief data officer at JPMorgan Securities Services.

“Although the integrity and accuracy of the data is important, institutions must focus on distributing and integrating that data into downstream systems,” he told FIMA 2008 delegates this morning. “Legacy systems tend to cause problems and centralised teams need to work closely with downstream teams to avoid misinterpretation of data.”

Serenita explained that it is not realistic to hope for 100% accuracy of data, 100% of the time, but this should not stop data management teams from striving to achieve a high level of data quality across an institution. “It should be looked at as a business problem rather than a data problem – we need to look at how the data that we have spent so much time cleansing and protecting is being used,” he said.

This should be attempted with some degree of pragmatism, according to Serenita, which essentially means that data is transformed into a format that can be integrated into downstream systems. “Legacy systems make up around 90% of financial institutions’ back offices and this is likely to continue indefinitely as today’s new systems are tomorrow’s legacy systems,” he added.

JPMorgan maintains a golden copy and each downstream system maintains its own interpretation of that data, he told delegates. The central data management team therefore produces the golden copy and keeps that data, while the downstream systems take that golden copy and translate it for their own purposes. “This has a range of pros and cons; the major downside is a lack of an end-to-end view of data,” he explained.
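The pattern Serenita describes can be sketched as follows. This is a minimal illustration only, not JPMorgan's implementation: all record fields, system names, and translation functions here are hypothetical, chosen purely to show a central golden copy being translated into system-specific views downstream.

```python
# Hypothetical golden-copy record published by a central data management team.
golden_copy = {
    "instrument_id": "XS0000000001",
    "issuer": "Example Corp",
    "coupon_pct": 4.25,
}

def to_settlement_view(record):
    # A settlement system keeps only the fields it needs, under its own names.
    return {"isin": record["instrument_id"], "rate": record["coupon_pct"]}

def to_risk_view(record):
    # A risk system applies its own interpretation of the same data,
    # e.g. storing the coupon as a fraction rather than a percentage.
    return {"id": record["instrument_id"], "coupon": record["coupon_pct"] / 100}

settlement = to_settlement_view(golden_copy)
risk = to_risk_view(golden_copy)
```

Each downstream view is internally consistent, but because every system holds its own translation, no single record gives an end-to-end picture of the data, which is exactly the downside Serenita flags.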

Serenita concluded by urging delegates to view technology as an enabler rather than the end game: “The most important thing is to understand your data and how it is used.”
