
JPMorgan’s Serenita Debates the Issue of Truth in the Data Management World

The success of a data management project should not be measured by whether it achieves an idealistic version of the “truth”; it should instead be judged on data integration and downstream impacts, said Peter Serenita, chief data officer at JPMorgan Securities Services.

“Although the integrity and accuracy of the data is important, institutions must focus on distributing and integrating that data into downstream systems,” he told FIMA 2008 delegates this morning. “Legacy systems tend to cause problems and centralised teams need to work closely with downstream teams to avoid misinterpretation of data.”

Serenita explained that it is not realistic to hope for 100% accuracy of data, 100% of the time, but that this should not stop data management teams from striving for a high standard of data quality across an institution. “It should be looked at as a business problem rather than a data problem – we need to look at how the data that we have spent so much time cleansing and protecting is being used,” he said.
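To make that point concrete, here is a minimal sketch of what judging quality by actual usage, rather than against an idealised 100% benchmark, might look like. The field names, usage weights and pass rates are invented for illustration and are not drawn from Serenita’s talk.

```python
# Illustrative only: scoring data quality by how fields are actually used
# downstream, rather than against an idealised 100% benchmark.

# Hypothetical share of downstream consumers that read each field.
usage_weight = {"isin": 0.9, "issuer": 0.6, "lei": 0.1}

# Hypothetical per-field pass rates from validation checks.
pass_rate = {"isin": 0.999, "issuer": 0.97, "lei": 0.80}

# Usage-weighted quality: an error in a heavily used field costs far more
# than the same error rate in a field few downstream systems ever read.
total_weight = sum(usage_weight.values())
score = sum(usage_weight[f] * pass_rate[f] for f in usage_weight) / total_weight
print(f"usage-weighted quality: {score:.3f}")  # ~0.976 for these numbers
```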

This should be approached with a degree of pragmatism, according to Serenita, which in practice means transforming data into formats that downstream systems can actually consume. “Legacy systems make up around 90% of financial institutions’ back offices and this is likely to continue indefinitely, as today’s new systems are tomorrow’s legacy systems,” he added.
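As a rough illustration of that kind of transformation, the sketch below flattens a cleansed central record into the fixed-width line a legacy back-office system might ingest. The record fields and the 80-character layout are hypothetical, not any real institution’s format.

```python
from dataclasses import dataclass


@dataclass
class SecurityRecord:
    """A centrally cleansed reference-data record (illustrative fields only)."""
    isin: str
    issuer: str
    currency: str
    coupon_bps: int  # coupon in basis points, avoiding floats in the feed


def to_legacy_fixed_width(record: SecurityRecord) -> str:
    """Render the record as the 80-character line a legacy system expects."""
    return (
        record.isin.ljust(12)               # 12-char ISIN
        + record.issuer[:40].ljust(40)      # issuer name, truncated to fit
        + record.currency.ljust(3)          # ISO currency code
        + str(record.coupon_bps).rjust(8, "0")  # zero-padded coupon
    ).ljust(80)


record = SecurityRecord("XS0000000000", "EXAMPLE ISSUER PLC", "EUR", 425)
print(repr(to_legacy_fixed_width(record)))
```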

JPMorgan maintains a golden copy, and each downstream system maintains its own interpretation of that data, he told delegates. The central data management team produces the golden copy and holds that data, while the downstream systems take the golden copy and translate it for their own purposes. “This has a range of pros and cons; the major downside is the lack of an end-to-end view of data,” he explained.
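A minimal sketch of that pattern, assuming a simple dictionary-based record; the system names and schemas (settlement_view, risk_view) are hypothetical illustrations, not JPMorgan’s actual systems.

```python
# One central golden copy; each downstream system keeps its own translation.
golden_copy = {
    "isin": "XS0000000000",
    "issuer": "Example Issuer plc",
    "country_of_risk": "GB",
    "currency": "EUR",
}

def settlement_view(gc: dict) -> dict:
    # The settlement system only needs identifiers and currency,
    # renamed to match its own legacy schema.
    return {"SEC_ID": gc["isin"], "CCY": gc["currency"]}

def risk_view(gc: dict) -> dict:
    # The risk system re-keys on issuer and country, discarding the ISIN.
    return {"counterparty": gc["issuer"], "country": gc["country_of_risk"]}

# Each consumer now holds an independent interpretation. Nothing ties the
# two back to the golden copy, which is the lost end-to-end view Serenita
# flags as the major downside of this approach.
print(settlement_view(golden_copy))
print(risk_view(golden_copy))
```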

Serenita concluded by urging delegates to view technology as an enabler rather than the end game: “The most important thing is to understand your data and how it is used.”
