About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

JPMorgan’s Serenita Debates the Issue of Truth in the Data Management World


The success of a data management project should not be measured by whether it achieves an idealistic version of the “truth”; instead, it should focus on data integration and downstream impacts, said Peter Serenita, chief data officer at JPMorgan Securities Services.

“Although the integrity and accuracy of the data is important, institutions must focus on distributing and integrating that data into downstream systems,” he told FIMA 2008 delegates this morning. “Legacy systems tend to cause problems and centralised teams need to work closely with downstream teams to avoid misinterpretation of data.”

Serenita explained that it is not realistic to hope for 100% accuracy of data, 100% of the time, but this should not stop data management teams from striving for a high level of data quality across an institution. “It should be looked at as a business problem rather than a data problem – we need to look at how the data that we have spent so much time cleansing and protecting is being used,” he said.

This should be attempted with some degree of pragmatism, according to Serenita, meaning that data is transformed into formats that downstream systems can actually integrate. “Legacy systems make up around 90% of financial institutions’ back offices and this is likely to continue indefinitely as today’s new systems are tomorrow’s legacy systems,” he added.

JPMorgan maintains a golden copy and each downstream system maintains its own interpretation of that data, he told delegates. The central data management team therefore produces the golden copy and keeps that data, while the downstream systems take that golden copy and translate it for their own purposes. “This has a range of pros and cons; the major downside is a lack of an end-to-end view of data,” he explained.
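The golden-copy pattern Serenita describes can be sketched in a few lines: a central team publishes one authoritative record, and each downstream system applies its own translation rather than editing the source. This is purely illustrative, not JPMorgan’s implementation; the class and function names (`GoldenCopy`, `settlement_view`, `risk_view`) and the fields are hypothetical.

```python
from dataclasses import dataclass

@dataclass(frozen=True)  # frozen: downstream consumers cannot mutate the golden copy
class GoldenCopy:
    isin: str
    issuer: str
    currency: str
    coupon: float  # stored as a decimal fraction, e.g. 0.045 = 4.5%

def settlement_view(record: GoldenCopy) -> dict:
    """A downstream settlement system keeps only the fields it needs."""
    return {"id": record.isin, "ccy": record.currency}

def risk_view(record: GoldenCopy) -> dict:
    """A risk system re-expresses the coupon in basis points."""
    return {"id": record.isin, "coupon_bps": round(record.coupon * 10_000)}

bond = GoldenCopy(isin="XS0000000000", issuer="Example Plc",
                  currency="EUR", coupon=0.045)
print(settlement_view(bond))  # each system holds its own interpretation of the data
print(risk_view(bond))
```

The trade-off Serenita notes falls out of the structure: each view is fit for its consumer, but because every system holds a different translation, no single end-to-end picture of the data exists anywhere.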

Serenita concluded by urging delegates to view technology as an enabler rather than the end game: “The most important thing is to understand your data and how it is used.”
