
JPMorgan’s Serenita Debates the Issue of Truth in the Data Management World


The success of a data management project should not be measured by whether it achieves an idealistic version of the “truth”; it should be focused on data integration and downstream impacts, said Peter Serenita, chief data officer at JPMorgan Securities Services.

“Although the integrity and accuracy of the data are important, institutions must focus on distributing and integrating that data into downstream systems,” he told FIMA 2008 delegates this morning. “Legacy systems tend to cause problems, and centralised teams need to work closely with downstream teams to avoid misinterpretation of data.”

Serenita explained that it is not realistic to hope for 100% accuracy of data 100% of the time, but this should not stop data management teams from striving for a high level of data quality across an institution. “It should be looked at as a business problem rather than a data problem – we need to look at how the data that we have spent so much time cleansing and protecting is being used,” he said.

This should be attempted with some degree of pragmatism, according to Serenita, which in practice means transforming data into a format that downstream systems can integrate. “Legacy systems make up around 90% of financial institutions’ back offices and this is likely to continue indefinitely as today’s new systems are tomorrow’s legacy systems,” he added.
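
To make the point concrete, here is a minimal sketch in Python of the kind of pragmatic transformation described above: a cleansed record is reformatted into the fixed-width layout a hypothetical legacy back-office feed might expect. The record fields, column widths and function name are all invented for illustration, not drawn from JPMorgan’s actual systems.

```python
# Hypothetical illustration only: adapting a cleansed "golden" record into
# the fixed-width layout a legacy back-office feed might expect. Field
# names and column widths are invented for this example.

golden_record = {
    "isin": "US4592001014",
    "issuer": "INTL BUSINESS MACHINES",
    "currency": "USD",
}

def to_legacy_fixed_width(record: dict) -> str:
    """Pad each field to the column width the legacy system expects."""
    return (
        record["isin"].ljust(12)
        + record["issuer"].ljust(30)
        + record["currency"].ljust(3)
    )

print(repr(to_legacy_fixed_width(golden_record)))
# 'US4592001014INTL BUSINESS MACHINES        USD'
```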

JPMorgan maintains a golden copy, and each downstream system maintains its own interpretation of that data, he told delegates. The central data management team produces and owns the golden copy, while the downstream systems take it and translate it for their own purposes. “This has a range of pros and cons; the major downside is a lack of an end-to-end view of data,” he explained.
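
A rough sketch of that pattern, with invented system names and fields rather than JPMorgan’s actual architecture: a centrally produced golden copy, with each downstream consumer applying its own translation.

```python
# Hypothetical sketch: one centrally produced golden copy, with each
# downstream system translating it for its own purposes. Once translated,
# the copies diverge, which is the missing end-to-end view noted above.

GOLDEN_COPY = {
    "US4592001014": {
        "issuer": "INTL BUSINESS MACHINES",
        "currency": "USD",
        "asset_class": "EQUITY",
    },
}

def settlement_view(isin: str) -> dict:
    # The settlement system keeps only the fields it needs.
    record = GOLDEN_COPY[isin]
    return {"id": isin, "ccy": record["currency"]}

def risk_view(isin: str) -> dict:
    # The risk system maps asset_class into its own taxonomy.
    record = GOLDEN_COPY[isin]
    bucket = "EQ" if record["asset_class"] == "EQUITY" else "OTH"
    return {"id": isin, "bucket": bucket}

print(settlement_view("US4592001014"))  # {'id': 'US4592001014', 'ccy': 'USD'}
print(risk_view("US4592001014"))        # {'id': 'US4592001014', 'bucket': 'EQ'}
```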

Serenita concluded by urging delegates to view technology as an enabler rather than the end game: “The most important thing is to understand your data and how it is used.”
