The knowledge platform for the financial technology industry

A-Team Insight Blogs

JPMorgan’s Chief Data Officer Peter Serenita Gets to the Hub of the Matter

JPMorgan Chase has taken an “operational hub” approach to centralising reference data for its wholesale business, improving efficiency and reducing costs, with most business units seeing “payback within one to two years”, said Peter Serenita, chief data officer, JPMorgan Worldwide Securities Services, who shared his experience at FIMA 2006 in London earlier this month. The migration of additional business units to the hubs continues gradually, with 14 new business units moving their data operations into the hub infrastructure through 2006.

JPMorgan chose to build its own systems for managing the data five years ago, decommissioning old systems. It created two reference data operational hubs, one in Delaware and one in Mumbai, according to Serenita, who provided food for thought for delegates about the process, from the decision of where to site the hubs and what functions to put in them, through how to handle the migration, to how to measure the results and maximise the cost and operational benefits of such a move.

The choice of Mumbai as the offshore reference data hub for JPMorgan was aligned with a bank-level decision to create a global services centre there, and having such an alignment is vital, Serenita said. “If you choose a site that is not a strategic location, you will face problems with the infrastructure – you need desktop support, for example.” Another key decision to make upfront is whether to migrate the functions as they are and re-engineer them later, or to re-engineer first. “To lift and drop means having to retrain later, but we did that because everyone currently doing the functions we were migrating was already geographically dispersed and organisationally aligned. We did the function migration first, and then gave the people taking on the new roles the additional responsibility of examining those roles going forward. It was a year to eighteen months before we even thought about re-engineering.”

JPMorgan wanted to move client reference data, equity and fixed income trading accounts, security master and universal data such as calendars to the hubs, and started with the “easy ones” of each type, Serenita said. Currently it has migrated around 95 per cent of client data operations, some 70 per cent of security master, and up to 100 per cent of trading accounts and universal data.

While logic might suggest moving all functions to the lowest cost hub, he suggested that business continuity requirements demand a more balanced approach. “If you are trying to recover a function at a site, you can only do that if the function is similar or close to one that already resides there,” he said. “As part of the regular process we failover every month. When these functions resided in the business, they usually had a relatively local continuity plan. But what happens if the region is out? Another thing we got from migration was a fully balanced resiliency plan.”

Serenita also offered some words of caution on metrics and SLAs. “People take the idea of the SLA too far sometimes, and it gets a bad rep,” he said. “An SLA is just about trying to understand what the business requirement is, for example, what accuracy and what level of timeliness do we need? Does the business unit understand the cost factor of data delivered within three minutes or 15?”

Metrics should certainly be applied – “if you can’t measure, you can’t manage” – but they need to evolve over time as the focus shifts from phase one of the process, migrate and stabilise, towards the later phases. There, the emphasis should be on becoming content-centric, making the operational hubs “the go-to experts – the first to get a call”, he said. “If you expand the business knowledge of the team, and expand their roles and responsibilities to become content-focused, this leads to improved data quality.” This means firms can move beyond basic metrics such as volume, turnaround time and process quality to a more sophisticated approach, measuring factors like STP rates.

JPMorgan achieved savings in salary and benefits of 47 per cent, and data quality has improved. “How do I measure that? I get less phone calls,” said Serenita.
