The knowledge platform for the financial technology industry

A-Team Insight Blogs

JPMorgan’s Chief Data Officer Peter Serenita Gets to the Hub of the Matter


JPMorgan Chase has taken an “operational hub” approach to centralising reference data for its wholesale business, improving efficiency and reducing costs, with most business units seeing “payback within one to two years”, said Peter Serenita, chief data officer, JPMorgan Worldwide Securities Services, who shared his experience at FIMA 2006 in London earlier this month. The migration of additional business units to the hubs continues gradually, with 14 new business units moving their data operations into the hub infrastructure through 2006.

JPMorgan chose to build its own systems for managing the data five years ago, decommissioning old systems. It created two reference data operational hubs, one in Delaware and one in Mumbai, according to Serenita, who provided food for thought for delegates about the process, from the decision of where to site the hubs and what functions to put in them, through how to handle the migration, to how to measure the results and maximise the cost and operational benefits of such a move.

The choice of Mumbai as the offshore reference data hub for JPMorgan was aligned with a bank-level decision to create a global services centre there, and having such an alignment is vital, Serenita said. “If you choose a site that is not a strategic location, you will face problems with the infrastructure – you need desktop support, for example.” Another key decision to make upfront is whether to migrate the functions as they are and re-engineer them later, or to re-engineer first. “To lift and drop means having to retrain later, but we did that because everyone currently doing the functions we were migrating was already geographically dispersed and organisationally aligned. We migrated the functions first, and then gave the people taking on the new roles the additional responsibility of examining those roles going forward. It was a year to eighteen months before we even thought about re-engineering.”

JPMorgan wanted to move client reference data, equity and fixed income trading accounts, security master and universal data such as calendars to the hubs, and started with the “easy ones” of each type, Serenita said. Currently it has migrated around 95 per cent of client data operations, some 70 per cent of security master, and up to 100 per cent of trading accounts and universal data.

While logic might suggest moving all functions to the lowest cost hub, he suggested that business continuity requirements demand a more balanced approach. “If you are trying to recover a function at a site, you can only do that if the function is similar or close to one that already resides there,” he said. “As part of the regular process we fail over every month. When these functions resided in the business, they usually had a relatively local continuity plan. But what happens if the region is out? Another thing we got from migration was a fully balanced resiliency plan.”

Serenita also offered some words of caution on metrics and SLAs. “People take the idea of the SLA too far sometimes, and it gets a bad rep,” he said. “An SLA is just about trying to understand what the business requirement is, for example, what accuracy and what level of timeliness do we need? Does the business unit understand the cost factor of data delivered within three minutes or 15?”
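Serenita’s framing of an SLA as a statement of business requirements can be sketched as a simple check. This is an illustrative sketch only: the thresholds, function name and parameters below are hypothetical, not drawn from JPMorgan’s actual agreements.

```python
# Illustrative SLA check: does a data delivery meet the agreed accuracy
# and timeliness thresholds? All names and thresholds are hypothetical.
def meets_sla(accuracy: float, delivery_minutes: float,
              min_accuracy: float = 0.999, max_minutes: float = 15.0) -> bool:
    """Return True if the delivery satisfies both SLA dimensions."""
    return accuracy >= min_accuracy and delivery_minutes <= max_minutes

# A delivery within three minutes passes; the same data 20 minutes late fails,
# which is exactly the cost/timeliness trade-off Serenita asks the business
# to price: does it need three-minute delivery, or is 15 acceptable?
print(meets_sla(0.9995, 3.0))   # within both thresholds
print(meets_sla(0.9995, 20.0))  # too slow for the agreed window
```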

Metrics should certainly be applied – “if you can’t measure, you can’t manage” – but need to evolve over time as the focus shifts from phase one of this process – migrate and stabilise – towards the later phases, where the emphasis should be on becoming content-centric, to make the operational hubs “the go-to experts – the first to get a call”, he said. “If you expand the business knowledge of the team, expand their roles and responsibilities to become content-focused, this leads to improved data quality.” This means firms can shift from applying metrics such as volume, turnaround time and process quality, to taking a more sophisticated approach and measuring factors like STP rates.
JPMorgan achieved savings in salary and benefits of 47 per cent. And data quality has improved. “How do I measure that? I get less phone calls,” said Serenita.
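The shift Serenita describes, from phase-one process metrics such as turnaround time towards content-centric measures such as STP rates, can be sketched as follows. The record layout and figures are invented for illustration and do not reflect JPMorgan’s actual data or systems.

```python
from datetime import datetime, timedelta

# Hypothetical work items: (received, completed, manual_touches).
# A manual touch count of zero means the item went straight through.
records = [
    (datetime(2006, 5, 1, 9, 0),  datetime(2006, 5, 1, 9, 2),  0),
    (datetime(2006, 5, 1, 9, 5),  datetime(2006, 5, 1, 9, 30), 2),
    (datetime(2006, 5, 1, 10, 0), datetime(2006, 5, 1, 10, 1), 0),
]

# Phase-one ("migrate and stabilise") metric: average turnaround time.
turnarounds = [done - received for received, done, _ in records]
avg_turnaround = sum(turnarounds, timedelta()) / len(turnarounds)

# Content-centric metric: STP rate, the share of items needing no manual touch.
stp_rate = sum(1 for *_, touches in records if touches == 0) / len(records)

print(f"avg turnaround: {avg_turnaround}, STP rate: {stp_rate:.0%}")
```

The design point is the one the article makes: turnaround time measures the process, while the STP rate measures the content, since an item only goes straight through when the underlying reference data is right.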
