
JPMorgan’s Chief Data Officer Peter Serenita Gets to the Hub of the Matter

JPMorgan Chase has taken an “operational hub” approach to centralising reference data for its wholesale business, improving efficiencies and reducing costs, with most business units seeing “payback within one to two years”, said Peter Serenita, chief data officer, JPMorgan Worldwide Securities Services, who shared his experience at FIMA 2006 in London earlier this month. The migration of additional business units to the hubs is continuing gradually, with 14 more moving their data operations into the hub infrastructure through 2006.
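To make the payback figure concrete, here is a toy calculation of the kind that sits behind a “payback within one to two years” claim: a one-off migration cost recovered through annual run-rate savings. The numbers are hypothetical, not JPMorgan’s.

    # Hypothetical payback-period calculation for migrating one business
    # unit into a hub: one-off cost recovered through yearly savings.
    migration_cost = 2_500_000   # one-off cost of the migration (illustrative)
    annual_savings = 1_600_000   # run-rate saving per year after migration
    payback_years = migration_cost / annual_savings
    print(f"Payback in {payback_years:.1f} years")  # -> 1.6 years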

JPMorgan chose to build its own data management systems five years ago, decommissioning the legacy systems they replaced. It created two reference data operational hubs, one in Delaware and one in Mumbai, according to Serenita, who gave delegates food for thought on the whole process: where to site the hubs and which functions to put in them, how to handle the migration, and how to measure the results and maximise the cost and operational benefits of such a move.

The choice of Mumbai as the offshore reference data hub for JPMorgan was aligned with a bank-level decision to create a global services centre there, and having such an alignment is vital, Serenita said. “If you choose a site that is not a strategic location, you will face problems with the infrastructure – you need desktop support, for example.” Another key decision to make upfront is whether to migrate the functions as they are and re-engineer them later, or to re-engineer first. “To lift and drop means having to retrain later, but we did that because everyone currently doing the functions we were migrating was already geographically dispersed and organisationally aligned. We did the function migration first, and then gave the people taking on the new roles the additional responsibility of examining those roles going forward. It was a year to eighteen months before we even thought about re-engineering.”

JPMorgan wanted to move client reference data, equity and fixed income trading accounts, security master and universal data such as calendars to the hubs, and started with the “easy ones” of each type, Serenita said. Currently it has migrated around 95 per cent of client data operations, some 70 per cent of security master, and up to 100 per cent of trading accounts and universal data.
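Those coverage figures can be pictured as a simple per-domain migration tracker. A minimal sketch in Python, with the domain names and percentages taken from the article and the structure itself purely illustrative:

    # Migration coverage per reference data domain, per the figures above.
    migration_coverage = {
        "client_reference_data": 0.95,  # ~95% of client data operations
        "security_master": 0.70,        # ~70% migrated so far
        "trading_accounts": 1.00,       # equity and fixed income accounts
        "universal_data": 1.00,         # calendars and other shared data
    }

    for domain, coverage in sorted(migration_coverage.items(), key=lambda kv: kv[1]):
        print(f"{domain:22s} {coverage:.0%} migrated to hubs")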

While logic might suggest moving all functions to the lowest-cost hub, he suggested that business continuity requirements demand a more balanced approach. “If you are trying to recover a function at a site, you can only do that if the function is similar or close to one that already resides there,” he said. “As part of the regular process we fail over every month. When these functions resided in the business, they usually had a relatively local continuity plan. But what happens if the region is out? Another thing we got from migration was a fully balanced resiliency plan.”
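The balancing act Serenita describes – each function runs at a primary hub, with a related capability at the other hub so either site can absorb the other’s work – can be sketched as a small routing table. The hub names reflect the article; the routing pairs and function names are assumptions for illustration.

    # Each function runs at a primary hub and fails over to the other,
    # so a regional outage never takes out both copies of a capability.
    FUNCTION_ROUTING = {
        "client_reference_data": ("delaware", "mumbai"),
        "security_master": ("mumbai", "delaware"),
    }

    def route(function: str, down_hubs: set[str]) -> str:
        primary, secondary = FUNCTION_ROUTING[function]
        if primary not in down_hubs:
            return primary
        if secondary not in down_hubs:
            return secondary
        raise RuntimeError(f"no available hub for {function}")

    # Monthly failover drill: simulate the primary site being out.
    assert route("security_master", down_hubs={"mumbai"}) == "delaware"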

Serenita also offered some words of caution on metrics and SLAs. “People take the idea of the SLA too far sometimes, and it gets a bad rap,” he said. “An SLA is just about trying to understand what the business requirement is – for example, what accuracy and what level of timeliness do we need? Does the business unit understand the cost factor of data delivered within three minutes versus 15?”
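Read that way, an SLA is simply the business requirement made explicit, with its cost attached. A hedged sketch of what capturing one might look like; the field names and figures are illustrative, not JPMorgan’s:

    from dataclasses import dataclass

    @dataclass
    class DataSLA:
        function: str                  # which data service this covers
        target_accuracy: float         # e.g. 0.999 = 99.9% of records correct
        timeliness_minutes: int        # maximum delivery delay after a source update
        indicative_annual_cost: float  # what meeting this tier costs per year

    # Faster delivery costs more; the SLA makes the trade-off visible.
    three_minute_tier = DataSLA("security_master", 0.999, 3, 1_200_000)
    fifteen_minute_tier = DataSLA("security_master", 0.999, 15, 400_000)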

Metrics should certainly be applied – “if you can’t measure, you can’t manage” – but they need to evolve over time as the focus shifts from phase one of the process, migrate and stabilise, towards the later phases, where the emphasis should be on becoming content-centric and making the operational hubs “the go-to experts – the first to get a call”, he said. “If you expand the business knowledge of the team, expand their roles and responsibilities to become content-focused, this leads to improved data quality.” This means firms can move from metrics such as volume, turnaround time and process quality to a more sophisticated approach, measuring factors like STP rates.
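As a small illustration of that shift, an STP (straight-through processing) rate measures the share of records that flow through with no manual touch – a content-quality metric rather than a throughput one. All names and numbers below are hypothetical.

    def stp_rate(total_records: int, manually_touched: int) -> float:
        # Share of records processed with no manual intervention.
        return (total_records - manually_touched) / total_records

    # Phase-one metrics: raw volume and turnaround time.
    daily_volume = 18_000
    avg_turnaround_minutes = 12.5

    # Later-phase metric: data quality expressed as an STP rate.
    print(f"STP rate: {stp_rate(daily_volume, manually_touched=540):.1%}")  # 97.0%
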
JPMorgan achieved savings in salary and benefits of 47 per cent. And data quality has improved. “How do I measure that? I get less phone calls,” said Serenita.
