The knowledge platform for the financial technology industry

A-Team Insight Blogs

An Incremental and Practical Approach to Data Integration is Best, Agrees Golden Copy Repository Panel


To keep downstream departments in sync with central data management projects, firms must adopt an incremental and practical approach to data integration, agreed FIMA’s panel on the practicalities of managing a golden copy repository. Peter Giordano, executive director of institutional equities at Oppenheimer & Co, recommended that firms roll out such projects piece by piece rather than take a big bang approach to data management.

Chris Johnson, head of data management, Institutional Fund Services, Europe, for HSBC Securities Services, told delegates to focus on meeting the needs of “downstream consumers” by getting users around the table to discuss their data needs. “You also need to allow for differences across client groups,” he said.

Julia Sutton, global head of customer accounts operations at Citi, agreed that communication with user groups is key: “You need to let them vent and tell you the problems they are experiencing as well as what they want from their data in the future.”

Giordano also contended that time is well spent before a project is launched thinking through all the possible future requirements user groups may have of the data system. He warned delegates that future data requirements may be changed by events such as mergers and acquisitions.

HSBC’s Johnson said that standardisation can only go so far, as there remain complex downstream issues that cannot be standardised. “You should attempt to tackle as much as you can but understand that this cannot be done for every single area,” he said.

Sutton added that the data management endeavour is similar to “constant firefighting” and teams need to “keep digging away at it” to succeed.

