
A-Team Insight Blogs

Perry Discusses Goldman Sachs’ Creation of a Central Instrument Reference Database for its Global Operations


Goldman Sachs has taken a step-by-step approach to developing a centralised instrument reference database to support its global operations, according to Jim Perry, vice president of product data quality at the investment bank. Speaking at FIMA in London earlier this month (in addition to his earlier panel slot on regulation), Perry elaborated on how the firm began with US-listed equities and migrated each instrument database from the individual business line level to a centralised repository sitting under the operations function.
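By way of illustration only, the sketch below shows roughly what a record in such a centralised instrument store might look like and how per-business-line databases could be folded into it. The field names, classes and Python structure are assumptions made for this example; Perry did not describe Goldman's actual schema.

```python
from __future__ import annotations
from dataclasses import dataclass, field

# Illustrative sketch only: field names and structure are assumptions,
# not Goldman Sachs' actual schema. The point is that every downstream
# function reads the same record from one repository.
@dataclass
class InstrumentRecord:
    isin: str                       # primary identifier
    asset_class: str                # e.g. "equity"
    primary_listing: str            # MIC of the primary exchange, e.g. "XNYS"
    region: str                     # rollout region: "US", "EU", "ASIA"
    ticker: str = ""
    currency: str = ""
    attributes: dict = field(default_factory=dict)  # function-specific extras

class CentralInstrumentStore:
    """Single repository replacing per-business-line instrument databases."""

    def __init__(self) -> None:
        self._records: dict[str, InstrumentRecord] = {}

    def migrate(self, source_records: list[InstrumentRecord]) -> None:
        # Each business-line database is folded into the central store,
        # keyed by identifier so all consumers see one golden copy.
        for rec in source_records:
            self._records[rec.isin] = rec

    def get(self, isin: str) -> InstrumentRecord | None:
        return self._records.get(isin)
```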

The main driver behind the move was the exposure of Goldman’s reference data directly to end clients via the internet, said Perry. The firm began with the US, moved on to Europe and finally tackled its Asia-based operations. “We built upon each success story to tackle the next and tried to take into account the different uses of the data by different functions such as for client reporting or risk,” said Perry. This global footprint also complicated matters, given the different regulatory regimes in place in each country and the need to meet various data requirements.

The rationale behind the move to centralise was that the data management function had more knowledge of data quality issues than the front office and other functions, and was therefore better placed to deal with them. “If data is controlled too far downstream, then data quality can suffer,” he contended. “If you are serious about reference data, you need to ring-fence it and put it under the control of a team whose sole function is to ensure quality.”
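A minimal sketch of that ring-fencing idea follows: reference data clears a set of checks owned by a dedicated quality team before anything is published downstream. The rules and field names here are illustrative assumptions, not the specific checks Perry described.

```python
# Hedged sketch of a quality gate owned by a ring-fenced data team:
# downstream consumers only ever see records that have passed the gate,
# and failures are routed back to the team rather than patched downstream.
# REQUIRED_FIELDS and the ISIN length rule are assumptions for illustration.

REQUIRED_FIELDS = ("isin", "asset_class", "primary_listing", "currency")

def quality_gate(record: dict) -> tuple[bool, list[str]]:
    """Return (passed, issues) for a single instrument record."""
    issues = []
    for name in REQUIRED_FIELDS:
        if not record.get(name):
            issues.append(f"missing {name}")
    if record.get("isin") and len(record["isin"]) != 12:
        issues.append("ISIN is not 12 characters")
    return (not issues, issues)

def publish(records: list[dict]) -> tuple[list[dict], list[tuple[dict, list[str]]]]:
    """Split records into those published downstream and those sent back
    to the quality team with their issues attached."""
    published, rejected = [], []
    for rec in records:
        ok, issues = quality_gate(rec)
        if ok:
            published.append(rec)
        else:
            rejected.append((rec, issues))
    return published, rejected
```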

The data management function currently has 24/6 coverage and is therefore spread over five locations, each with a technical presence, he explained. The focus was initially on supporting the clearing and settlement function, but is now increasingly about pre-trade data support, so the timeliness of the data is much more important, said Perry. “The time scale is no longer end of day, it is now before trading.”

Perry noted that the overall implementation “could have gone better”, as the team filled its central repository directly with downstream data without first tackling data quality issues. Those errors took a while to resolve, and he noted that a vendor solution, rather than an internal build, might have been the easier option overall, giving the team more time to address quality at the outset rather than carrying the downstream impurities into the central store.

As for ongoing challenges, Perry indicated that ensuring data completeness is key to achieving STP, as is understanding the needs of downstream consumers of the data. The firm has set up a steering committee drawn from the data and IT functions to determine the resources needed for new projects, he explained. “Over time we have been able to turn off legacy systems and downstream consumers now recognise reference data as an asset,” he said.
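The sketch below illustrates one way a completeness check tied to downstream consumers might work, where a record is STP-ready only when every consumer’s required fields are populated. The consumer names and field lists are hypothetical; Perry did not detail the firm’s actual mapping.

```python
# Hedged sketch: "completeness" here means every field a given downstream
# consumer needs for straight-through processing is populated. The consumer
# names and required fields are illustrative assumptions only.

CONSUMER_REQUIREMENTS = {
    "clearing_settlement": {"isin", "settlement_cycle", "place_of_settlement"},
    "client_reporting":    {"isin", "instrument_name", "currency"},
    "risk":                {"isin", "asset_class", "issuer"},
}

def completeness_report(record: dict) -> dict[str, set[str]]:
    """Map each downstream consumer to the fields it still lacks;
    an empty set for every consumer means the record can flow straight through."""
    return {
        consumer: {f for f in fields if not record.get(f)}
        for consumer, fields in CONSUMER_REQUIREMENTS.items()
    }

# Example: a record missing settlement fields would block STP for clearing
# and settlement while remaining complete enough for reporting and risk.
```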

