The knowledge platform for the financial technology industry

A-Team Insight Blogs

Perry Discusses Goldman Sachs’ Creation of a Central Instrument Reference Database for its Global Operations


Goldman Sachs has taken a step-by-step approach to developing a centralised instrument reference database to support its global operations, according to Jim Perry, vice president of product data quality at the investment bank. Speaking at FIMA in London earlier this month (in addition to his earlier panel slot on regulation), Perry elaborated on how the firm began with US-listed equities and migrated each instrument database from the individual business line level to a centralised repository sitting under the operations function.

The main driver behind the move was the exposure of Goldman’s reference data directly to end clients via the internet, said Perry. The firm began with the US, moved on to Europe and finally tackled its Asia-based operations. “We built upon each success story to tackle the next and tried to take into account the different uses of the data by different functions such as for client reporting or risk,” said Perry. This global footprint complicated matters, however, due to the different regulatory regimes in place in each country and the need to meet varying data requirements.

The rationale behind the move to centralise was that the data management function had more knowledge than the front office and other functions about data quality issues and was therefore better able to deal with them. “If data is controlled too far downstream, then data quality can suffer,” he contended. “If you are serious about reference data, you need to ring-fence it and put it under the control of a team whose sole function is to ensure quality.”

The data management function currently provides 24/6 coverage and is therefore spread across five locations, each with a technical presence, he explained. The focus was initially on supporting the clearing and settlement function, but is now increasingly on pre-trade data support, so the timeliness of data is far more important, said Perry. “The time scale is no longer end of day, it is now before trading.”

Perry noted that the overall implementation “could have gone better”, as the team populated its central repository directly with downstream data without first tackling data quality issues. The downstream data errors took a while to resolve, and he noted that a vendor solution, rather than an internal build, might have been an easier option overall, giving the team more time to address quality issues at the outset rather than carrying the impurities upstream.

As for ongoing challenges, Perry indicated that data completeness is key to achieving straight-through processing (STP), as is understanding the needs of downstream consumers of the data. The firm has set up a steering committee drawn from the data and IT functions to determine the resources needed for new projects, he explained. “Over time we have been able to turn off legacy systems and downstream consumers now recognise reference data as an asset,” he said.
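Perry’s point about completeness lends itself to a small illustration. Below is a minimal, hypothetical sketch in Python of the kind of completeness gate a reference data team might run before releasing instrument records downstream, so that incomplete records are quarantined rather than breaking STP. The field names and functions here are invented for illustration, not Goldman Sachs’ actual schema or process.

```python
# Hypothetical completeness gate for instrument reference records.
# Field names are illustrative only.
REQUIRED_FIELDS = ["isin", "asset_class", "currency", "settlement_days"]

def completeness_check(record: dict) -> list[str]:
    """Return the required fields that are missing or empty in a record."""
    return [f for f in REQUIRED_FIELDS if not record.get(f)]

def release_to_downstream(records: list[dict]) -> tuple[list[dict], list[dict]]:
    """Split records into those complete enough to release (supporting STP)
    and those quarantined for the data quality team to repair."""
    clean, quarantined = [], []
    for rec in records:
        (quarantined if completeness_check(rec) else clean).append(rec)
    return clean, quarantined
```

In this sketch, an incomplete record never reaches downstream consumers; instead, the quarantine list tells the data quality team exactly which fields to chase, which is the “ring-fenced” control Perry describes.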
