Perry Discusses Goldman Sachs’ Creation of a Central Instrument Reference Database for its Global Operations

Goldman Sachs has taken a step-by-step approach to developing a centralised instrument reference database to support its global operations, according to Jim Perry, vice president of product data quality at the investment bank. Speaking at FIMA in London earlier this month (in addition to his earlier panel slot on regulation), Perry elaborated on how the firm began with US listed equities and migrated each instrument database from the individual business line level to a centralised repository sitting under the operations function.
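
To make the migration pattern concrete, below is a minimal sketch of a golden-copy repository that absorbs per-business-line instrument databases one at a time, in the US-then-Europe-then-Asia sequence Perry described. All class names, fields and example values are illustrative assumptions, not details of Goldman’s actual system.

```python
from dataclasses import dataclass


@dataclass
class InstrumentRecord:
    """One instrument reference record; the fields are illustrative."""
    identifier: str      # e.g. an ISIN or an internal instrument ID
    asset_class: str     # e.g. "US_LISTED_EQUITY"
    exchange: str
    currency: str
    source_system: str   # the business-line database the record came from


class CentralInstrumentRepository:
    """Toy golden-copy store keyed on a single common identifier."""

    def __init__(self) -> None:
        self._records: dict[str, InstrumentRecord] = {}

    def migrate_source(self, records: list[InstrumentRecord]) -> None:
        """Absorb one business-line database into the central repository.

        Last-write-wins is used here for brevity; a production system
        would apply source-precedence and survivorship rules instead.
        """
        for record in records:
            self._records[record.identifier] = record

    def lookup(self, identifier: str) -> InstrumentRecord | None:
        return self._records.get(identifier)


# Mirroring the sequence described above: US listed equities first.
repo = CentralInstrumentRepository()
repo.migrate_source([InstrumentRecord("US0378331005", "US_LISTED_EQUITY",
                                      "NASDAQ", "USD", "us_equities_db")])
```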

The main driver behind the move was the exposure of Goldman’s reference data directly to end clients via the internet, said Perry. The firm began with the US, moved on to Europe and finally tackled its Asia-based operations. “We built upon each success story to tackle the next and tried to take into account the different uses of the data by different functions such as for client reporting or risk,” said Perry. This global footprint complicated matters, of course, given the different regulatory regimes in place in each country and the need to meet varying data requirements.

The rationale for centralisation was that the data management function knew more about data quality issues than the front office and other functions, and was therefore better placed to deal with them. “If data is controlled too far downstream, then data quality can suffer,” he contended. “If you are serious about reference data, you need to ring-fence it and put it under the control of a team whose sole function is to ensure quality.”
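
As an illustration of the ring-fencing idea, the following sketch puts validation checks at the boundary of the golden copy, so only records that pass the quality team’s rules are admitted. The field names and rules are assumptions made for the example; real controls would be far richer.

```python
def validate(record: dict) -> list[str]:
    """Return the data quality problems found in one instrument record.

    An empty list means the record is clean. The rules are stand-ins
    for the checks a dedicated quality team might enforce.
    """
    problems: list[str] = []
    if len(record.get("identifier", "")) != 12:   # ISIN-style length check
        problems.append("identifier is not a 12-character ISIN")
    if record.get("currency") not in {"USD", "EUR", "GBP", "JPY", "HKD"}:
        problems.append("unrecognised or missing currency")
    if not record.get("exchange"):
        problems.append("missing exchange")
    return problems


def ring_fenced_load(records: list[dict], golden_copy: dict) -> list[dict]:
    """Admit only clean records; hand the rest back for remediation."""
    rejects = []
    for record in records:
        if validate(record):
            rejects.append(record)
        else:
            golden_copy[record["identifier"]] = record
    return rejects
```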

The data management function currently provides 24/6 coverage and is therefore spread across five locations, each with a technical presence, he explained. The focus was initially on supporting the clearing and settlement function but is now increasingly on pre-trade data support, so the timeliness of data has become much more important, said Perry. “The time scale is no longer end of day, it is now before trading.”

Perry noted that the overall implementation “could have gone better”, as the team had to populate its central repository directly with the downstream data without tackling data quality issues first. The downstream data errors took a while to resolve, and he suggested that a vendor solution, rather than an internal build, might have been the easier option overall, giving the team more time to tackle quality issues at the outset rather than carrying the impurities upstream.

As for ongoing challenges, Perry indicated that data completeness is key to achieving straight-through processing (STP), as is understanding the needs of downstream consumers of the data. The firm has set up a steering committee drawn from the data and IT functions to determine the resources needed for new projects, he explained. “Over time we have been able to turn off legacy systems and downstream consumers now recognise reference data as an asset,” he said.
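
A hedged sketch of the completeness point: each downstream consumer declares the fields it needs, and a record counts as STP-ready only when every declared need is met. The consumer names and field lists below are hypothetical.

```python
# Each downstream consumer declares the reference data fields it needs.
# Consumer names and field lists here are hypothetical.
CONSUMER_REQUIREMENTS: dict[str, set[str]] = {
    "client_reporting": {"identifier", "asset_class", "currency"},
    "risk": {"identifier", "asset_class", "exchange", "currency"},
    "settlement": {"identifier", "settlement_cycle", "currency"},
}


def is_stp_ready(record: dict) -> bool:
    """A record is STP-ready only when every consumer's needs are met."""
    populated = {field for field, value in record.items() if value}
    return all(needed <= populated for needed in CONSUMER_REQUIREMENTS.values())


def missing_fields(record: dict) -> dict[str, set[str]]:
    """Report the gaps per consumer, e.g. for a completeness dashboard."""
    populated = {field for field, value in record.items() if value}
    return {consumer: needed - populated
            for consumer, needed in CONSUMER_REQUIREMENTS.items()
            if needed - populated}
```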
