A-Team Insight Blogs

Perry Discusses Goldman Sachs’ Creation of a Central Instrument Reference Database for its Global Operations

Goldman Sachs has taken a step-by-step approach to developing a centralised instrument reference database to support its global operations, according to Jim Perry, vice president of product data quality at the investment bank. Speaking at FIMA in London earlier this month (in addition to his earlier panel slot on regulation), Perry explained how the firm began with US listed equities and migrated each instrument database from the individual business line level to a centralised repository sitting under the operations function.
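
To make that consolidation pattern concrete, here is a minimal sketch in Python, purely illustrative rather than a description of Goldman’s actual build: per-business-line instrument records are merged into a single central view, keyed on a shared identifier, with conflicting attribute values flagged for a data steward instead of being silently overwritten. The field name isin and the dictionary-shaped records are assumptions for illustration.

```python
from collections import defaultdict

def consolidate(feeds):
    """Merge per-business-line instrument records into one central view.

    `feeds` maps a business line name to a list of record dicts. Records
    are keyed on a shared identifier; where two business lines disagree
    on an attribute, the conflict is logged for review, not overwritten.
    """
    golden = {}                    # central repository: identifier -> record
    conflicts = defaultdict(list)  # identifier -> list of disagreements
    for line, records in feeds.items():
        for rec in records:
            key = rec["isin"]      # assumes a common identifier across lines
            if key not in golden:
                golden[key] = {**rec, "source": line}
                continue
            for field, value in rec.items():
                existing = golden[key].get(field)
                if existing is not None and existing != value:
                    conflicts[key].append((line, field, existing, value))
    return golden, conflicts
```

Starting with a single, well-understood universe such as US listed equities keeps the resulting conflict queue manageable before the same exercise is repeated region by region.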

The main driver behind the move was the exposure of Goldman’s reference data directly to end clients via the internet, said Perry. The firm began with the US, moved on to Europe and finally tackled its Asia-based operations. “We built upon each success story to tackle the next and tried to take into account the different uses of the data by different functions, such as for client reporting or risk,” said Perry. This global footprint also complicated matters, given the different regulatory regimes in place in each country and the need to meet varying data requirements.

The rationale behind the move to centralise was that the data management function had more knowledge of data quality issues than the front office and other functions, and was therefore better placed to deal with them. “If data is controlled too far downstream, then data quality can suffer,” he contended. “If you are serious about reference data, you need to ring-fence it and put it under the control of a team whose sole function is to ensure quality.”

The data management function currently provides 24/6 coverage and is therefore spread across five locations, each with a technical presence, he explained. The focus was initially on supporting the clearing and settlement function but is now increasingly on pre-trade data support, making the timeliness of data far more important, said Perry. “The time scale is no longer end of day, it is now before trading.”
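
The shift from end-of-day to pre-trade readiness is, in effect, a stricter freshness test on every record. A hedged sketch of such a test, where the market-open cut-off and the last_updated field are assumptions rather than anything Perry described:

```python
from datetime import datetime, time, timezone

MARKET_OPEN = time(8, 0)  # hypothetical cut-off; real cut-offs vary by market

def is_trade_ready(record, now=None):
    """End of day is no longer enough: the record must have been refreshed
    today, before the trading session begins."""
    now = now or datetime.now(timezone.utc)
    updated = record["last_updated"]  # assumed to be a tz-aware datetime
    return updated.date() == now.date() and updated.time() <= MARKET_OPEN
```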

Perry noted that the overall implementation “could have gone better”, as the team had to fill its central repository directly with downstream data without tackling data quality issues first. The downstream data errors took a while to resolve, and he suggested that a vendor solution, rather than an internal build, might have been the easier option overall, giving the team more time to tackle quality issues at the outset rather than importing the impurities into the central repository.
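
That lesson, tackling quality at the outset rather than carrying impurities into the central store, is essentially a validate-before-load pattern. A minimal sketch, with hypothetical checks and field names:

```python
def load_with_quarantine(records, checks):
    """Run quality checks before records reach the central repository.

    Failing records go to an exceptions queue for the data team to fix,
    rather than entering the golden copy and being cleaned up afterwards.
    """
    clean, exceptions = [], []
    for rec in records:
        failed = [name for name, check in checks.items() if not check(rec)]
        if failed:
            exceptions.append((rec, failed))
        else:
            clean.append(rec)
    return clean, exceptions

# Hypothetical checks; the field names are illustrative, not a real schema.
checks = {
    "has_identifier": lambda r: bool(r.get("isin")),
    "known_currency": lambda r: r.get("currency") in {"USD", "EUR", "GBP", "JPY"},
}
```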

As for ongoing challenges, Perry indicated that ensuring data completeness is key to achieving straight-through processing (STP), as is understanding the needs of downstream consumers of the data. The firm has set up a steering committee drawn from the data and IT functions to determine the resources needed for new projects, he explained. “Over time we have been able to turn off legacy systems and downstream consumers now recognise reference data as an asset,” he said.
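
Completeness in the STP sense is consumer-specific: a record complete enough for client reporting may still block settlement. A small sketch of a per-consumer gap check, where the required-field lists are invented for illustration and would in practice come from the downstream teams themselves:

```python
# Invented per-consumer requirements, for illustration only.
REQUIRED_FIELDS = {
    "settlement":       {"isin", "settlement_cycle", "place_of_settlement"},
    "client_reporting": {"isin", "instrument_name", "asset_class"},
    "risk":             {"isin", "issuer", "currency"},
}

def completeness_gaps(record):
    """Report, per downstream consumer, the fields a record is missing.
    A record with no gaps can flow straight through without manual touch."""
    present = {f for f, v in record.items() if v not in (None, "")}
    return {consumer: sorted(required - present)
            for consumer, required in REQUIRED_FIELDS.items()
            if required - present}
```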
