US FCIC Criticises Goldman’s Slow Production of Incomplete and Inaccurate CDO and Customer Data

Following the damning evidence provided by the Lehman examiner report earlier this year, further proof of the poor state of the industry’s data management systems comes this month from the US Financial Crisis Inquiry Commission’s (FCIC) recent dealings with Goldman Sachs. The commission, which was established in May last year to examine the causes of the financial crisis, has been investigating data around Goldman’s synthetic and hybrid collateralised debt obligations (CDOs) based on mortgage-backed securities (MBS), and has criticised the firm’s slow and incomplete provision of the required data.

The FCIC has been forced to issue a subpoena to Goldman because the firm has thus far failed to provide the required data concerning its CDOs, its customer names and a log of its dealings with the Senate Permanent Subcommittee on Investigations (PSI). The commission has been requesting this data since a January 2010 hearing and has repeatedly written to the firm since then asking it to provide more information.

The first letter sent out by the FCIC was dated 28 January this year and detailed the list of information to be submitted as a follow-up to the hearing. A deadline of 26 February was set for the required information, largely in Excel format. For example, the documents requested included an Excel file from which the commission wished to “determine Goldman’s customer names from the customer numbers provided in a previous production concerning Goldman derivative transactions”.
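At its heart, this particular request is a reference data join: resolving the customer numbers in one production against a master list of customer names. A minimal sketch of that kind of lookup in Python, assuming hypothetical file and column names (none of which come from the FCIC’s actual materials), might look like this:

```python
import pandas as pd

# Hypothetical inputs: the file names and column headers below are
# illustrative, not taken from the FCIC's actual request.
trades = pd.read_excel("derivative_transactions.xlsx")    # includes a 'customer_number' column
customers = pd.read_excel("customer_master.xlsx")         # maps 'customer_number' to 'customer_name'

# Resolve each transaction's customer number to a name with a left join,
# keeping unmatched transactions rather than silently dropping them.
resolved = trades.merge(customers, on="customer_number", how="left")

# Rows still missing a name point to gaps in the reference data --
# exactly the kind of incompleteness the FCIC complained about.
unresolved = resolved[resolved["customer_name"].isna()]
print(f"{len(unresolved)} of {len(resolved)} transactions could not be matched to a name")
```

Trivial as the join itself is, it only works if a clean, current customer master exists in the first place, which is precisely where fragmented data management architectures tend to fall down.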

Over the subsequent four months, the commission sent out a total of 15 communications to Goldman before deciding to issue a subpoena for the data. The data that Goldman initially sent, two months after the February deadline, was described as “incomplete and inaccurate” by the FCIC, prompting further requests for information.

The commission noted in the following weeks that it was “becoming increasingly concerned with the slow production of documents” and threatened to issue a subpoena. Goldman then claimed it had sent the correct data, but the FCIC denied this and noted: “Commission staff did not understand the continual delays and the inability or unwillingness to provide the information requested”.

Finally, Goldman produced five terabytes of documents (essentially a data dump of approximately 2.5 billion pages) in one go. The commission, quite understandably, did not wish to wade through this volume of data and again voiced its “frustration with the failure to produce specifically identified documents and the misleading nature of Goldman’s production thus far”. To make sure Goldman was aware that a raw data dump was not appropriate, the FCIC provided a spreadsheet to guide the firm in providing the “most pressing information”. Again, the firm sent “an incomplete production”.
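The guiding spreadsheet effectively turned an open-ended dump into a targeted extraction problem: filter the full document population down to the specifically identified items and account for anything that cannot be found. A minimal sketch of that filtering step in Python, again with hypothetical file and column names, might look as follows:

```python
import pandas as pd

# Hypothetical: the commission's guiding spreadsheet, listing the document
# identifiers it considers most pressing (file and column names are illustrative).
requested = pd.read_excel("fcic_priority_requests.xlsx")["document_id"]

# Hypothetical index of the firm's full document population.
inventory = pd.read_csv("document_inventory.csv")  # columns include 'document_id' and 'path'

# A targeted production keeps only the specifically identified documents,
# instead of shipping the entire multi-terabyte population.
production = inventory[inventory["document_id"].isin(requested)]
production.to_csv("production_batch.csv", index=False)

# Requested items that cannot be located should be reported, not silently
# omitted; silent gaps are what earn the label "incomplete production".
missing = set(requested) - set(inventory["document_id"])
print(f"{len(missing)} requested documents were not found in the inventory")
```

The hard part, of course, is not the filter but maintaining an inventory complete enough to run it against, which is the data management gap the episode exposes.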

Given that it is unlikely that Goldman was intentionally trying to get itself into hot water with the commission, the most likely explanation for this behaviour is the firm’s inability to pull the relevant data from its systems in a timely manner. As with Lehman, the provision of such a vast quantity of data (although 2.5 billion pages doesn’t compare to Lehman’s 350 billion) to investigators is of no use when specific information is required.

The increasing prominence of these regulatory investigations and interrogations of specific data items, be it from a risk management or regulatory reporting perspective, will therefore further raise the profile of a more structured approach to data management. The Goldman case can thus be added to the long list of recent examples for data managers to use in their endeavours to get senior management buy-in to data management projects this year.
