The knowledge platform for the financial technology industry

A-Team Insight Blogs

US FCIC Criticises Goldman’s Slow Production of Incomplete and Inaccurate CDO and Customer Data


Following the damning evidence provided by the Lehman examiner report earlier this year, further proof of the poor state of the industry’s data management systems comes this month from the US Financial Crisis Inquiry Commission’s (FCIC) recent dealings with Goldman Sachs. The commission, which was established in May last year to examine the causes of the financial crisis, has been investigating data around Goldman’s synthetic and hybrid collateralised debt obligations (CDOs) based on mortgage-backed securities (MBSs) and has criticised the firm’s slow and incomplete provision of the required data.

The FCIC has been forced to issue a subpoena to Goldman because the firm has so far failed to provide the required data concerning its CDOs, its customer names and a log of its dealings with the Senate Permanent Subcommittee on Investigations (PSI). The commission has been requesting this data since a January 2010 hearing and has repeatedly contacted the firm since then to ask for more information.

The first letter sent by the FCIC was dated 28 January this year and detailed the information to be submitted as a follow-up to the hearing. A deadline of 26 February was set for the required information, largely in Excel format, to be provided. For example, the documents requested included an Excel document by which the commission wished to “determine Goldman’s customers names from the customer numbers provided in a previous production concerning Goldman derivative transactions”.

Over the subsequent four months, the commission sent a total of 15 communications to Goldman before deciding to issue a subpoena for the data. The data Goldman initially sent, two months after the February deadline, was described as “incomplete and inaccurate” by the FCIC, prompting requests for more information.

The commission noted in the following weeks that it was “becoming increasingly concerned with the slow production of documents” and threatened to issue a subpoena. Goldman then claimed it had sent the correct data, but the FCIC denied this and noted: “Commission staff did not understand the continual delays and the inability or unwillingness to provide the information requested”.

Finally, Goldman produced five terabytes of documents (basically a data dump of approximately 2.5 billion pages) in one go. The regulatory body, quite understandably, did not wish to have to wade through this data and again voiced its “frustration with the failure to produce specifically identified documents and the misleading nature of Goldman’s production thus far”. To make sure Goldman was aware that a raw data dump was not appropriate, the FCIC provided a spreadsheet to guide the firm in providing the “most pressing information”. Again, the firm sent “an incomplete production”.

Given that it is unlikely Goldman was intentionally trying to get itself into hot water with the regulator, the most likely explanation for this behaviour is the firm’s inability to pull the relevant data from its systems in a timely manner. As with Lehman, the provision of such a vast quantity of data (although 2.5 billion pages doesn’t compare to Lehman’s 350 billion) to the regulator is of no use when specific information is required.

The increasing prominence of these regulatory investigations and interrogations of specific data items, whether from a risk management or a regulatory reporting perspective, will further raise the profile of a more structured approach to data management. The Goldman case can therefore be added to the long list of recent examples for data managers to use in their endeavours to win senior management buy-in for data management projects this year.
