The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

US FCIC Criticises State of Goldman’s Slow Production of Incomplete and Inaccurate CDO and Customer Data

Following the damning evidence provided by the Lehman examiner report earlier this year, proof yet again of the poor state of the industry’s data management systems comes this month from the US Financial Crisis Inquiry Commission’s (FCIC) recent dealings with Goldman Sachs. The regulatory body, which was established in May last year to examine the causes of the financial crisis, has been investigating data around Goldman’s synthetic and hybrid collateralised debt obligations (CDOs) based on mortgage-backed securities (MBSs) and has criticised the firm’s slow and incomplete provision of the required data.

The FCIC has been forced to issue a subpoena to Goldman after the firm failed to provide the required data concerning its CDOs, its customer names and a log of its dealings with the Senate Permanent Subcommittee on Investigations (PSI). The commission has been requesting this data since a January 2010 hearing and has repeatedly chased the firm in the ensuing months for more information.

The first letter sent out by the FCIC was dated 28 January this year and detailed the list of information to be submitted as a follow-up to the hearing. A deadline of 26 February was set for the required information, largely in Excel format. For example, the requested documents included an Excel file from which the commission wished to “determine Goldman’s customers names from the customer numbers provided in a previous production concerning Goldman derivative transactions”.

Over the subsequent four months, the commission sent Goldman a total of 15 communications before deciding to issue a subpoena for the data. The data Goldman did initially send, two months after the February deadline, was described as “incomplete and inaccurate” by the FCIC, prompting calls for more information.

The commission noted in the following weeks that it was “becoming increasingly concerned with the slow production of documents” and threatened to issue a subpoena. Goldman then claimed it had sent the correct data but the FCIC denied this and noted: “Commission staff did not understand the continual delays and the inability or unwillingness to provide the information requested”.

Finally, Goldman produced five terabytes of documents (essentially a data dump of approximately 2.5 billion pages) in one go. The regulatory body, quite understandably, did not wish to wade through this mass of data and again voiced its “frustration with the failure to produce specifically identified documents and the misleading nature of Goldman’s production thus far”. To make clear to Goldman that a raw data dump was not appropriate, the FCIC provided a spreadsheet to guide the firm in providing the “most pressing information”. Again, the firm sent “an incomplete production”.

Given that it is unlikely Goldman was intentionally trying to land itself in hot water with the regulator, the most likely explanation for this behaviour is the firm’s inability to pull the relevant data from its systems in a timely manner. As with Lehman, providing such a vast quantity of data to the regulator (although 2.5 billion pages does not compare to Lehman’s 350 billion) is of no use when specific information is required.

The increasing prominence of these regulatory investigations and interrogations of specific data items, whether from a risk management or a regulatory reporting perspective, will therefore further raise the profile of a more structured approach to data management. The Goldman case can be added to the long list of recent examples for data managers to use in their efforts to win senior management buy-in for data management projects this year.
