US FCIC Criticises Goldman’s Slow Production of Incomplete and Inaccurate CDO and Customer Data

Following the damning evidence provided by the Lehman examiner report earlier this year, proof of the poor state of the industry’s data management systems comes yet again this month, this time from the US Financial Crisis Inquiry Commission’s (FCIC) recent dealings with Goldman Sachs. The regulatory body, which was established in May last year to examine the causes of the financial crisis, has been investigating data around Goldman’s synthetic and hybrid collateralised debt obligations (CDOs) based on mortgage-backed securities (MBSs) and has criticised the firm’s slow and incomplete provision of the required data.

The FCIC has been forced to issue a subpoena to Goldman because the firm has thus far failed to provide the required data concerning its CDOs, its customer names and a log of its dealings with the Senate Permanent Subcommittee on Investigations (PSI). The commission has been requesting this data since a January 2010 hearing and has repeatedly written to the firm in the intervening months asking it to provide more information.

The first letter sent out by the FCIC was dated 28 January this year and detailed the list of information to be submitted as a follow-up to the hearing, largely in Excel format, with a deadline of 26 February. For example, the requested documents included an Excel file from which the commission wished to “determine Goldman’s customers names from the customer numbers provided in a previous production concerning Goldman derivative transactions”.
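
For data managers, the lookup the FCIC describes is a straightforward reference-data join. The sketch below is illustrative only: the file names and column headings are invented, since the actual layout of Goldman’s production is not public. It shows how customer numbers in a transaction extract might be resolved against a customer master file using Python’s pandas library.

```python
# A minimal sketch with hypothetical files and columns; the real
# layout of the Excel documents requested by the FCIC is not public.
import pandas as pd

# Transaction extract keyed only by an anonymous customer number
# (assumed columns: customer_number, trade_id, notional)
transactions = pd.read_excel("derivative_transactions.xlsx")

# Customer master mapping numbers to legal entity names
# (assumed columns: customer_number, customer_name)
customers = pd.read_excel("customer_master.xlsx")

# Resolve names; a left join keeps unmatched trades visible
resolved = transactions.merge(customers, on="customer_number", how="left")

# Flag customer numbers with no name on file, the kind of gap the
# FCIC characterised as an "incomplete and inaccurate" production
missing = resolved[resolved["customer_name"].isna()]
print(f"{len(missing)} transactions have no resolvable customer name")

resolved.to_excel("transactions_with_names.xlsx", index=False)
```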

Over the subsequent four months, the commission sent a total of 15 communications to Goldman before deciding to issue a subpoena for the data. The data Goldman initially sent, two months after the February deadline, was described as “incomplete and inaccurate” by the FCIC, prompting further requests for information.

The commission noted in the following weeks that it was “becoming increasingly concerned with the slow production of documents” and threatened to issue a subpoena. Goldman then claimed it had sent the correct data, but the FCIC denied this and noted: “Commission staff did not understand the continual delays and the inability or unwillingness to provide the information requested”.

Finally, Goldman produced five terabytes of documents (basically a data dump of approximately 2.5 billion pages) in one go. The regulatory body, quite understandably, did not wish to have to wade through this data and again voiced its “frustration with the failure to produce specifically identified documents and the misleading nature of Goldman’s production thus far”. To make sure Goldman was aware that a raw data dump was not appropriate, the FCIC provided a spreadsheet to guide the firm in providing the “most pressing information”. Again, the firm sent “an incomplete production”.

Given that it is unlikely Goldman was intentionally trying to get itself into hot water with the regulator, the most likely explanation for this behaviour is the firm’s inability to pull the relevant data from its systems in a timely manner. As with Lehman, the provision of such a vast quantity of data to the regulator (although 2.5 billion pages doesn’t compare to Lehman’s 350 billion) is no use when specific information is required.
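
The contrast between a raw dump and a targeted production is easy to illustrate. The sketch below uses an invented SQLite schema purely for illustration; Goldman’s actual systems and data model are unknown. The point is the shape of the request a well-organised data platform should be able to answer in minutes: specific deals and specific fields, not five terabytes of everything.

```python
# A minimal sketch against an invented schema; the table and column
# names (cdo_deals, deal_type, collateral_class) are assumptions.
import sqlite3

conn = sqlite3.connect("deals.db")

# Pull only the synthetic and hybrid CDOs referencing MBS
# collateral, with the fields a regulator actually asked about
query = """
    SELECT deal_name, deal_type, closing_date, notional, counterparty
    FROM cdo_deals
    WHERE deal_type IN ('synthetic', 'hybrid')
      AND collateral_class = 'MBS'
    ORDER BY closing_date
"""
for row in conn.execute(query):
    print(row)

conn.close()
```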

The increasing prominence of these regulatory investigations and interrogations of specific data items, be it from a risk management or a regulatory reporting perspective, will therefore further raise the profile of a more structured approach to data management. The Goldman case can thus be added to the long list of recent examples for data managers to use in their endeavours to get senior management buy-in to data management projects this year.
