
Lehman Investigation Indicates Immense Scale of the Data Challenge Due to 350bn Pages of Data and “Arcane Systems”

The recently published examiner’s report into the Lehman bankruptcy indicates the scale of the data challenge faced when winding down a financial institution of its size: the examiner was confronted with three petabytes (the equivalent of some 350 billion pages) of electronically stored data to process. Unsurprisingly, given that the findings needed to be presented rather sooner than the end of the next century, the examiner was only able to collect and process five million of these documents (around 40 million pages, or roughly 0.01% of the total). This challenge was further exacerbated by the storage of this data on “arcane, outdated or non-standard” systems, said the report by Anton Valukas of Jenner & Block.
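
As a rough sanity check on those figures (simple arithmetic on the numbers quoted in the report, nothing more), the reviewed slice works out at just over one hundredth of one per cent:

    # Back-of-the-envelope check of the figures quoted in the examiner's report
    total_pages = 350_000_000_000   # roughly three petabytes of electronically stored data
    reviewed_pages = 40_000_000     # pages in the ~5 million documents actually processed

    share = reviewed_pages / total_pages
    print(f"Share of pages reviewed: {share:.4%}")   # ~0.0114%, i.e. roughly 0.01%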

“The examiner carefully selected a group of document custodians and search terms designed to cull out the most promising subset of Lehman electronic materials for review. In addition, the examiner requested and received hard copy documents from Lehman and both electronic and hard copy documents from numerous third parties and government agencies, including the Department of the Treasury, the Securities and Exchange Commission (SEC), the Federal Reserve, FRBNY, the Office of Thrift Supervision, the SIPA Trustee, Ernst & Young, JPMorgan, Barclays, Bank of America, HSBC, Citibank, Fitch, Moody’s, S&P, and others,” states the report. Quite a list of sources from which to obtain the relevant information.
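
By way of illustration only (the report does not describe the examiner’s actual tooling, so the function, field names and sample records below are hypothetical), culling an electronic document set down to a promising subset by custodian and search term might look something like this:

    # Hypothetical sketch of culling documents by custodian and search term.
    # The schema and sample records are invented for illustration; the report
    # does not describe the tools actually used by the examiner's team.

    def cull_documents(documents, custodians, search_terms):
        """Return documents held by a selected custodian that match any search term."""
        custodians = {c.lower() for c in custodians}
        terms = [t.lower() for t in search_terms]
        selected = []
        for doc in documents:
            if doc["custodian"].lower() not in custodians:
                continue
            text = doc["text"].lower()
            if any(term in text for term in terms):
                selected.append(doc)
        return selected

    # Illustrative usage with made-up records
    documents = [
        {"custodian": "Trader A", "text": "Collateral call on quarter-end positions"},
        {"custodian": "Facilities", "text": "Holiday schedule for the London office"},
    ]
    subset = cull_documents(documents, {"Trader A"}, ["collateral", "valuation"])
    print(len(subset))  # 1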

This data was then reviewed at “two levels”, according to the examiner: first by lawyers, to determine which documents were relevant to the investigation, and then by subject matter experts, to understand the implications of the data contained within them. Given the scale of this challenge, it is understandable why there has been such a focus within the regulatory community on establishing living wills legislation to ensure that such data can be accessed in a timely manner.

Daniel Tarullo, a member of the Board of Governors of the US Federal Reserve System, has been particularly vocal on this subject, and the Lehman investigation certainly lends weight to his proposals to define a list of key data for unwinding purposes. After all, it took 70 contract attorneys to conduct the first-level review of the Lehman data across its operating, trading, valuation, financial, accounting and other data systems: a significant endeavour indeed.

The lack of integration amongst the systems made the examiner’s job even harder, as did the fact that, by the time of the investigation, the majority of the systems had been transferred over to Barclays. “Barclays had integrated its own proprietary and confidential data into some of the systems, so Barclays had legitimate concerns about granting access to those systems,” notes the examiner. This meant that some of the data was only available in a “read-only” format, which made the review and organisation of that data much more difficult, says the report.

However, the more significant hurdle was the “patchwork of over 2,600 software systems and applications” across which the data was held. Instead of learning the ins and outs of each of these systems, the examiner opted to tackle only the “most promising” in terms of finding the correct data and ultimately requested access to 96 of them (a mere drop in the data ocean). This process was also problematic because the systems were “arcane, outdated or non-standard”, as well as being “highly interdependent”. The examiner also notes that the relationships between these systems were “difficult to decipher and not well documented”; just imagine what the data management department was facing every day!
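
To illustrate why such undocumented interdependencies compound the problem (a hypothetical sketch with invented system names, not anything drawn from the report), even a minimal dependency map shows how requesting access to one “promising” system can quietly pull several upstream systems in with it:

    # Hypothetical sketch: modelling undocumented dependencies between systems
    # as a simple graph. The system names and relationships are invented.
    from collections import defaultdict

    dependencies = defaultdict(set)   # system -> systems it depends on

    def add_dependency(system, depends_on):
        dependencies[system].add(depends_on)

    def upstream_of(system, seen=None):
        """All systems a given system depends on, directly or transitively."""
        seen = set() if seen is None else seen
        for dep in dependencies[system]:
            if dep not in seen:
                seen.add(dep)
                upstream_of(dep, seen)
        return seen

    # Illustrative, invented relationships
    add_dependency("finance_ledger", "valuation_engine")
    add_dependency("valuation_engine", "trade_capture")
    add_dependency("valuation_engine", "market_data_store")

    print(upstream_of("finance_ledger"))
    # {'valuation_engine', 'trade_capture', 'market_data_store'} (order will vary):
    # accessing one system implicitly requires access to everything it feeds from.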

As noted recently by Martin Taylor, group chief information officer at LCH.Clearnet, the fact that there was no one left to explain the data or its systems was a challenge in itself. The examiner notes: “Record keeping quickly fell into disarray upon Lehman’s hurried filing. Reconstructing data during this period has proven a challenge not only for the examiner but for all who must rely upon this data in Lehman’s Chapter 11 proceedings.”

As well as providing an insight into the risk management failures of a significant financial institution, the examiner’s report therefore acts as a case in point for the regulatory community with regard to establishing resolution plans. Moreover, it highlights the scale of the data management challenge facing institutions of a similar size to Lehman. Hopefully it will go some way towards strengthening the case for C-level buy-in to a more structured approach to data.
