Lehman Investigation Indicates Immense Scale of the Data Challenge Due to 350bn Pages of Data and “Arcane Systems”

The recently published examiner’s report into the Lehman bankruptcy indicates the scale of the data challenge faced when winding down a financial institution of its size: the examiner was confronted with three petabytes of electronically stored data to process, the equivalent of some 350 billion pages. Unsurprisingly, given that the findings had to be presented rather sooner than the end of the next century, the examiner was able to collect and process only five million of these documents (around 40 million pages, or roughly 0.01% of the total). The challenge was further exacerbated by the storage of this data on “arcane, outdated or non-standard” systems, says the report by Anton Valukas of Jenner & Block.
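
A quick back-of-envelope calculation bears those figures out. Here is a minimal sketch in Python, using only the numbers quoted above (the bytes-per-page figure is derived here, not stated in the report):

```python
# Sanity check of the figures quoted in the examiner's report.
total_bytes = 3 * 10**15        # three petabytes (decimal)
total_pages = 350 * 10**9       # 350 billion pages
reviewed_pages = 40 * 10**6     # ~40 million pages actually processed

# Implied average size per page (derived, not stated in the report)
print(f"Bytes per page: {total_bytes / total_pages:,.0f}")    # ~8,571

# Share of the corpus the examiner was able to review
print(f"Share reviewed: {reviewed_pages / total_pages:.4%}")  # 0.0114%, i.e. ~0.01%
```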

“The examiner carefully selected a group of document custodians and search terms designed to cull out the most promising subset of Lehman electronic materials for review. In addition, the examiner requested and received hard copy documents from Lehman and both electronic and hard copy documents from numerous third parties and government agencies, including the Department of the Treasury, the Securities and Exchange Commission (SEC), the Federal Reserve, FRBNY, the Office of Thrift Supervision, the SIPA Trustee, Ernst & Young, JPMorgan, Barclays, Bank of America, HSBC, Citibank, Fitch, Moody’s, S&P, and others,” states the report. Quite a list of sources from which to obtain the relevant information.
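
The report does not spell out the mechanics of that culling, but the approach it describes, restricting the corpus to selected custodians and then matching on search terms, amounts to something like the following illustrative sketch (the custodian names and search terms here are hypothetical, not drawn from the report):

```python
# Illustrative sketch of custodian-and-search-term culling, the approach
# the report describes at a high level. Custodians and terms below are
# hypothetical examples only.
from dataclasses import dataclass

@dataclass
class Document:
    custodian: str  # the employee whose mailbox or file share held the document
    text: str

CUSTODIANS = {"cfo", "treasurer", "head_of_repo_desk"}   # hypothetical
SEARCH_TERMS = ("collateral", "liquidity", "valuation")  # hypothetical

def cull(corpus: list[Document]) -> list[Document]:
    """First-pass cull: keep documents from selected custodians that match a term."""
    return [
        doc for doc in corpus
        if doc.custodian in CUSTODIANS
        and any(term in doc.text.lower() for term in SEARCH_TERMS)
    ]
```

In e-discovery terms this is a standard first-pass cull: cheap to run across a huge corpus, at the cost of missing anything the chosen custodians and terms fail to capture.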

This data was then reviewed at “two levels”, according to the examiner: first by lawyers, to determine which documents were relevant to the investigation, and then by subject matter experts, to understand the implications of the data they contained. Given the scale of this challenge, it is understandable why the regulatory community has focused so heavily on establishing living wills legislation to ensure that this data is accessible in a timely manner.

Daniel Tarullo, a member of the board of governors of the US Federal Reserve System, has been particularly vocal on this subject, and the Lehman investigation certainly lends weight to his proposal that firms maintain a defined list of key data for unwinding purposes. After all, it took 70 contract attorneys to conduct the first level review of the Lehman data across its operating, trading, valuation, financial, accounting and other data systems: a significant endeavour indeed.

The lack of integration among the systems made the examiner’s job even harder, as did the fact that, by the time of the investigation, the majority of the systems had been transferred to Barclays. “Barclays had integrated its own proprietary and confidential data into some of the systems, so Barclays had legitimate concerns about granting access to those systems,” notes the examiner. This meant that some of the data was only available in a “read-only” format, which made the review and organisation of that data much more difficult, says the report.

The more significant hurdle, however, was the “patchwork of over 2,600 software systems and applications” across which the data was held. Rather than learning the ins and outs of each of these systems, the examiner opted to tackle only the “most promising” in terms of finding the relevant data, ultimately requesting access to 96 of them (a mere drop in the data ocean). Even this proved problematic: the systems were “arcane, outdated or non-standard”, as well as “highly interdependent”. The examiner also notes that the relationships between these systems were “difficult to decipher and not well documented”; just imagine what the data management department was facing every day!
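
The report does not say how those interdependencies were eventually mapped, but documenting them is at bottom a graph problem. A minimal sketch, with hypothetical system names, of recording which systems feed which and tracing everything downstream of a given source:

```python
# Minimal sketch of documenting system interdependencies as a directed
# graph. System names are hypothetical; the report says only that the
# real relationships were "difficult to decipher and not well documented".
from collections import deque

# edges: system -> systems it feeds data into
FEEDS = {
    "trade_capture": ["valuation", "risk"],
    "valuation": ["general_ledger"],
    "risk": ["general_ledger"],
    "general_ledger": ["regulatory_reporting"],
}

def downstream(system: str) -> set[str]:
    """Every system that directly or indirectly consumes data from `system`."""
    seen, queue = set(), deque([system])
    while queue:
        for nxt in FEEDS.get(queue.popleft(), []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

print(downstream("trade_capture"))
# -> {'valuation', 'risk', 'general_ledger', 'regulatory_reporting'}
```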

As noted recently by Martin Taylor, group chief information officer at LCH.Clearnet, the fact that there was no one left to explain the data or its systems was a challenge in itself. The examiner notes: “Record keeping quickly fell into disarray upon Lehman’s hurried filing. Reconstructing data during this period has proven a challenge not only for the examiner but for all who must rely upon this data in Lehman’s Chapter 11 proceedings.”

As well as providing insight into the risk management failures of a significant financial institution, the examiner’s report therefore acts as a case in point for the regulatory community with regard to establishing resolution plans. Moreover, it highlights the scale of the data management challenge facing institutions of a similar size to Lehman. Hopefully it will go some way towards strengthening the case for C-level buy-in to a more structured approach to data.
