

Lehman Investigation Indicates Immense Scale of the Data Challenge Due to 350bn Pages of Data and “Arcane Systems”


The recently published examiner’s report into the Lehman bankruptcy indicates the scale of the data challenge faced when winding down a financial institution of its size: the examiner was faced with three petabytes (otherwise known as 350 billion pages) of electronically stored data to process. Unsurprisingly, given that the information needed to be presented some time before the end of the next century, the examiner was only able to collect and process five million of these documents (around 40 million pages, or roughly 0.01% of the total). This challenge was further exacerbated by the storage of this data on “arcane, outdated or non-standard” systems, said the report by Anton Valukas of Jenner & Block.
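To put those proportions in context, a quick back-of-the-envelope calculation, using only the figures quoted above (three petabytes, 350 billion pages, five million documents, roughly 40 million pages reviewed), bears out the 0.01% figure:

# Back-of-the-envelope arithmetic using the figures quoted in the examiner's report.
total_pages = 350_000_000_000      # ~350 billion pages, equated to three petabytes
total_petabytes = 3
reviewed_documents = 5_000_000     # documents actually collected and processed
reviewed_pages = 40_000_000        # ~40 million pages

pages_per_petabyte = total_pages / total_petabytes
share_reviewed = reviewed_pages / total_pages

print(f"Roughly {pages_per_petabyte:,.0f} pages per petabyte")
print(f"Pages reviewed: {share_reviewed:.4%} of the total")  # ~0.0114%, i.e. about 0.01%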

“The examiner carefully selected a group of document custodians and search terms designed to cull out the most promising subset of Lehman electronic materials for review. In addition, the examiner requested and received hard copy documents from Lehman and both electronic and hard copy documents from numerous third parties and government agencies, including the Department of the Treasury, the Securities and Exchange Commission (SEC), the Federal Reserve, FRBNY, the Office of Thrift Supervision, the SIPA Trustee, Ernst & Young, JPMorgan, Barclays, Bank of America, HSBC, Citibank, Fitch, Moody’s, S&P, and others,” states the report. Quite a list of sources from which to obtain the relevant information.

This data was then reviewed at “two levels”, according to the examiner: first by lawyers, to determine which documents were relevant to the investigation, and then by subject matter experts, to understand the implications of the data they contained. Given the scale of this challenge, it is easy to see why the regulatory community has focused so heavily on establishing living wills legislation to ensure that such data can be accessed in a timely manner.
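For readers of a more technical bent, the culling and two-level review described above can be pictured as a simple filtering pipeline. The sketch below is purely illustrative (the custodians, search terms and sample documents are invented for the example, not taken from the report), but it shows the shape of the approach: a first-level relevance cull by custodian and search term, followed by a second-level pass flagged for subject matter experts.

# Purely illustrative: a two-level review pipeline in miniature.
# The custodians, search terms and documents below are hypothetical.
from dataclasses import dataclass

@dataclass
class Document:
    custodian: str
    text: str

CUSTODIANS = {"treasury_desk", "valuation_group"}          # hypothetical custodian list
SEARCH_TERMS = {"repo 105", "collateral", "liquidity"}     # hypothetical search terms

def first_level_review(docs):
    """Level one: cull to documents from selected custodians that hit a search term."""
    return [
        d for d in docs
        if d.custodian in CUSTODIANS
        and any(term in d.text.lower() for term in SEARCH_TERMS)
    ]

def second_level_review(docs):
    """Level two: pass relevant documents to subject matter experts for interpretation."""
    return [(d, "needs expert assessment of implications") for d in docs]

docs = [
    Document("treasury_desk", "Quarter-end Repo 105 transactions and collateral posted"),
    Document("facilities", "Office lease renewal"),
]
relevant = first_level_review(docs)   # only the first document survives the cull
assessed = second_level_review(relevant)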

Daniel Tarullo, a member of the Board of Governors of the US Federal Reserve System, has been particularly vocal on this subject, and the Lehman investigation certainly lends weight to his proposal to determine a list of key data for unwinding purposes. After all, it took 70 contract attorneys to conduct the first level review of the Lehman data across its operating, trading, valuation, financial, accounting and other data systems: a significant endeavour indeed.

The lack of integration amongst the systems made the examiner’s job even harder, as did the fact that, by the time of the investigation, the majority of the systems had been transferred to Barclays. “Barclays had integrated its own proprietary and confidential data into some of the systems, so Barclays had legitimate concerns about granting access to those systems,” notes the examiner. This meant that some of the data was only available in a “read-only” format, which made the review and organisation of that data much more difficult, says the report.

However, the more significant hurdle was the “patchwork of over 2,600 software systems and applications” across which the data was held. Rather than learning the ins and outs of each of these systems, the examiner opted to tackle only the “most promising” in terms of finding the correct data and ultimately requested access to 96 of them (a mere drop in the data ocean). This process was also problematic because these systems were “arcane, outdated or non-standard”, as well as “highly interdependent”. The examiner also notes that the relationships between these systems were “difficult to decipher and not well documented”; just imagine what the data management department was facing every day!
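Deciphering undocumented interdependencies of that kind is, at heart, a dependency-mapping exercise. As a purely illustrative sketch (the system names and links below are invented, not drawn from the report), one way to record which systems feed which, and to work out everything that must be accessible before a given system’s data makes sense:

# Illustrative only: system names and dependencies are invented, not from the report.
from collections import deque

# Directed edges: system -> systems it depends on for its data
DEPENDENCIES = {
    "general_ledger": ["trade_capture", "valuation_engine"],
    "valuation_engine": ["market_data_store"],
    "trade_capture": [],
    "market_data_store": [],
}

def upstream_systems(target):
    """Breadth-first walk to find every system 'target' ultimately depends on."""
    seen, queue = set(), deque(DEPENDENCIES.get(target, []))
    while queue:
        system = queue.popleft()
        if system not in seen:
            seen.add(system)
            queue.extend(DEPENDENCIES.get(system, []))
    return seen

print(upstream_systems("general_ledger"))
# {'trade_capture', 'valuation_engine', 'market_data_store'}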

As noted recently by Martin Taylor, group chief information officer at LCH.Clearnet, the fact that there was no one left to explain the data or its systems was a challenge in itself. The examiner notes: “Record keeping quickly fell into disarray upon Lehman’s hurried filing. Reconstructing data during this period has proven a challenge not only for the examiner but for all who must rely upon this data in Lehman’s Chapter 11 proceedings.”

As well as providing an insight into the risk management failures of a significant financial institution, the examiner’s report therefore acts as a case in point for the regulatory community with regard to establishing resolution plans. Moreover, it highlights the scale of the data management challenge facing institutions of a similar size to Lehman. Hopefully it will go some way towards strengthening the case for C-level buy-in to a more structured approach to data.

