The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

Fed’s Tarullo Once Again Champions Resolution Plans, BCBS Publishes New Recommendations on Subject: Data Challenges in Spotlight

Regular readers of Reference Data Review should be no strangers to the data and practical implementation challenges of resolution, or living wills, regulations that are the talk of the town at the moment. After all, US Federal Reserve governor Daniel Tarullo is just one of the high-profile regulators who have been championing them for some time, and this month has been no different. Adding to Tarullo’s recent speech in New York (more on which later), however, has been the publication of the Basel Committee on Banking Supervision’s (BCBS) report on the subject, including various case studies that highlight the data challenges further.

The Cross-border Bank Resolution Group (CBRG) of the BCBS has produced a new report and recommendations aimed at helping the financial services industry move forward with living wills regulations. The 10 recommendations highlight what the group feels could be used as a framework for the winding down of financial institutions in a cross-border environment. Recommendation 6 in particular highlights the data management challenges for systemically important firms inherent in planning in advance for their “orderly resolution”.

The report notes: “The contingency plans for such institutions should address the practical and concrete steps that could be taken in a crisis or wind-down to preserve functional resiliency of essential business operations. A crucial part of such planning is how to ensure access by supervisors to critical information systems with the data necessary to implement those steps.”

This could spare regulators and investigators the rigmarole that faced appointed examiner Jenner & Block during the winding down and examination of Lehman Brothers. The examiner was confronted with three petabytes, or 350 billion pages, of electronically stored data, held across a “patchwork” of “arcane” systems, which had to be processed in order to judge the risk management failings of the investment bank. The Lehman investigation certainly gives the proposals to determine a list of key data for unwinding purposes legs.

The BCBS report also notes that regulators must ensure the “timely production and sharing of needed information both for purposes of contingency planning during normal times and for crisis management and resolution during times of stress”. This lack of proper joined-up regulatory data sharing is highlighted by the BCBS group as one of the main complications in dealing with the failure of Icelandic bank Kaupthing.

However, in the case of Fortis, which had to be bailed out by the Dutch government during the crisis, data was shared effectively but different assessments were made by Dutch and Belgian authorities, notes the BCBS report. “Differences in the assessment of available information and the sense of urgency complicated the resolution,” it states. So, not only must data be shared, but communication around what this data means should be stronger between regulators.

One of the foundations of effective resolution is therefore having greater data standardisation of entity and instrument data across the industry. Regulators and market participants alike would then have the ability to better track risk exposure while a firm is still functioning and be better able to carry out funeral arrangements if the worst happens. This has been one of the main points of discussion for the Fed’s Tarullo, who once again this month took the stage at a symposium in New York to champion the data standardisation cause.

Tarullo referred to the recent second anniversary of Bear Stearns’ collapse as a reference point for lessons to be learnt, as well as making the usual references to Lehman’s failure. He spoke again about his goal of a resolution regime for large, interconnected firms, which he called “one of the most important financial regulatory reforms for every country that does not already have such a mechanism in place”. Central to the success of such a plan, however, is the “quality of information” available about these firms, which is where data standards come into play.

“A firm would have to inventory all of its legal entities, along with the legal regimes applicable to each one, and map its business lines into legal entities. A firm also would have to document interaffiliate guarantees, funding, hedging, and provision of information technology and other key services. This information would be needed to deal with any crisis, no matter what its specific form,” noted Tarullo.

“Once the centrality of accurate, comprehensive information is understood, it becomes apparent that a very significant upgrade of management information systems (MIS) may be the only way for the firm to satisfy living will requirements, just as we at the Federal Reserve found when we led the Supervisory Capital Assessment Program – popularly known as the bank stress tests – that improved MIS are needed for ongoing risk management at the institution,” he concluded.

Tarullo’s continued championing of this area and the BCBS report are great news for data managers seeking to attain top level buy in to data quality improvement projects. Real challenges exist, and the regulatory community is making sure the industry appreciates just how seriously they should be taken.
