Geithner Testifies on Lehman Failures in Front of US House Financial Services Committee, Again

For those of you who may have been asleep at the back of the room for the last two years, US regulators are doing their level best to hammer home the lessons learned from the failure of Lehman, not least of which is the need to better track entity data across groups. This week it was the turn of US Treasury Secretary Tim Geithner to testify before the House Financial Services Committee on the subject and, again, dwell on the catalogue of difficulties regulators faced during the unwinding process.

Geithner pointed to a critical shortcoming of the system: the government was unable to “wind-down and liquidate a complex, interconnected financial firm in an orderly way”. He described the “profoundly disruptive” process of trying to pick apart a tangled mess of financial data against a background of widespread financial panic. “It magnified the dimensions of the financial crisis, requiring a greater commitment of government resources than might otherwise have been required. Without better tools to wind down firms in an orderly manner, we are left with no good options,” he told the committee.

This message has been repeated endlessly over the last 18 months and has intensified since the release of the examiner’s report on the subject last month. That report has in turn led to the inclusion of key data-related considerations in the US Financial Reform Legislation Bill, which features something resembling a data utility for regulatory purposes: it proposes the establishment of an Office of Financial Research to monitor and collect key reference data items from across the industry, including both entity and instrument information.

The Office of Financial Research would include a data centre charged with the data collection legwork: collecting, validating and maintaining all the industry data required for the monitoring of systemic risk. It is seemingly based on the idea behind the National Institute of Finance (NIF), which has been championed by a number of academics in the US markets and by the EDM Council. Much like the NIF, it proposes to establish a data research and analysis centre to monitor systemic risk on behalf of the wider regulatory community.
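
To make the data centre’s role a little more concrete, here is a minimal sketch of what collecting and validating linked entity and instrument reference data might involve. The Bill specifies no such detail, so every field name, record layout and validation rule below is a hypothetical illustration:

```python
from dataclasses import dataclass
from typing import List, Optional

# Hypothetical illustration only: the Bill does not specify record layouts.
# Field names, identifiers and validation rules here are invented for clarity.

@dataclass
class EntityRecord:
    entity_id: str            # standardised legal entity identifier
    legal_name: str
    parent_id: Optional[str]  # link into the group's legal entity hierarchy

@dataclass
class InstrumentRecord:
    instrument_id: str        # standardised instrument identifier
    issuer_id: str            # must resolve to a known entity

def validate(entities: List[EntityRecord],
             instruments: List[InstrumentRecord]) -> List[str]:
    """Check referential integrity across the two reference data sets."""
    known = {e.entity_id for e in entities}
    errors = []
    for e in entities:
        if e.parent_id is not None and e.parent_id not in known:
            errors.append(f"{e.entity_id}: unknown parent {e.parent_id}")
    for i in instruments:
        if i.issuer_id not in known:
            errors.append(f"{i.instrument_id}: unknown issuer {i.issuer_id}")
    return errors
```

The cross-checks are the point: instrument data is only useful for systemic risk monitoring if its issuer links cleanly into a maintained entity hierarchy, which is exactly the kind of linkage regulators struggled to reconstruct during the Lehman unwinding.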

However, the provisions within the Bill relate to maintaining a data repository for systemic risk measurement at a regulatory level, rather than establishing a global reference data utility. Although regulatory provisions around data standardisation for reporting purposes would drive some level of harmonisation across the US market, they would fall short of ensuring that firms actually use these standardised identifiers internally and amongst themselves.
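
The gap is easiest to see with a toy example. If firms keep proprietary codes for the same counterparty and only map them to a standard identifier at the reporting boundary, aggregation depends entirely on that mapping. All codes, names and figures below are invented for illustration:

```python
from collections import defaultdict

# Invented example: three firms report exposures to the same counterparty
# under their own proprietary codes. None of these identifiers are real.
crossref = {
    "FIRM_A:LB123": "ENTITY-0001",
    "FIRM_B:0007":  "ENTITY-0001",
    "FIRM_C:LEH":   "ENTITY-0001",
}

reported_exposures = [
    ("FIRM_A:LB123", 50.0),
    ("FIRM_B:0007", 120.0),
    ("FIRM_C:LEH", 30.0),
]

totals = defaultdict(float)
for code, amount in reported_exposures:
    # Any proprietary code missing from the cross-reference falls through
    # unaggregated, fragmenting the regulator's view of a single counterparty.
    totals[crossref.get(code, code)] += amount

print(dict(totals))  # {'ENTITY-0001': 200.0}
```

Mandating standardised identifiers only at the reporting boundary leaves that cross-reference table as a permanent maintenance burden on the regulator’s side; if firms adopted the identifiers internally and amongst themselves, the mapping step would largely disappear.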

There is also some concern that the US is forging ahead of the rest of the world on data standardisation. The endeavour could result in yet another set of standards, adding to the complexity of the overall picture.

Certainly the issues surrounding data standardisation are important in the current climate of heightened awareness around risk management, but how much should be left to the market to decide? Does the industry really need or want a central utility to hold essential reference data, or are there viable alternatives? Moreover, is data standardisation really a place the regulator needs to tread, or should progress be left to Darwinian evolution? Should standards be mandatory?

I shall be discussing these topics and many more at tomorrow’s panel discussion on competition and reference data utilities at CorpActions 2010. Check back next week for the lowdown.
