

Geithner Testifies on Lehman Failures in Front of US House Financial Services Committee, Again


For those of you who may have been asleep at the back of the room for the last two years, US regulators are doing their level best to hammer home the lessons learned from the failure of Lehman, not least the need to better track entity data across groups. This week it was the turn of US Treasury Secretary Tim Geithner to testify before the House Financial Services Committee on the subject and, again, dwell on the catalogue of difficulties regulators faced during the unwinding process.

Geithner pointed to the system’s critical shortcoming: the government was unable to “wind-down and liquidate a complex, interconnected financial firm in an orderly way”. He described the “profoundly disruptive” process of trying to pick apart a tangled mess of financial data against a backdrop of widespread financial panic. “It magnified the dimensions of the financial crisis, requiring a greater commitment of government resources than might otherwise have been required. Without better tools to wind down firms in an orderly manner, we are left with no good options,” he told government officials.

This is a message that has been repeated endlessly over the last 18 months and has intensified since the release of the examiner’s report on the subject last month. It has in turn led to the inclusion of key data-related considerations in the US Financial Reform Legislation Bill, which features something resembling a data utility for regulatory purposes. Accordingly, it proposes the establishment of an Office of Financial Research to monitor and collect key reference data items from across the industry, including both entity and instrument information.

The Office of Financial Research would include a data centre charged with the data collection legwork: collecting, validating and maintaining all the industry data required to monitor systemic risk. It is seemingly based on the idea behind the National Institute of Finance (NIF), which has been championed by a number of academics in the US markets and by the EDM Council. Much like the NIF, it proposes to establish a data research and analysis centre to monitor systemic risk on behalf of the wider regulatory community.
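To make the idea a little more concrete, the sketch below shows the kind of basic checks such a data centre might run on an incoming entity record before accepting it into the repository. It is purely illustrative: the record fields, the identifier format and the validation rules are my assumptions, not anything specified in the Bill or by the NIF.

```python
from dataclasses import dataclass
from typing import List, Optional, Set

@dataclass
class EntityRecord:
    """Hypothetical entity reference record submitted to the repository."""
    entity_id: str                    # standardised entity identifier (format assumed)
    legal_name: str
    jurisdiction: str                 # two-letter country code
    parent_id: Optional[str] = None   # link to the parent entity within a group

def validate(record: EntityRecord, known_ids: Set[str]) -> List[str]:
    """Checks a data centre might apply before accepting a submission."""
    errors = []
    if not record.entity_id:
        errors.append("missing entity identifier")
    if not record.legal_name:
        errors.append("missing legal name")
    if len(record.jurisdiction) != 2:
        errors.append("jurisdiction must be a two-letter country code")
    # An unresolved parent link is exactly the kind of gap that made
    # group-level exposures so hard to piece together in the Lehman unwind.
    if record.parent_id and record.parent_id not in known_ids:
        errors.append("unknown parent entity " + record.parent_id)
    return errors

known = {"E-000123", "E-000456"}
submission = EntityRecord("E-000789", "Example Holdings LLC", "US", parent_id="E-000123")
print(validate(submission, known))  # [] means no errors, so the record is accepted
```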

However, the provisions within the Bill relate to maintaining a data repository for systemic risk measurement at a regulatory level, rather than establishing a global reference data utility. Although regulatory provisions around data standardisation for reporting purposes would drive some degree of harmonisation across the US market, they would fall short of ensuring that firms use these standardised identifiers internally and amongst themselves.
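That gap is worth spelling out. A firm that keeps its own proprietary counterparty codes internally and only translates them at the reporting boundary ends up maintaining a mapping layer like the hypothetical one sketched below, and every code that fails to map becomes a manual reconciliation task. All identifiers here are invented for illustration.

```python
# Hypothetical mapping a firm might maintain if standardised identifiers are
# applied only at the regulatory reporting boundary rather than firm-wide.
internal_to_standard = {
    "CPTY-88231": "E-000123",  # proprietary counterparty code -> standardised entity id
    "CPTY-90177": "E-000456",
}

def to_regulatory_id(internal_id: str) -> str:
    standard_id = internal_to_standard.get(internal_id)
    if standard_id is None:
        # Every unmapped code is a manual reconciliation exercise: the ongoing
        # cost of not adopting the standard inside the firm itself.
        raise ValueError("no standardised identifier mapped for " + internal_id)
    return standard_id

print(to_regulatory_id("CPTY-88231"))  # E-000123
```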

There is also some concern that the US is forging ahead of the rest of the world on data standardisation. The endeavour could result in yet another set of standards, adding to the complexity of the overall picture.

Certainly the issues surrounding data standardisation are important in the current climate of heightened awareness around risk management, but how much should be left to the market to decide? Does the industry really need or want a central utility to hold essential reference data, or are there viable alternatives? Moreover, is data standardisation really a place the regulator needs to tread, or should progress be left to Darwinian evolution in the market? Should standards be mandatory?

I shall be discussing these topics and many more at tomorrow’s panel discussion on competition and reference data utilities at CorpActions 2010. Check back next week for the lowdown.

