About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Geithner Testifies on Lehman Failures in Front of US House Financial Services Committee, Again


For those of you who may have been asleep at the back of the room for the last two years, US regulators are doing their level best to hammer home the lessons learned from the failure of Lehman, not least of which is the need to better track entity data across groups. This week it was the turn of US Treasury Secretary Tim Geithner to testify before the House Financial Services Committee on the subject and, again, dwell on the catalogue of difficulties faced by regulators during the unwinding process.

Geithner referred to the critical shortcoming of the system in that the government was unable to “wind-down and liquidate a complex, interconnected financial firm in an orderly way”. He discussed the “profoundly disruptive” process of trying to pick apart a tangled mess of financial data against a background of widespread financial panic. “It magnified the dimensions of the financial crisis, requiring a greater commitment of government resources than might otherwise have been required. Without better tools to wind down firms in an orderly manner, we are left with no good options,” he told government officials.

This is a message that has been endlessly repeated over the last 18 months and has intensified since the release of the examiner’s report on the subject last month. This in turn has led to the inclusion of key data-related considerations in the US Financial Reform Legislation Bill, which features something resembling a data utility for regulatory purposes. Accordingly, it proposes the establishment of an Office of Financial Research to monitor and collect key reference data items from across the industry, including both entity and instrument information.

The Office of Financial Research would include a data centre that would be charged with doing the data collection legwork, namely collecting, validating and maintaining all the required industry data for the monitoring of systemic risk. It is seemingly based on the idea behind the National Institute of Finance (NIF), which has been championed by a number of academics in the US markets and the EDM Council. Much like the NIF, it proposes to establish a data research and analysis centre to monitor systemic risk on behalf of the wider regulatory community.

However, the provisions within the Bill relate to maintaining a data repository for systemic risk measurement at a regulatory level, rather than establishing a global reference data utility. Although regulatory provisions around data standardisation for reporting purposes would drive some level of harmonisation across the US market, they would fall short of ensuring that firms use these standardised identifiers internally and amongst themselves.

There is also some concern that the US is forging ahead of the rest of the world on data standardisation: the endeavour could put yet another set of standards in place, adding to the complexity of the overall picture.

Certainly the issues surrounding data standardisation are important in the current climate of heightened awareness around risk management, but how much should be left up to the market to decide? Does the industry really need or want a central utility to hold essential reference data or are there viable alternatives? Moreover, is data standardisation really a place the regulator needs to tread or should the market leave progress to Darwinian evolution? Should standards be mandatory?

I shall be discussing these topics and many more at tomorrow’s panel discussion on competition and reference data utilities at CorpActions 2010. Check back next week for the lowdown.

