About A-Team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Geithner Testifies on Lehman Failures in Front of US House Financial Services Committee, Again


For those of you who may have been asleep at the back of the room for the last two years, US regulators are doing their level best to hammer home the lessons learned from the failure of Lehman, not least of which is the need to better track entity data across groups. This week it was the turn of US Treasury Secretary Tim Geithner to testify before the House Financial Services Committee on the subject and, again, dwell on the catalogue of difficulties faced by regulators during the unwinding process.

Geithner referred to the critical shortcoming of the system in that the government was unable to “wind-down and liquidate a complex, interconnected financial firm in an orderly way”. He discussed the “profoundly disruptive” process of trying to pick apart a tangled mess of financial data against a background of widespread financial panic. “It magnified the dimensions of the financial crisis, requiring a greater commitment of government resources than might otherwise have been required. Without better tools to wind down firms in an orderly manner, we are left with no good options,” he told government officials.

This is a message that has been endlessly repeated over the last 18 months and has intensified since the release of the examiner’s report on the subject last month. This in turn has led to the inclusion of key data-related considerations in the US financial reform legislation, which features something resembling a data utility for regulatory purposes. Accordingly, it proposes the establishment of an Office of Financial Research to monitor and collect key reference data items from across the industry, including both entity and instrument information.

The Office of Financial Research would include a data centre that would be charged with doing the data collection legwork, namely collecting, validating and maintaining all the required industry data for the monitoring of systemic risk. It is seemingly based on the idea behind the National Institute of Finance (NIF), which has been championed by a number of academics in the US markets and the EDM Council. Much like the NIF, it proposes to establish a data research and analysis centre to monitor systemic risk on behalf of the wider regulatory community.

However, the provisions within the Bill relate to maintaining a data repository for systemic risk measurement at a regulatory level, rather than establishing a global reference data utility. Although regulatory provisions around data standardisation for reporting purposes would drive forward some level of harmonisation across the US market in a general sense, it would fall short of ensuring that all of the firms use these standardised identifiers internally and amongst themselves.

There is also some degree of concern that the US is forging ahead with data standardisation before the rest of the world. The endeavour could result in yet another set of standards being put in place, adding to the complexity of the overall picture.

Certainly the issues surrounding data standardisation are important in the current climate of heightened awareness around risk management, but how much should be left up to the market to decide? Does the industry really need or want a central utility to hold essential reference data or are there viable alternatives? Moreover, is data standardisation really a place the regulator needs to tread or should the market leave progress to Darwinian evolution? Should standards be mandatory?

I shall be discussing these topics and many more at tomorrow’s panel discussion on competition and reference data utilities at CorpActions 2010. Check back next week for the lowdown.
