The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

Geithner Testifies on Lehman Failures in Front of US House Financial Services Committee, Again


For those of you who may have been asleep at the back of the room for the last two years, US regulators are doing their level best to hammer home the lessons learned from the failure of Lehman, not least the need to better track entity data across corporate groups. This week it was the turn of US Treasury Secretary Tim Geithner to testify before the House Financial Services Committee on the subject and, again, dwell on the catalogue of difficulties regulators faced during the unwinding process.

Geithner referred to the critical shortcoming of the system in that the government was unable to “wind-down and liquidate a complex, interconnected financial firm in an orderly way”. He discussed the “profoundly disruptive” process of trying to pick apart a tangled mess of financial data against a background of widespread financial panic. “It magnified the dimensions of the financial crisis, requiring a greater commitment of government resources than might otherwise have been required. Without better tools to wind down firms in an orderly manner, we are left with no good options,” he told government officials.

This is a message that has been endlessly repeated over the last 18 months and has intensified since the release of the examiner’s report on the subject last month. This in turn has led to the inclusion of key data-related considerations in the US financial reform legislation, which features something resembling a data utility for regulatory purposes. Accordingly, it proposes the establishment of an Office of Financial Research to monitor and collect key reference data items from across the industry, covering both entity and instrument information.

The Office of Financial Research would include a data centre charged with the legwork: collecting, validating and maintaining all the industry data required to monitor systemic risk. It is seemingly based on the idea behind the National Institute of Finance (NIF), which has been championed by a number of academics in the US markets and by the EDM Council. Much like the NIF, it proposes to establish a data research and analysis centre to monitor systemic risk on behalf of the wider regulatory community.

However, the provisions within the Bill relate to maintaining a data repository for systemic risk measurement at a regulatory level, rather than establishing a global reference data utility. Although regulatory provisions around data standardisation for reporting purposes would drive some level of harmonisation across the US market, they would fall short of ensuring that firms use these standardised identifiers internally and amongst themselves.

There is also some concern that the US is forging ahead of the rest of the world on data standardisation. The endeavour could result in yet another set of standards, adding to the complexity of the overall picture.

Certainly the issues surrounding data standardisation are important in the current climate of heightened awareness around risk management, but how much should be left up to the market to decide? Does the industry really need or want a central utility to hold essential reference data or are there viable alternatives? Moreover, is data standardisation really a place the regulator needs to tread or should the market leave progress to Darwinian evolution? Should standards be mandatory?

I shall be discussing these topics and many more at tomorrow’s panel discussion on competition and reference data utilities at CorpActions 2010. Check back next week for the lowdown.
