About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Obama Signs Off US Financial Services Bill, But What Changes are Needed to Meet the OFR’s Data Requirements?


After months of debate and the lengthy deliberations of a dedicated conference committee, US president Barack Obama has finally signed off the reform bill that proposes, amongst many other things, to set up a US-based data utility in order to more accurately monitor systemic risk. The establishment of the Office of Financial Research has now passed into law and, for better or worse, the government and the market need to determine exactly what the endeavour will entail. What is certain is that data standardisation debates and reporting requirements for basic reference data sets are headed the industry's way.

Reference Data Review has been tracking the proposals and the community’s perceptions about a data utility on both sides of the pond for some time. What started out as a campaign to set up a national repository of financial transaction and entity position data, initially dubbed the National Institute of Finance (NIF) last year, has managed to seemingly sneak its way into national law, with the help of a choice few academics. Allan Mendelowitz, a director at the Federal Housing Finance Board and a founding member of the Committee to Establish the NIF, first tabled the notion at a symposium on systemic risk in Washington in June 2009 and the rest, as they say, is legislative history.

But now that it has been passed, albeit amended to focus only on systemically important institutions (those with total consolidated assets of US$50 billion or more), what next?

The industry's reaction thus far has been rather muted: most industry participants that we have spoken to at recent events seem underwhelmed at the prospect and feel the setup of such a large-scale project will take a significant amount of time and investment. After all, details are scant about how the new utility will be established, let alone how it will operate, and there are only rough cost estimates on the table, which many feel are significant underestimates.

The cost impact on individual firms has also yet to be assessed, but it is likely to be close to that of other regulatory reporting projects of a similar nature. The focus will initially be on establishing cross references between a firm's internal data formats and those required by the regulator for reporting purposes. It is highly unlikely that a rip-and-replace approach will be adopted by many firms (if any), because existing entity and instrument data standards are in extensive use and are structurally embedded in many firms' data processes and workflows. Any benefit to the industry of this standardisation push is therefore likely to be a long time coming.
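The cross-referencing approach described above can be sketched in code. This is an illustrative example only: the identifier formats and field names below are invented for the purpose of the sketch, since the OFR had not published any reporting formats at the time of writing. The idea is simply that firms keep their internal entity codes and translate at the reporting boundary, rather than ripping and replacing embedded standards.

```python
# Illustrative sketch only: a cross-reference table mapping a firm's internal
# entity codes to a hypothetical regulator-assigned identifier. Internal
# systems keep their existing codes; only the report uses the regulator's
# format. All identifiers here are invented for illustration.

internal_to_regulator = {
    "CPTY-0012": "OFR-US-000123",  # hypothetical regulator-style entity ID
    "CPTY-0047": "OFR-US-000456",
}

def to_regulatory_id(internal_code: str) -> str:
    """Translate an internal entity code for regulatory reporting.

    Raises KeyError if no mapping exists, flagging a gap that must be
    resolved before the report can be filed.
    """
    return internal_to_regulator[internal_code]

print(to_regulatory_id("CPTY-0012"))  # OFR-US-000123
```

The mapping-table pattern is why the reporting formats must be published before serious project work can begin: until the right-hand side of the table is defined, firms cannot assess the gaps in their own data.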

The Office of Financial Research will publicly publish: a financial company reference database; an instrument reference database; and formats and standards for reporting transaction and position data to the utility. However, the latter is needed before any serious project work can begin within firms themselves, and there is, as yet, no indication of when any such information will be provided. A timeline for the overall project is needed, and fast.

As remarked upon by a data manager at a key European bank to Reference Data Review recently, the establishment of the Office of Financial Research will also likely prove to be a “license to print money” for any vendor that is enlisted to participate in its set-up. Vendors are already lining up as potential candidates.

He recommended that rather than letting the vendor and regulatory community take charge, firms should adopt a more proactive approach to the changes and actively provide feedback to those charged with setting up the utility. Sitting back and waiting for a regulator-defined standard to be produced is a very dangerous attitude to adopt indeed.

