About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Obama Signs Off US Financial Services Bill, But What Changes are Needed to Meet the OFR’s Data Requirements?


After months of debate and the lengthy deliberations of a dedicated conference committee, US president Barack Obama has finally signed off the reform bill that proposes, amongst many other things, to set up a US-based data utility in order to more accurately monitor systemic risk. The establishment of the Office of Financial Research has now passed into law and, for better or worse, the government and the market need to determine exactly what the endeavour will entail. What is certain is that data standardisation debates and reporting requirements for basic reference data sets are headed the industry’s way.

Reference Data Review has been tracking the proposals and the community’s perceptions about a data utility on both sides of the pond for some time. What started out as a campaign to set up a national repository of financial transaction and entity position data, initially dubbed the National Institute of Finance (NIF) last year, has managed to seemingly sneak its way into national law, with the help of a choice few academics. Allan Mendelowitz, a director at the Federal Housing Finance Board and a founding member of the Committee to Establish the NIF, first tabled the notion at a symposium on systemic risk in Washington in June 2009 and the rest, as they say, is legislative history.

But now that it has been passed, albeit amended to focus only on systemically important institutions (those with total consolidated assets of US$50 billion or more), what next?

The industry’s reaction thus far has been rather muted: most industry participants that we have spoken to at recent events seem underwhelmed at the prospect and feel the set-up of such a large-scale project will take a significant amount of time and investment. After all, details are scant about how the new utility will be established, let alone how it will operate, and there are only rough cost estimates on the table, which many feel are significant under-estimates.

The cost impact on individual firms has also yet to be assessed, but it is likely to be close to that of other regulatory reporting projects of a similar nature. The initial focus will be on establishing cross references between a firm’s internal data formats and those required by the regulator for reporting purposes. A rip and replace approach is highly unlikely to be adopted by many firms (if any), given the extensive use of existing entity and instrument data standards and the fact that they are structurally embedded in many firms’ data processes and workflows. Any benefit to the industry from this standardisation push is therefore likely to be a long time coming.
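To illustrate the cross-referencing approach described above, here is a minimal sketch in Python. All identifiers, field names and formats here are purely hypothetical: the OFR had published no actual reporting standards at the time of writing, so this only shows the general shape of mapping an internal entity master to a regulator-facing record rather than ripping and replacing internal data.

```python
# Hypothetical internal entity master, keyed by a firm's own proprietary IDs.
internal_entities = {
    "ACME-001": {"name": "Acme Holdings plc", "country": "GB"},
    "ACME-002": {"name": "Acme Securities Inc", "country": "US"},
}

# Cross-reference table: internal ID -> invented regulatory entity ID.
# In practice this mapping would be built once the OFR publishes its
# financial company reference database and identifier scheme.
xref = {
    "ACME-001": "OFR-GB-000123",
    "ACME-002": "OFR-US-000456",
}

def to_regulatory_record(internal_id: str) -> dict:
    """Translate one internal entity record into a hypothetical
    regulator-facing format, leaving the internal master untouched."""
    entity = internal_entities[internal_id]
    return {
        "entity_id": xref[internal_id],    # regulator's identifier
        "legal_name": entity["name"],
        "jurisdiction": entity["country"],
    }

print(to_regulatory_record("ACME-001"))
```

The point of the sketch is that the translation layer sits alongside existing systems: internal identifiers and workflows stay as they are, and only the reporting output is reshaped.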

The Office of Financial Research will publicly publish: a financial company reference database; an instrument reference database; and formats and standards for reporting of transaction and position data to the utility. However, the latter is needed before any serious project work can begin within firms themselves and there is, as yet, no indication of when any such information will be provided. A timeline for the overall project is needed, and fast.

As remarked upon by a data manager at a key European bank to Reference Data Review recently, the establishment of the Office of Financial Research will also likely prove to be a “license to print money” for any vendor that is enlisted to participate in its set-up. Vendors are already lining up as potential candidates.

He recommended that rather than letting the vendor and regulatory community take charge, firms should adopt a more proactive approach to the changes and actively provide feedback to those charged with setting up the utility. Sitting back and waiting for a regulator-defined standard to be produced is a very dangerous attitude to adopt indeed.
