The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

NIF Off to a Slow Start with Only 44 Signatures: Is the US Convinced it’s the Answer to Reference Data Standardisation?


As the industry discussions continue around the European Central Bank’s proposed reference data utility, across the pond another similar campaign is attempting to drum up support for its own data standardisation initiative. The Committee to Establish the National Institute of Finance is canvassing for industry participants to sign its petition to launch (you’ve guessed it) the National Institute of Finance (NIF), which it says will maintain a national repository of financial transaction and entity position data.

The petition has 44 signatures so far and most individuals on the list seem rather reluctant to cite their job titles and institution names. Of those that are visible, the academic community dominates, with a smattering of signatures from the vendor community, including SunGard, GoldenSource, Datagenic and CME Group, and the banking community, including Standard Bank of South Africa, Morgan Stanley and Bank of New York Mellon. Not the most auspicious start, but it is early days and a lot of the campaign work is still to come.

The EDM Council has been a champion of the NIF proposals in the same vein as its support for the ECB’s reference data utility ideas. The US initiative goes one step further than the ECB’s ambitions, however, because the NIF also seeks to provide the market with the analytical capabilities needed to make sense of the data it provides.

A key proponent of the proposals is Allan Mendelowitz, a director at the Federal Housing Finance Board and a founding member of the Committee to Establish the NIF, who first tabled the notion at a symposium on systemic risk in Washington in June. He explains that the initiative is aimed at tackling the data inefficiencies and gaps in the market that prevent regulators and policymakers from implementing regulatory reform. Mendelowitz says that a “serious and detailed examination of the significant gaps in the informational infrastructure” available to these bodies is needed and contends that the NIF is the appropriate body to do this.

“The NIF would have the authority to gather appropriate data and provide the analytical capabilities necessary to monitor systemic risk, to perform independent risk assessments of individual financial entities, and to provide advice on the financial system to the Federal regulatory agencies and the United States Congress. The NIF would be a government entity and resource for the regulatory community and would not overlap or duplicate existing or proposed oversight and enforcement responsibilities of existing regulatory agencies,” he explains.

The body would be funded via “non-appropriated funds by assessments placed on data reporting entities” and would comprise two components: a Federal Financial Data Centre (FFDC) and a Federal Financial Research and Analysis Centre (FFRAC). The FFDC would collect, clean, maintain and secure data including financial transactions data, positions, holdings, obligations and any other data deemed important for systemic analysis. All US-based financial entities would be required to report these data to the FFDC and other entities would be required to report such data for their US-based activities.

Accordingly, the NIF would need to be supported by a regulatory framework in order to compel such institutions to provide this data. The FFDC would also face the significant challenge of developing, maintaining and providing the reference data, identification codes and reporting standards for financial institutions to use when providing the data. This would entail the maintenance of a classification system for financial products, in addition to those already used by other players in the market.

The FFDC is therefore similar in practical implementation terms to the ECB’s ambitions for the reference data world, but the FFRAC is something else entirely. It is a step further into the reference data space and would involve providing independent analytical capabilities and computing resources to the regulatory community in order to turn the data into something useful. This is something that the vendor community has been engaged in for some time for the private sector, and it may therefore view the FFRAC as a potential competitive threat.

According to Mendelowitz, the FFRAC would support the development of metrics and systems to monitor and report on systemic risk levels and patterns. It would conduct, coordinate and sponsor research to support regulatory policy and continuously improve systemic risk regulation and then incorporate the appropriate results into production level risk management tools.

The FFRAC would also dabble in the vendor agenda by maintaining and promulgating verification and validation standards for the software used by financial entities to report valuations, risk, or results from stress tests. This is, again, a highly sensitive area that may provoke a backlash from the data provider community: regulators would effectively be using the FFRAC as a testing tool for vendors’ valuation, risk management and stress testing software offerings.

The addition of another federal agency to a regulatory environment that already has a surplus of them, and that suffers frequent turf wars as a result, may also prove controversial. For now, Reference Data Review will be closely monitoring the industry response to the proposals (along with the ECB’s proposals). If you have an opinion on the NIF’s analytics ideas in particular, drop us a line.
