The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

NIF Off to a Slow Start with Only 44 Signatures: Is the US Convinced it’s the Answer to Reference Data Standardisation?


As the industry discussions continue around the European Central Bank’s proposed reference data utility, across the pond another similar campaign is attempting to drum up support for its own data standardisation initiative. The Committee to Establish the National Institute of Finance is canvassing for industry participants to sign its petition to launch (you’ve guessed it) the National Institute of Finance (NIF), which it says will maintain a national repository of financial transaction and entity position data.

The petition has 44 signatures so far and most individuals on the list seem rather reluctant to cite their job titles and institution names. Of those that are visible, the academic community dominates, with a smattering of signatures from the vendor community, including SunGard, GoldenSource, Datagenic and CME Group, and the banking community, including Standard Bank of South Africa, Morgan Stanley and Bank of New York Mellon. Not the most auspicious start, but it is early days and a lot of the campaign work is still to come.

The EDM Council has been a champion of the NIF proposals in the same vein as its support for the ECB’s reference data utility ideas. The US initiative goes one step further than the ECB’s ambitions, however, because the NIF also seeks to provide the market with the analytical capabilities needed to make use of the data it would maintain.

A key proponent of the proposals is Allan Mendelowitz, a director at the Federal Housing Finance Board and a founding member of the Committee to Establish the NIF, who first tabled the notion at a symposium on systemic risk in Washington in June. He explains that the initiative is aimed at tackling the data inefficiencies and gaps in the market that prevent regulators and policymakers from implementing regulatory reform. Mendelowitz says that a “serious and detailed examination of the significant gaps in the informational infrastructure” available to these bodies is needed and contends that the NIF is the appropriate body to do this.

“The NIF would have the authority to gather appropriate data and provide the analytical capabilities necessary to monitor systemic risk, to perform independent risk assessments of individual financial entities, and to provide advice on the financial system to the Federal regulatory agencies and the United States Congress. The NIF would be a government entity and resource for the regulatory community and would not overlap or duplicate existing or proposed oversight and enforcement responsibilities of existing regulatory agencies,” he explains.

The body would be funded via “non-appropriated funds by assessments placed on data reporting entities” and would comprise two components: a Federal Financial Data Centre (FFDC) and a Federal Financial Research and Analysis Centre (FFRAC). The FFDC would collect, clean, maintain and secure data including financial transactions data, positions, holdings, obligations and any other data deemed important for systemic analysis. All US-based financial entities would be required to report these data to the FFDC and other entities would be required to report such data for their US-based activities.

Accordingly, the NIF would need to be supported by a regulatory framework in order to compel such institutions to provide this data. The FFDC would also face the significant challenge of developing, maintaining and providing the reference data, identification codes and reporting standards for financial institutions to use when submitting the data. This would entail maintaining a classification system for financial products in addition to those already used by other players in the market.

The FFDC is therefore similar in practical implementation terms to the ECB’s ambitions for the reference data world, but the FFRAC is something entirely different. It goes one step further into the reference data space, providing independent analytical capabilities and computing resources to the regulatory community in order to turn the data into something useful. This is an area in which the vendor community has been engaged for some time on behalf of the private sector, and one it may therefore view as a potential competitive threat.

According to Mendelowitz, the FFRAC would support the development of metrics and systems to monitor and report on systemic risk levels and patterns. It would conduct, coordinate and sponsor research to support regulatory policy and continuously improve systemic risk regulation and then incorporate the appropriate results into production level risk management tools.

The FFRAC would also stray into vendor territory by maintaining and promulgating standards for the verification and validation of software used by financial entities to report valuations, risk, or results from stress tests. This is, again, a highly sensitive area that may provoke a backlash from the data provider community: regulators would effectively be using the FFRAC as a testing tool for vendors’ valuation, risk management and stress testing software.

The addition of another federal agency to a regulatory environment that already has a surplus of them, and the frequent turf wars that result, may also prove controversial. For now, Reference Data Review will be closely monitoring the industry response to the proposals (along with the ECB’s). If you have an opinion on the NIF’s analytics ideas in particular, drop us a line.
