About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

NIF Off to a Slow Start with Only 44 Signatures: Is the US Convinced it’s the Answer to Reference Data Standardisation?

As the industry discussions continue around the European Central Bank’s proposed reference data utility, across the pond another similar campaign is attempting to drum up support for its own data standardisation initiative. The Committee to Establish the National Institute of Finance is canvassing for industry participants to sign its petition to launch (you’ve guessed it) the National Institute of Finance (NIF), which it says will maintain a national repository of financial transaction and entity position data.

The petition has 44 signatures so far, and most individuals on the list seem rather reluctant to cite their job titles and institution names. Of those that are visible, the academic community dominates, with a smattering of signatures from the vendor community, including SunGard, GoldenSource, Datagenic and CME Group, and from the banking community, including Standard Bank of South Africa, Morgan Stanley and Bank of New York Mellon. Not the most auspicious start, but it is early days and much of the campaign work is still to come.

The EDM Council has been a champion of the NIF proposals in the same vein as its support for the ECB’s reference data utility ideas. The US initiative goes one step further than the ECB’s ambitions, however, because the NIF also seeks to provide the market with the analytical capabilities needed to make use of the data it collects.

A key proponent of the proposals is Allan Mendelowitz, a director at the Federal Housing Finance Board and a founding member of the Committee to Establish the NIF, who first tabled the notion at a symposium on systemic risk in Washington in June. He explains that the initiative is aimed at tackling the data inefficiencies and gaps in the market that prevent regulators and policymakers from implementing regulatory reform. Mendelowitz says that a “serious and detailed examination of the significant gaps in the informational infrastructure” available to these bodies is needed and contends that the NIF is the appropriate body to do this.

“The NIF would have the authority to gather appropriate data and provide the analytical capabilities necessary to monitor systemic risk, to perform independent risk assessments of individual financial entities, and to provide advice on the financial system to the Federal regulatory agencies and the United States Congress. The NIF would be a government entity and resource for the regulatory community and would not overlap or duplicate existing or proposed oversight and enforcement responsibilities of existing regulatory agencies,” he explains.

The body would be funded via “non-appropriated funds by assessments placed on data reporting entities” and would comprise two components: a Federal Financial Data Centre (FFDC) and a Federal Financial Research and Analysis Centre (FFRAC). The FFDC would collect, clean, maintain and secure data including financial transactions data, positions, holdings, obligations and any other data deemed important for systemic analysis. All US-based financial entities would be required to report these data to the FFDC, and other entities would be required to report such data for their US-based activities.

Accordingly, the NIF would need to be supported by a regulatory framework in order to compel such institutions to provide this data. The FFDC would also face the significant challenge of developing, maintaining and providing the reference data, identification codes and reporting standards for financial institutions to use when submitting the data. This would entail the maintenance of a classification system for financial products, in addition to those already used by other players in the market.

The FFDC is therefore similar in practical implementation terms to the ECB’s ambitions for the reference data world, but the FFRAC is something else entirely. It goes one step further into the reference data space by providing independent analytical capabilities and computing resources to the regulatory community in order to turn the collected data into something useful. This is a service the vendor community has long provided to the private sector, and it may therefore be viewed as a potential competitive threat.

According to Mendelowitz, the FFRAC would support the development of metrics and systems to monitor and report on systemic risk levels and patterns. It would conduct, coordinate and sponsor research to support regulatory policy and continuously improve systemic risk regulation and then incorporate the appropriate results into production level risk management tools.

The FFRAC would also dabble in the vendor agenda by maintaining and promulgating standards for the verification and validation of software used by financial entities to report valuations, risk, or results from stress tests. This is, again, a highly sensitive area that may provoke a backlash from the data provider community, as regulators would effectively be using the FFRAC as a testing tool for vendors’ valuations, risk management and stress testing software.

The addition of another federal agency to a regulatory environment that already has a surplus of them, with frequent turf wars as a result, may also prove controversial. For now, Reference Data Review will be closely monitoring the industry response to the proposals (along with the ECB’s proposals). If you have an opinion on the NIF’s analytics ideas in particular, drop us a line.
