
NIF Off to a Slow Start with Only 44 Signatures: Is the US Convinced it’s the Answer to Reference Data Standardisation?


As the industry discussions continue around the European Central Bank’s proposed reference data utility, across the pond another similar campaign is attempting to drum up support for its own data standardisation initiative. The Committee to Establish the National Institute of Finance is canvassing for industry participants to sign its petition to launch (you’ve guessed it) the National Institute of Finance (NIF), which it says will maintain a national repository of financial transaction and entity position data.

The petition has 44 signatures so far, and most individuals on the list seem rather reluctant to cite their job titles and institution names. Of those that are visible, the academic community dominates, with a smattering of signatures from the vendor community, including SunGard, GoldenSource, Datagenic and CME Group, and the banking community, including Standard Bank of South Africa, Morgan Stanley and Bank of New York Mellon. Not the most auspicious start, but it is early days and a lot of the campaign work is still to come.

The EDM Council has been a champion of the NIF proposals in the same vein as its support for the ECB’s reference data utility ideas. The US initiative goes one step further than the ECB’s ambitions, however, because the NIF also seeks to provide the market with the analytical capabilities needed to make use of the data it provides.

A key proponent of the proposals is Allan Mendelowitz, a director at the Federal Housing Finance Board and a founding member of the Committee to Establish the NIF, who first tabled the notion at a symposium on systemic risk in Washington in June. He explains that the initiative is aimed at tackling the data inefficiencies and gaps in the market that prevent regulators and policymakers from implementing regulatory reform. Mendelowitz says that a “serious and detailed examination of the significant gaps in the informational infrastructure” available to these bodies is needed and contends that the NIF is the appropriate body to do this.

“The NIF would have the authority to gather appropriate data and provide the analytical capabilities necessary to monitor systemic risk, to perform independent risk assessments of individual financial entities, and to provide advice on the financial system to the Federal regulatory agencies and the United States Congress. The NIF would be a government entity and resource for the regulatory community and would not overlap or duplicate existing or proposed oversight and enforcement responsibilities of existing regulatory agencies,” he explains.

The body would be funded via “non-appropriated funds by assessments placed on data reporting entities” and would comprise two components: a Federal Financial Data Centre (FFDC) and a Federal Financial Research and Analysis Centre (FFRAC). The FFDC would collect, clean, maintain and secure data including financial transaction data, positions, holdings, obligations and any other data deemed important for systemic analysis. All US-based financial entities would be required to report these data to the FFDC, and other entities would be required to report such data for their US-based activities.

Accordingly, the NIF would need to be supported by a regulatory framework in order to compel such institutions to provide this data. The FFDC would also face the significant challenge of developing, maintaining and providing the reference data, identification codes and reporting standards for financial institutions to use when submitting the data. This would entail the maintenance of a classification system for financial products, in addition to those already used by other players in the market.
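By way of illustration only (this sketch is hypothetical and not drawn from the NIF documentation), a standardised position report built on a common entity identifier and product classification might look something like the following, along with the kind of simple rule check a central data centre could apply on receipt:

```python
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class PositionReport:
    """One standardised position record as a reporting entity might submit it.
    Field names and codes are illustrative only, not part of the NIF proposals."""
    entity_id: str        # common legal-entity identifier issued by the data centre
    product_code: str     # code from a shared financial-product classification
    counterparty_id: str  # the same identifier scheme applied to the counterparty
    notional: float       # position size in the reporting currency
    currency: str         # ISO 4217 currency code
    as_of: date           # date the position was reported as at

def validate(report: PositionReport) -> list:
    """Return a list of rule violations; an empty list means the record passes."""
    errors = []
    if not report.entity_id:
        errors.append("missing entity identifier")
    if len(report.currency) != 3:
        errors.append("currency must be a three-letter ISO 4217 code")
    if report.as_of > date.today():
        errors.append("as-of date cannot be in the future")
    return errors

# Example: a well-formed record passes the checks.
good = PositionReport("US-ENTITY-0001", "IRS-FIXED-FLOAT", "US-ENTITY-0042",
                      10_000_000.0, "USD", date(2009, 9, 30))
assert validate(good) == []
```

The point of the sketch is simply that the value of such a repository rests on every reporting entity using the same identifiers and classifications, which is precisely the reference data burden the FFDC would carry.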

The FFDC is therefore similar in practical implementation terms to the ECB’s ambitions for the reference data world, but the FFRAC is something entirely different. It takes the idea one step further and would involve providing independent analytical capabilities and computing resources to the regulatory community in order to turn the data into something useful. This is territory the vendor community has occupied for some time in the private sector, and it may therefore view the FFRAC as a potential competitive threat.

According to Mendelowitz, the FFRAC would support the development of metrics and systems to monitor and report on systemic risk levels and patterns. It would conduct, coordinate and sponsor research to support regulatory policy and continuously improve systemic risk regulation, and then incorporate the appropriate results into production-level risk management tools.

The FFRAC would also dabble in the vendor agenda by maintaining and promulgating verification and validation standards for the software financial entities use to report valuations, risk measures and results from stress tests. This is, again, a highly sensitive area that may provoke a backlash from the data provider community. Regulators would effectively be using the FFRAC as a testing tool for vendors’ valuation, risk management and stress testing software offerings.
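To give a flavour of what verification against a common standard could involve (again a purely hypothetical sketch with made-up figures, not anything specified by the NIF committee), a regulator-run harness might replay published reference cases through a vendor’s valuation routine and check the output against benchmark values within a stated tolerance:

```python
import math
from typing import Callable

# Hypothetical reference test cases: inputs plus the benchmark value a compliant
# valuation routine would be expected to reproduce. Figures are illustrative only.
REFERENCE_CASES = [
    {"notional": 1_000_000.0, "rate": 0.05, "years": 2.0, "benchmark": 1_102_500.0},
    {"notional": 500_000.0,   "rate": 0.03, "years": 1.0, "benchmark": 515_000.0},
]

def verify(valuation_fn: Callable[[float, float, float], float],
           tolerance: float = 1e-6) -> bool:
    """Run every reference case through the supplied valuation function and
    report whether all results fall within the relative tolerance."""
    for case in REFERENCE_CASES:
        result = valuation_fn(case["notional"], case["rate"], case["years"])
        if not math.isclose(result, case["benchmark"], rel_tol=tolerance):
            print(f"FAIL: got {result:,.2f}, expected {case['benchmark']:,.2f}")
            return False
    return True

# A vendor implementation under test: simple annual compounding in this toy example.
def vendor_valuation(notional: float, rate: float, years: float) -> float:
    return notional * (1 + rate) ** years

print("verification passed" if verify(vendor_valuation) else "verification failed")
```

Whether vendors would welcome having their offerings benchmarked in this way by a federal body is, of course, exactly the sensitivity noted above.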

The addition of another federal agency to a regulatory environment that already has a surplus of them, and the frequent turf wars that result, may also prove controversial. For now, Reference Data Review will be closely monitoring the industry response to the proposals (along with the ECB’s proposals). If you have an opinion on the NIF’s analytics ideas in particular, drop us a line.

