
A-Team Insight Blogs

Patent Pending Ref Data Utility Offers Risk Mitigation, Seeks Backers

The inventors of an ambitious plan to create a risk-mitigating Reference Data Utility for the securities industry are entering a new phase of marketing for their business venture. Their belief is that the lure of reduced capital and collateral requirements can galvanise financial institutions into solving, once and for all, the age-old problem of faulty reference data. This collaborative model would also eliminate duplicate spending on data across all the firms in the marketplace, they reckon.

The partners in the venture – Allan Grody of Financial Intergroup, Richard Tinervin, a former managing director of Citigroup, and Pat Tsien, ex of Accenture, where she was responsible for its data management and reference data services – have been touting their proposition in the marketplace for some time. Now that their pending patent for the Reference Data Utility is public (and viewable at http://appft1.uspto.gov/netacgi/nph) they are “moving their private negotiations into the mainstream” and entering a phase of marketing in the hope of securing backing from financial institutions and existing industry utilities. Though they have yet to secure any “pillar partners”, their proposition is in front of a number of entities now and they say they are confident of moving forward in a relatively short timeframe. They are seeking capital input from early backers who will then benefit financially when the utility is sold to the marketplace in a three to five-year timeframe.

According to the pending patent, the Reference Data Utility comprises “a method and apparatus for matching, clearing and settling financial transaction reference data”. This includes data communication devices, data storage devices and a data processing system for “receiving, identifying, selecting, matching, standardising, organising and distributing financial services reference data while mitigating the risk of faulty reference data within the global financial payment and collateral matching mechanism”. The sources of reference data the utility will match, clear and settle are envisaged as including “exchanges, clearing and depository facilities, vendors, aggregators, national numbering associations, broker/dealers, asset managers and custodians”. The focus of the utility is on instrument, entity and corporate actions data – plus end-of-day valuation prices. Real-time prices are excluded.
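
For illustration only, the sketch below shows what “matching and standardising” reference data drawn from several sources might look like in practice: records for the same instrument are grouped, each field is settled by consensus, and disagreements are flagged as breaks. The field names, source names and matching rule are hypothetical assumptions, not details taken from the pending patent.

```python
from collections import defaultdict

# Hypothetical reference data records from three of the source types the
# patent envisages (an exchange, a vendor and a depository). Field names
# and values are illustrative assumptions, not taken from the filing.
records = [
    {"source": "exchange",   "isin": "US0378331005", "currency": "USD", "issuer": "Apple Inc"},
    {"source": "vendor_a",   "isin": "US0378331005", "currency": "USD", "issuer": "Apple Inc"},
    {"source": "depository", "isin": "US0378331005", "currency": "USD", "issuer": "Apple Inc."},
]

def match_and_standardise(records, fields=("currency", "issuer")):
    """Group records by identifier, then settle each field by consensus.

    A field is settled only when every source reports the same (normalised)
    value; otherwise it is flagged as a break for rules-based or manual repair.
    """
    grouped = defaultdict(list)
    for rec in records:
        grouped[rec["isin"]].append(rec)

    golden, breaks = {}, []
    for isin, recs in grouped.items():
        settled = {"isin": isin}
        for field in fields:
            # Simple normalisation: trim whitespace, drop trailing dots,
            # compare case-insensitively.
            values = {rec[field].strip().rstrip(".").lower() for rec in recs}
            if len(values) == 1:
                settled[field] = values.pop()
            else:
                breaks.append((isin, field, sorted(values)))
        golden[isin] = settled
    return golden, breaks

golden_copy, exceptions = match_and_standardise(records)
print(golden_copy)   # standardised records where all sources agree
print(exceptions)    # mismatches routed to an exception/repair workflow
```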

The basic contention is that “the costs to support a non-strategic process are duplicative across the industry, and are getting out of hand”, says Grody. “The big firms are spending a billion dollars plus on something that should have been commonly resolved.” But the partners recognise that because the costs of managing data are hidden within operations and not sufficiently visible to the C-suite, the prospect of solving the reference data problem in and of itself is not a powerful enough proposition.

Put “a different lens on the issue” – risk reduction – and an initiative becomes much more compelling, reckons Tsien. “Data management is an age-old problem that everyone acknowledges, but to get budget to really solve the problem it needs higher-up recognition,” she says. “Today it sits in silos in the organisation. To solve it you need the involvement of the CEO who sits over all the silos. By repositioning the problem as a risk management problem related to Basel II et cetera, you begin to change the focus – not of the grassroots problem which we all acknowledge – but on to the results of high quality data, such as reducing risk and reducing capital requirements.”

The partners believe that in the same way as the industry has created mutualised risk sharing entities for the clearing and settlement of transactions – such as DTCC, Euroclear and Clearstream – there is a need for another risk mitigating utility to match and clear reference data. “We are proposing that the bigger firms in the industry come together to create an Exempt Clearing Corporation in the US – and the equivalents in Europe and Asia – to do what the G30 has said the industry needs to do and create a global owner of the reference data problem in order to solve it,” says Grody.

Risk mitigation will be achieved through the utility in a similar way to how it is achieved through clearing corporations and depositories, he reckons. “Only the big guys with strong balance sheets join these directly, and they offer wholesale services to the smaller firms. The risk mitigation comes as a result of the capital put up by the big organisations. That’s how they guarantee each other. The idea of the Reference Data Utility is that once data is in and cleansed from all the vendors – and the facility can buy all the vendors’ data – a stream of financial data comes out that all firms use. There will then be commonality of product code, business entity, corporate actions and valuation processes, and risk mitigation is achieved as each firm guarantees each other.”
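
As a hedged illustration of the “commonality of product code” idea, the sketch below re-keys vendor-specific instrument symbols onto a single common code before the standardised stream is distributed. The vendor names, symbols and choice of common identifier are assumptions made for the example; the article does not describe the utility’s actual code scheme.

```python
# Hypothetical cross-reference: vendor-specific symbols for the same
# instrument, each mapped to one common product code that all subscribing
# firms would use downstream. Names and symbols are illustrative assumptions.
CROSS_REFERENCE = {
    ("vendor_a", "AAPL.OQ"): "US0378331005",
    ("vendor_b", "AAPL US"): "US0378331005",
    ("vendor_c", "037833100"): "US0378331005",
}

def to_common_code(source: str, symbol: str) -> str:
    """Translate a vendor-specific symbol into the common product code.

    Unknown symbols raise, so they can be routed to an exception queue rather
    than being passed downstream with an unreconciled identifier.
    """
    try:
        return CROSS_REFERENCE[(source, symbol)]
    except KeyError:
        raise KeyError(f"No common code mapped for {symbol!r} from {source!r}")

# Every incoming record, whichever vendor supplied it, is re-keyed to the
# same common code before distribution to subscribing firms.
print(to_common_code("vendor_b", "AAPL US"))  # -> US0378331005
```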

According to Tinervin, the partners have created and are putting into action a detailed business plan for the initiative. “We are refining who are the most obvious candidates to be our ‘pillar partners’, and what the value proposition is for them,” he says. “We are talking with a number of these institutions. We are sensitive to their own cultures and attitudes towards collaboration. We have a good fix in the US and in Europe on who the most likely candidates are.” The partners are also focusing on how to evolve the initiative into a utility – what will be its functions, its structure, how operational risk mitigation will be assured, and how business risk will be handled in terms of ensuring the utility can be a successful business enterprise.

This involves having a realistic expectation in terms of capital support from partners and determining how many institutions are likely to support the venture over what period of time, he says.

The early-stage founders will benefit from a different business model to that of institutions that join the party later. “We have had overtures of finance from private equity firms, but we have not yet indulged because the biggest organisations can fund this themselves,” Grody reckons. “The entity will be commercial, so the early partners will participate in a liquidity event three to five years out when we sell the utility to the rest of the industry. The early adopters benefit from this as a business venture – and will fund it by a reduction in the cost of their own data cleansing and downstream processing.”

The partners report that of the potential pillar partners to which they have talked to date, “no-one says it is a bad idea – but they all say they will be second”. For this reason, they are “reaching out” to organisations with collaborative undertakings already – and those that have started to centralise their reference data management internally and have begun to see the benefits of reducing duplication of data spend. “Most firms don’t see data as a differentiating activity,” says Grody. “But some organisations don’t have harmony internally – which is why we’re reaching up to the CEO level, because they’re the ones that own the billion dollar issue.”

Of the existing collaborative initiatives in the data management world, Grody says “we are talking to all of them, but if each is successful in its own right and left to its own devices, we will be left with the same problem as today”. He adds though that the utility is not about putting anyone out of business. “The data vendors recognise that their value proposition has changed dramatically,” he says, and that where some data is becoming commoditised, they have the opportunity to solve new problems – such as the reference data needed for structured products.
