
Decentralized Data May Cost Firms $600 Million Annually

Decentralized reference data operations, and the effects of faulty data on operational risk and capital requirements, could be costing financial services institutions an astonishing $600 million each per year, according to the findings of a recent research paper.

The paper – titled Operational Risk and Reference Data: Exploring Costs, Capital Requirements and Risk Mitigation – claims to be the first attempt to explore scientifically the costs of reference data and the effects of faulty data on operational risk and capital, particularly under the upcoming Basel II requirements. The authors note that others have tried to quantify these costs and effects, but usually only in an anecdotally documented fashion.

The paper’s authors are: Allan Grody, president, Financial InterGroup; Fotios Harmantzis, PhD and assistant professor, School of Technology Management, Stevens Institute of Technology; and Gregory Kaple, senior partner, Integrated Management Services Inc.

Among the paper’s conclusions is that reference data systems are emerging as a financial application platform concept, distinct from the application logic that supports individual business processes across a financial enterprise. The authors also argue that being better at transactional reference data management confers no strategic value, and that, since faulty reference data lies at the core of key operational losses, an industry-wide initiative is the only way forward. The paper suggests the industry explore the concept of an industry utility to match and “clear” reference data.

Although the paper includes detailed historical sections on both reference data and risk, oddly, it makes no mention of the failed Global Straight-Through Processing Association (GSTPA), which raised more than EUR 90 million to develop the Transaction Flow Manager (TFM). The central matching concept required a broad community of users for it to become a viable solution – something the GSTPA was unable to secure before it ceased operations just two months after launching the TFM.

Although central matching initially seemed to be a promising concept, it required a ‘big bang’ approach and a degree of cooperative engagement previously unseen and clearly not supported within the industry. After getting their collective fingers burned by the abandonment of the ambitious project, industry players generally run from any suggestion to revisit this type of idea.

Weighing in at 88 pages, this paper provides an excellent primer on both reference data and the interaction of data in risk management under Basel II, including a useful historical context, albeit one with a U.S. listed securities bias.

With one of the principal authors a former member of BASIC (the Banking and Securities Industry Committee) – the committee formed in the aftermath of the New York Stock Exchange’s ‘paperwork crisis’ of 1968, and which championed the CUSIP numbering system – it is little wonder that the paper has a U.S. perspective.

What is unique about the paper is its attempt to provide a business case for firms looking to justify investment in reference data systems. The authors feel the lack of a good business case is “thwarting the recognition of the pervasive nature of the issues and the cost and risk associated with reference data.”

A detailed methodology is used to estimate the direct cost of reference data (people, facilities, data, licenses, etc.), losses (fails and corporate actions) and capital costs.

In the case of capital costs, the authors created a “very preliminary calculation for operational risk capital associated with faulty reference data under the Basel guidelines” for the 15 largest U.S.-based firms.

And the big number is anywhere from $266 million to $600 million per firm per year. The paper details its methodology for companies that wish to do their own calculations.
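
For readers who want a feel for the exercise, the sketch below shows the general shape of such a back-of-envelope aggregation – direct costs, losses and a capital charge summed for a single firm over a year. The cost categories mirror those listed above, but every figure and field name is a placeholder assumption, not a value or formula taken from the paper.

```python
# Illustrative back-of-envelope aggregation of annual reference data costs.
# The three buckets mirror the paper's categories (direct costs, losses,
# capital costs); all figures below are placeholder assumptions, not values
# drawn from the paper itself.

from dataclasses import dataclass


@dataclass
class ReferenceDataCosts:
    staff_and_facilities: float    # direct cost: people, premises, systems
    data_and_licenses: float       # direct cost: vendor feeds, license fees
    trade_fails: float             # losses attributable to faulty data: fails
    corporate_actions: float       # losses: mishandled corporate actions
    op_risk_capital_charge: float  # cost of capital held against operational risk

    def annual_total(self) -> float:
        """Sum the direct, loss and capital components for one year."""
        return (self.staff_and_facilities
                + self.data_and_licenses
                + self.trade_fails
                + self.corporate_actions
                + self.op_risk_capital_charge)


# Hypothetical inputs (USD millions) for a single large firm.
firm = ReferenceDataCosts(
    staff_and_facilities=120.0,
    data_and_licenses=80.0,
    trade_fails=150.0,
    corporate_actions=60.0,
    op_risk_capital_charge=90.0,
)
print(f"Estimated annual reference data cost: ${firm.annual_total():.0f}m")
```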

Where the paper seems to stumble a bit is in its oversimplified approach to a solution for the industry. As noted above, a central matching utility was tried before and failed.

Although this may still be the best way forward for reference data, the failure to examine the causes of previous failures, or to address industry concerns about the approach (e.g. who takes responsibility for financial losses?), paints the authors as slightly naïve – perhaps unfairly.

Their technical argument supporting a central utility similarly seems to lack depth.

The authors assert that intelligent ‘content-enabled’ multicast networks using XML can be implemented to easily route reference data.

While these solutions may be appropriate for retail transactions processing ‘WalMart style’ – one example the authors provide – the reality of global financial processing is somewhat more complicated. The U.S. retail goods sector doesn’t necessarily have the same concerns about ever-increasing data volumes, network latency and a morass of securities identifiers or event data, for a start. Or does it? The authors don’t tell us.
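
To make that gap concrete, the sketch below is a toy, application-level illustration of routing a reference data update by its content. The XML layout, topic table, identifiers and subscriber names are invented for the purpose and are not drawn from the paper – and it says nothing about the harder problems flagged above (volumes, latency, identifier proliferation), which is precisely the point.

```python
# Toy, application-level sketch of 'content-based' routing of reference data
# updates. The XML layout, topic table and subscriber names are illustrative
# assumptions only; the paper's network-level multicast design is not
# reproduced here.

import xml.etree.ElementTree as ET

# Subscribers register interest by asset class and identifier scheme.
SUBSCRIBERS = {
    ("equity", "CUSIP"): ["us_settlement_engine"],
    ("equity", "ISIN"): ["emea_settlement_engine"],
    ("bond", "ISIN"): ["fixed_income_reference_store"],
}


def route(update_xml: str) -> list:
    """Return the subscribers whose registered interests match this update."""
    root = ET.fromstring(update_xml)
    identifier = root.find("identifier")
    key = (root.get("assetClass"), identifier.get("scheme"))
    return SUBSCRIBERS.get(key, [])


# Dummy update message with a placeholder identifier value.
message = """<referenceDataUpdate assetClass="equity">
  <identifier scheme="CUSIP">123456789</identifier>
  <field name="issuerName">Example Issuer Inc.</field>
</referenceDataUpdate>"""

print(route(message))  # -> ['us_settlement_engine']
```
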
Despite these minor points, the paper is a must-read for anyone with responsibilities encompassing reference data. The well-researched report facilitates a “read what you need” approach, with background information provided as separate chapters.
