
Decentralized Data May Cost Firms $600 Million Annually


Decentralized reference data operations and the effects of faulty data on operational risk and capital requirements could be costing financial services institutions an astonishing $600 million apiece annually, according to the findings of a recent research paper.

The paper – titled Operational Risk and Reference Data: Exploring Costs, Capital Requirements and Risk Mitigation – claims to be the first attempt to explore scientifically the costs of reference data and the effects of faulty data on operational risk and capital, especially under the upcoming Basel II requirements. The authors note that others have attempted to quantify these costs, but usually only in an anecdotally documented fashion.

The paper’s authors are: Allan Grody, president, Financial InterGroup; Fotios Harmantzis, PhD and assistant professor, School of Technology Management, Stevens Institute of Technology; and Gregory Kaple, senior partner, Integrated Management Services Inc.

Among the paper’s conclusions is that reference data systems are emerging as a financial application platform concept, distinct from the application logic that supports individual business processes across a financial enterprise. The authors also point out that being better at transactional reference data management has no strategic value, and assert that, since faulty reference data lies at the core of key operational losses, an industry-wide initiative is the only way forward. The paper suggests the industry explore the concept of an industry utility to match and “clear” reference data.

Although the paper includes detailed historical sections on both reference data and risk, oddly, it makes no mention of the failed Global Straight-Through Processing Association (GSTPA), which raised more than EUR 90 million to develop the Transaction Flow Manager (TFM). The central matching concept required a broad community of users for it to become a viable solution – something the GSTPA was unable to secure before it ceased operations just two months after launching the TFM.

Although central matching initially seemed to be a promising concept, it required a ‘big bang’ approach and a degree of cooperative engagement previously unseen and clearly not supported within the industry. After getting their collective fingers burned by the abandonment of the ambitious project, industry players generally run from any suggestion to revisit this type of idea.

Weighing in at 88 pages, this paper provides an excellent primer on both reference data and the interaction of data in risk management under Basel II, including a useful historical context, albeit one with a U.S. listed securities bias.

One of the principal authors is a former member of BASIC (Banking and Securities Industry Committee), the committee formed in the aftermath of the New York Stock Exchange’s ‘paper crisis’ of 1968 and which championed the CUSIP numbering system, so it is little wonder that the paper has a U.S. perspective.

What is unique about the paper is its attempt to provide a business case for firms looking to justify investment in reference data systems. The authors feel the lack of a good business case is “thwarting the recognition of the pervasive nature of the issues and the cost and risk associated with reference data.”

A detailed methodology is used to estimate the direct cost of reference data (people, facilities, data, licenses, etc.), losses (fails and corporate actions) and capital costs.

In the case of capital costs, the authors created a “very preliminary calculation for operational risk capital associated with faulty reference data under the Basel guidelines” for the 15 largest U.S.-based firms.

And the big number is anywhere from $266 to $600 million per firm per year. The paper details its methodology for companies that wish to do their own calculations.
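
For readers who want a feel for how such an estimate hangs together, the short sketch below (Python, purely illustrative) simply aggregates the three cost categories the paper identifies: direct costs, losses and the cost of holding operational risk capital. Every figure and the cost-of-capital rate are placeholder assumptions of ours, not the authors’ model.

# Illustrative only: aggregate annual reference data costs along the paper's
# three categories (direct costs, losses, operational risk capital).
# All figures and the 10% cost-of-capital rate are placeholder assumptions.
def annual_reference_data_cost(staff_and_facilities,   # data operations people and facilities
                               data_and_licenses,      # vendor feeds, licenses, infrastructure
                               trade_fail_losses,      # losses from fails traced to faulty data
                               corp_action_losses,     # losses from misprocessed corporate actions
                               op_risk_capital,        # capital attributed to reference data risk
                               cost_of_capital=0.10):  # assumed hurdle rate on that capital
    """Total annual cost in USD millions: direct costs + losses + cost of capital held."""
    direct = staff_and_facilities + data_and_licenses
    losses = trade_fail_losses + corp_action_losses
    return direct + losses + op_risk_capital * cost_of_capital

# Placeholder numbers (USD millions) for a large firm:
print(annual_reference_data_cost(120, 80, 150, 60, 900))  # 500.0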

Where the paper seems to stumble a bit is in its oversimplified approach to a solution for the industry. As noted above, a central matching utility was tried before and failed.

Although this may still be the best way forward for reference data, not examining the causes of previous failures or addressing industry concerns about the approach (who takes responsibility for financial losses, for example?) paints the authors as slightly naïve, perhaps unfairly.

Their technical argument supporting a central utility similarly seems to lack depth.

The authors assert that intelligent ‘content-enabled’ multicast networks using XML can be implemented to route reference data easily.

While these solutions may be appropriate for retail transaction processing ‘WalMart style’ – one example the authors provide – the reality of global financial processing is somewhat more complicated. The U.S. retail goods sector doesn’t necessarily have the same concerns about ever-increasing data volumes, network latency and a morass of securities identifiers or event data, for a start. Or does it? The authors don’t tell us.
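
To make the routing idea concrete, the following sketch shows content-based filtering in miniature: a record’s fields are inspected and the message is fanned out to interested consumers. The message schema, field names and subscriber topics are hypothetical, and the multicast transport itself is not shown; the paper does not prescribe an implementation.

import xml.etree.ElementTree as ET

# Hypothetical subscriptions: which downstream systems want which asset classes.
SUBSCRIPTIONS = {
    "equity": ["settlement-engine", "risk-engine"],
    "fixed-income": ["risk-engine"],
}

def route(xml_message):
    """Return the consumers an incoming reference data record should reach,
    based on its content rather than a fixed point-to-point address."""
    record = ET.fromstring(xml_message)
    asset_class = record.findtext("assetClass", default="unknown")
    return SUBSCRIPTIONS.get(asset_class, [])

msg = "<securityRecord><assetClass>equity</assetClass><id>XX0000000000</id></securityRecord>"
print(route(msg))  # ['settlement-engine', 'risk-engine']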
Despite these minor points, the paper is a must-read for anyone with responsibilities encompassing reference data. The well-researched report facilitates a “read what you need” approach – with background information provided as separate chapters.
