Decentralized Data May Cost Firms $600 Million Annually

Decentralized reference data operations and the effects of faulty data on operational risk and capital requirements could be costing financial services institutions an astonishing $600 million each per year, according to the findings of a recent research paper.

The paper – titled Operational Risk and Reference Data: Exploring Costs, Capital Requirements and Risk Mitigation – claims to be the first attempt to scientifically explore the costs of reference data and the effects of faulty data on operational risk and capital, especially under the upcoming Basel II requirements. The authors note that others have attempted to quantify these costs, but usually only in an anecdotally documented fashion.

The paper’s authors are: Allan Grody, president, Financial InterGroup; Fotios Harmantzis, PhD and assistant professor, School of Technology Management, Stevens Institute of Technology; and Gregory Kaple, senior partner, Integrated Management Services Inc.

Among the paper’s conclusions is that reference data systems are emerging as a financial application platform concept, distinct from the application logic that supports individual business processes across a financial enterprise. The authors also argue that being better at transactional reference data management offers no strategic value and assert that, since faulty reference data lies at the core of key operational losses, an industry-wide initiative is the only way forward. The paper suggests the industry explore the concept of an industry utility to match and “clear” reference data.

Although the paper includes detailed historical sections on both reference data and risk, oddly, it makes no mention of the failed Global Straight-Through Processing Association (GSTPA), which raised more than EUR 90 million to develop the Transaction Flow Manager (TFM). The central matching concept required a broad community of users for it to become a viable solution – something the GSTPA was unable to secure before it ceased operations just two months after launching the TFM.

Although central matching initially seemed to be a promising concept, it required a ‘big bang’ approach and a degree of cooperative engagement previously unseen and clearly not supported within the industry. After getting their collective fingers burned by the abandonment of the ambitious project, industry players generally run from any suggestion to revisit this type of idea.

Weighing in at 88 pages, this paper provides an excellent primer on both reference data and the interaction of data in risk management under Basel II, including a useful historical context, albeit one with a bias toward U.S.-listed securities.

One of the principal authors is a former member of BASIC (Banking and Securities Industry Committee), the committee formed in the aftermath of the New York Stock Exchange’s ‘paper crisis’ in 1968 and a champion of the CUSIP numbering system, so it is little wonder that the paper has a U.S. perspective.

What is unique about the paper is its attempt to provide a business case for firms looking to justify investment in reference data systems. The authors feel the lack of a good business case is “thwarting the recognition of the pervasive nature of the issues and the cost and risk associated with reference data.”

A detailed methodology is used to estimate the direct cost of reference data (people, facilities, data, licenses, etc.), losses (fails and corporate actions) and capital costs.

In the case of capital costs, the authors created a “very preliminary calculation for operational risk capital associated with faulty reference data under the Basel guidelines” for the 15 largest U.S.-based firms.

And the big number is anywhere from $266 million to $600 million per firm per year. The paper details its methodology for companies that wish to do their own calculations.
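For readers who want a sense of the arithmetic, a purely illustrative sketch of such a tally, structured along the cost categories the authors list, might look like the following; every figure is a hypothetical placeholder rather than a number taken from the paper.

```python
# Illustrative only: a rough annual tally along the cost categories the paper
# identifies. All figures are hypothetical placeholders, not estimates from
# the paper.

direct_costs = {                    # direct cost of reference data operations ($)
    "people": 12_000_000,
    "facilities": 3_000_000,
    "data_and_licenses": 8_000_000,
}

loss_costs = {                      # losses attributed to faulty reference data ($)
    "trade_fails": 5_000_000,
    "corporate_action_errors": 4_000_000,
}

# Hypothetical operational risk capital charge attributed to reference data,
# standing in for the paper's "very preliminary" Basel-based calculation.
capital_cost = 10_000_000

total = sum(direct_costs.values()) + sum(loss_costs.values()) + capital_cost
print(f"Illustrative annual reference data cost: ${total:,.0f}")
```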

Where the paper seems to stumble a bit is in its oversimplified approach to a solution for the industry. As noted above, a central matching utility was tried before and failed.

Although this may still be the best way forward for reference data, the failure to examine the causes of previous failures or to address industry concerns about the approach (e.g. who takes responsibility for financial losses?) makes the authors appear, perhaps unfairly, slightly naïve.

Their technical argument supporting a central utility similarly seems to lack depth.

The authors assert that intelligent ‘content-enabled’ multicast networks using XML can be implemented to easily route reference data.
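To make the idea concrete, a minimal sketch of content-based routing of an XML reference data message might look like the following; the element names, identifier scheme and topic naming are hypothetical and are not drawn from the paper.

```python
# A minimal sketch of content-based routing of an XML reference data message.
# The element names, identifier scheme and topic names are hypothetical.
import xml.etree.ElementTree as ET

message = """
<referenceData>
  <instrument idScheme="ISIN">US0378331005</instrument>
  <eventType>DIVIDEND</eventType>
</referenceData>
"""

def route(xml_text: str) -> str:
    """Pick a distribution topic by inspecting the message content."""
    root = ET.fromstring(xml_text)
    event = root.findtext("eventType", default="UNKNOWN")
    scheme = root.find("instrument").get("idScheme", "UNKNOWN")
    # Route different event types and identifier schemes to different consumers.
    return f"refdata.{scheme.lower()}.{event.lower()}"

print(route(message))  # e.g. "refdata.isin.dividend"
```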

While these solutions may be appropriate for retail transactions processing ‘WalMart style’ – one example the authors provide – the reality of global financial processing is somewhat more complicated. The U.S. retail goods sector doesn’t necessarily have the same concerns about ever-increasing data volumes, network latency and a morass of securities identifiers or event data, for a start. Or does it? The authors don’t tell us.

Despite these minor points, the paper is a must-read for anyone with responsibilities encompassing reference data. The well-researched report facilitates a “read what you need” approach – with background information provided as separate chapters.
