
Decentralized Data May Cost Firms $600 Million Annually

Decentralized reference data operations, and the effects of faulty data on operational risk and capital requirements, could be costing financial services institutions as much as an astonishing $600 million per firm annually, according to the findings of a recent research paper.

The paper – titled Operational Risk and Reference Data: Exploring Costs, Capital Requirements and Risk Mitigation – claims to be the first attempt to explore scientifically the costs of reference data and the effects of faulty data on operational risk and capital, especially under the upcoming Basel II requirements. The authors acknowledge that others have attempted to quantify these costs, but usually only in an anecdotally documented fashion.

The paper’s authors are: Allan Grody, president, Financial InterGroup; Fotios Harmantzis, PhD and assistant professor, School of Technology Management, Stevens Institute of Technology; and Gregory Kaple, senior partner, Integrated Management Services Inc.

Among the paper’s conclusions is that reference data systems are emerging as a financial application platform concept, distinct from the application logic that supports individual business processes across a financial enterprise. The authors also point out that being better at transactional reference data management confers no strategic advantage, and assert that, since faulty reference data lies at the core of key operational losses, an industry-wide initiative is the only way forward. The paper suggests the industry explore the concept of an industry utility to match and “clear” reference data.
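To make the “matching” idea concrete, the sketch below is a minimal, hypothetical illustration – not drawn from the paper – of the basic operation such a utility would perform: comparing the same security record from two contributing sources and flagging field-level breaks for resolution. The field names and sample values are purely illustrative.

```python
# Hypothetical sketch: compare the same security record from two
# contributors and flag field-level breaks, the basic operation a central
# reference data "matching" utility would perform. Field names and sample
# values are illustrative only, not taken from the paper.

def match_records(record_a: dict, record_b: dict) -> list:
    """Return (field, value_a, value_b) tuples where the contributors disagree."""
    breaks = []
    for field in sorted(set(record_a) | set(record_b)):
        value_a = record_a.get(field)
        value_b = record_b.get(field)
        if value_a != value_b:
            breaks.append((field, value_a, value_b))
    return breaks

if __name__ == "__main__":
    vendor_record = {
        "cusip": "037833100",
        "issuer": "APPLE INC",
        "currency": "USD",
    }
    custodian_record = {
        "cusip": "037833100",
        "issuer": "APPLE INC.",  # trailing period: a typical trivial break
        "currency": "USD",
    }
    for field, a, b in match_records(vendor_record, custodian_record):
        print(f"Break on {field!r}: {a!r} vs {b!r}")
```

In practice a utility would also have to decide which contributor is authoritative for each field and track the resolution of every break, which is where the governance questions discussed below come in.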

Although the paper includes detailed historical sections on both reference data and risk, oddly, it makes no mention of the failed Global Straight-Through Processing Association (GSTPA), which raised more than EUR 90 million to develop the Transaction Flow Manager (TFM). The central matching concept required a broad community of users for it to become a viable solution – something the GSTPA was unable to secure before it ceased operations just two months after launching the TFM.

Although central matching initially seemed a promising concept, it required a ‘big bang’ approach and a degree of cooperative engagement previously unseen and clearly not supported within the industry. Having had their collective fingers burned by the abandonment of that ambitious project, industry players generally shy away from any suggestion of revisiting this type of idea.

Weighing in at 88 pages, this paper provides an excellent primer on both reference data and the interaction of data in risk management under Basel II, including a useful historical context, albeit one with a U.S. listed securities bias.

With one of the principal authors a former member of BASIC (the Banking and Securities Industry Committee) – the committee formed in the aftermath of the New York Stock Exchange’s ‘paper crisis’ of 1968 and the champion of the CUSIP numbering system – it is little wonder that the paper has a U.S. perspective.

What is unique about the paper is its attempt to provide a business case for firms looking to justify investment in reference data systems. The authors feel the lack of a good business case is “thwarting the recognition of the pervasive nature of the issues and the cost and risk associated with reference data.”

A detailed methodology is used to estimate the direct cost of reference data (people, facilities, data, licenses, etc.), losses (fails and corporate actions) and capital costs.

In the case of capital costs, the authors created a “very preliminary calculation for operational risk capital associated with faulty reference data under the Basel guidelines” for the 15 largest U.S.-based firms.

And the big number is anywhere from $266 million to $600 million per firm per year. The paper details its methodology for firms that wish to do their own calculations.
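To make the arithmetic concrete, the sketch below is a purely illustrative back-of-the-envelope aggregation – it is not the paper’s methodology, and every input figure is a hypothetical placeholder. The 15% factor is the standard Basel II Basic Indicator Approach capital charge; the share of operational risk attributed to faulty reference data, and the cost of capital, are outright assumptions.

```python
# Illustrative only: a back-of-the-envelope aggregation of reference data
# costs, NOT the paper's methodology. All input figures are hypothetical
# placeholders. The 15% alpha is the published Basel II Basic Indicator
# Approach factor; the faulty-data share is purely an assumption.

def annual_reference_data_cost(
    direct_costs: float,             # people, facilities, data licenses, etc.
    fail_losses: float,              # losses from trade fails
    corporate_action_losses: float,  # losses from missed/late corporate actions
    avg_gross_income: float,         # three-year average annual gross income
    faulty_data_share: float,        # assumed share of op risk from bad data
    cost_of_capital: float = 0.10,   # assumed annual cost of holding capital
) -> float:
    """Rough estimate of one firm's annual cost of reference data."""
    # Basel II Basic Indicator Approach: capital = 15% of gross income.
    op_risk_capital = 0.15 * avg_gross_income
    # Carrying cost of the slice of that capital attributed to faulty data.
    capital_cost = op_risk_capital * faulty_data_share * cost_of_capital
    return direct_costs + fail_losses + corporate_action_losses + capital_cost

if __name__ == "__main__":
    estimate = annual_reference_data_cost(
        direct_costs=120e6,
        fail_losses=60e6,
        corporate_action_losses=40e6,
        avg_gross_income=20e9,
        faulty_data_share=0.25,
    )
    print(f"Estimated annual cost: ${estimate / 1e6:,.0f} million")
```

A firm following the paper’s actual methodology would replace these placeholders with its own measured direct costs, loss history and capital figures.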

Where the paper seems to stumble a bit is in its oversimplified approach to a solution for the industry. As noted above, a central matching utility was tried before and failed.

Although this may still be the best way forward for reference data, the failure to examine the causes of previous failures or to address industry concerns about the approach (e.g. who takes responsibility for financial losses?) paints the authors, perhaps unfairly, as slightly naïve.

Their technical argument supporting a central utility similarly seems to lack depth.

The authors assert that intelligent ‘content-enabled’ multicast networks using XML can be implemented to easily route reference data.

While these solutions may be appropriate for retail transaction processing, ‘Wal-Mart style’ – one example the authors provide – the reality of global financial processing is somewhat more complicated. The U.S. retail goods sector doesn’t necessarily have the same concerns about ever-increasing data volumes, network latency and a morass of securities identifiers or event data, for a start. Or does it? The authors don’t tell us.
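For readers unfamiliar with the concept the authors invoke, the sketch below shows, in much-simplified form, what content-based routing of an XML reference data message might look like: subscribers register predicates over message content and the router delivers each message only to those that match. The element names and subscriber topics are hypothetical, and the example says nothing about the network-level multicast the authors envisage.

```python
# Simplified, hypothetical sketch of content-based routing of XML reference
# data messages: each subscriber registers a predicate over message content
# and the router delivers only matching messages. Element names are
# illustrative; real multicast networking is out of scope here.
import xml.etree.ElementTree as ET
from typing import Callable

class ContentRouter:
    def __init__(self) -> None:
        self._subscribers: list[tuple[str, Callable[[ET.Element], bool]]] = []

    def subscribe(self, name: str, predicate: Callable[[ET.Element], bool]) -> None:
        self._subscribers.append((name, predicate))

    def publish(self, xml_message: str) -> None:
        root = ET.fromstring(xml_message)
        for name, predicate in self._subscribers:
            if predicate(root):
                print(f"Delivered {root.findtext('identifier')} to {name}")

if __name__ == "__main__":
    router = ContentRouter()
    # The equity desk only wants US equity reference data.
    router.subscribe(
        "equity-desk",
        lambda msg: msg.findtext("assetClass") == "EQUITY"
        and msg.findtext("market") == "US",
    )
    # The corporate actions team wants anything with a pending action.
    router.subscribe(
        "corporate-actions",
        lambda msg: msg.find("pendingAction") is not None,
    )
    router.publish(
        "<security>"
        "<identifier>037833100</identifier>"
        "<assetClass>EQUITY</assetClass>"
        "<market>US</market>"
        "<pendingAction>DIVIDEND</pendingAction>"
        "</security>"
    )
```

Even in this toy form, the hard parts the paper glosses over are visible: someone must standardise the message schema, the identifiers and the predicates before any routing fabric, multicast or otherwise, can add value.
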
Despite these minor points, the paper is a must-read for anyone with responsibilities encompassing reference data. The well-researched report facilitates a “read what you need” approach, with background information provided as separate chapters.
