Osney Event: Buy-Side Discuss Reference Data Resource, ROI

The resources of the buy-side often can’t compare with those of the sell-side when it comes to reference data management and measuring the cost impact of bad data in order to justify data management projects. This was one of the messages from a recent Technology Solutions for Asset Management (TSAM) conference organized by Osney Media.

Also discussed by the user panel at the data management stream of the event were the benefits of using third-party data from vendors compared with internal custom data and the related issue of who should accept liability for bad data.

Simon Leighton-Porter, formerly of Citigroup, outlined how the bank had built a model to generate cost figures related to data management down to the instrument level. The model took into account business workflow (for example, tracking a fax sent to a person who then had to re-key the data) and all other activities where bad reference data could impact the processing of securities transactions. It was then capable of generating ‘guesstimates’ of the cost associated with any errors.
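
To make the idea concrete, here is a minimal sketch of how such an instrument-level cost model might be structured; the event types, cost rates and figures below are illustrative assumptions, not details of the Citigroup model.

# Illustrative sketch: rolling up a per-instrument cost estimate from
# workflow events caused by bad reference data. All names and figures are hypothetical.
from dataclasses import dataclass

@dataclass
class WorkflowEvent:
    instrument_id: str    # identifier of the affected security
    event_type: str       # e.g. "fax_rekey", "manual_repair", "failed_settlement"
    minutes_spent: float  # staff time consumed by the event

COST_PER_MINUTE = 1.2    # assumed fully loaded cost of a staff minute
EVENT_PENALTY = {        # assumed fixed penalty per event type
    "fax_rekey": 5.0,
    "manual_repair": 25.0,
    "failed_settlement": 150.0,
}

def cost_guesstimate(events):
    """Aggregate a rough cost per instrument from bad-data workflow events."""
    totals = {}
    for e in events:
        cost = e.minutes_spent * COST_PER_MINUTE + EVENT_PENALTY.get(e.event_type, 0.0)
        totals[e.instrument_id] = totals.get(e.instrument_id, 0.0) + cost
    return totals

events = [
    WorkflowEvent("INSTR-001", "fax_rekey", 4),
    WorkflowEvent("INSTR-001", "failed_settlement", 15),
]
print(cost_guesstimate(events))  # approx. {'INSTR-001': 177.8}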

This model was then extended into the front office, where a comparison could be made between the number of basis points gained on, say, an individual Greek trade, and the cost of processing that trade to highlight where money is really being made and lost. Said Leighton-Porter, “This helps when you’re going to talk to the boss about projects.” The model, however, took between four and five months to develop.
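
As a purely hypothetical illustration of that front-office comparison, reusing the figure from the sketch above:

notional = 1_000_000                            # assumed trade size
bps_gained = 5                                  # assumed revenue on the trade, in basis points
gross_revenue = notional * bps_gained / 10_000  # 500.0
bad_data_cost = 177.8                           # per-instrument cost from the model sketch above
print(gross_revenue - bad_data_cost)            # net contribution, approx. 322.2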

This is not always easy for firms, particularly buy-side institutions, to put in place. Terri Humphreys, head of market activities at Baring Asset Management, responded by saying, “I don’t have the luxury of having the same level of resources as many brokers do to spend on such efforts. My real driver is the day-to-day processes that have to happen. I think the challenge is to get the information right beforehand so you don’t have to spend time afterwards analysing what went wrong.”

Merrill Lynch’s director of client services Geoff Mizen said his company had tried to do the same thing, but “it was horribly complicated so we gave up. I guess the key is getting the right data model.”

An audience member suggested an approach based on analysing hard-core examples: it took his team several weeks, he said, but they collected exposure data and conducted real “root cause” analysis. Leighton-Porter suggested that while such analysis works well for big events, such as the collapse of Enron, “it is the multitude of little failures that is killing us.”

David Hirschfeld, most recently with hedge fund Citadel Associates before joining Asset Control (RDR, May 2005), said metrics are very important. “If you can’t measure it, you can’t improve it.”

With or without measurements of data quality and its impact on the business, the importance of reference data is still clear. As Humphreys said, “Using bad data is like running an unleaded car on leaded fuel.” Leighton-Porter said, “Data has the potential to be a competitive differentiator. If it’s not right, you’re doomed to failure.”

The rise of derivatives and other structured products will cause more problems, as the related data is more complex than the data most firms are still struggling with now, suggested Humphreys.

Much of the challenge of getting the data right could be addressed if financial institutions worked together, she continued. “We could share the public domain information, particularly for KYC as there’s little competitive advantage in doing it ourselves, and it could save us paying lots of money to data vendors.”

Discussions on third-party data raised the issue of liability. Juergen Stahl, head of business support investment at German asset manager Union IT Services, put it bluntly: “If the data from a vendor is wrong then I want my money back.” Most audience members in a roundtable discussion, however, agreed there was no possibility of getting adequate liability coverage from the vendors; the responsibility will ultimately remain with the financial institution.
