Osney Event: Buy-Side Discuss Reference Data Resource, ROI

The resources of the buy-side often can’t compare with those of the sell-side when it comes to reference data management and measuring the cost impact of bad data in order to justify data management projects. This was one of the messages from a recent Technology Solutions for Asset Management (TSAM) conference organized by Osney Media.

The user panel in the event’s data management stream also discussed the benefits of using third-party data from vendors compared with internal custom data, and the related question of who should accept liability for bad data.

Simon Leighton-Porter, formerly of Citigroup, outlined how Citigroup had built a model to generate cost figures related to data management down to the instrument level. This model took into account business workflow, for example, tracking a fax sent to a person, who then had to re-key the data, and all other activities where bad reference data could impact the processing of securities transactions. The model was then capable of generating ‘guesstimates’ of the cost associated with any errors.
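
As a rough sketch of the kind of model described, the following Python attaches a unit cost and an error probability to each step of a business workflow and produces a ‘guesstimate’ of the expected per-transaction cost. All names and figures here are hypothetical illustrations, not Citigroup’s actual model.

```python
from dataclasses import dataclass

@dataclass
class WorkflowStep:
    name: str            # e.g. "receive fax", "manual re-key"
    unit_cost: float     # cost incurred each time the step runs
    error_rate: float    # probability this step introduces bad data
    repair_cost: float   # downstream cost of fixing an error from this step

def estimated_cost_per_trade(steps: list[WorkflowStep]) -> float:
    """'Guesstimate' of expected processing cost for one transaction,
    including the expected cost of reference data errors."""
    return sum(s.unit_cost + s.error_rate * s.repair_cost for s in steps)

# Hypothetical workflow: a faxed confirmation that must be re-keyed,
# where re-keying occasionally introduces a reference data error.
workflow = [
    WorkflowStep("receive fax", unit_cost=2.0, error_rate=0.0, repair_cost=0.0),
    WorkflowStep("manual re-key", unit_cost=5.0, error_rate=0.02, repair_cost=250.0),
]
print(f"Expected cost per trade: {estimated_cost_per_trade(workflow):.2f}")
```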

This model was then extended into the front office, where a comparison could be made between the number of basis points gained on, say, an individual Greek trade, and the cost of processing that trade to highlight where money is really being made and lost. Said Leighton-Porter, “This helps when you’re going to talk to the boss about projects.” The model, however, took between four and five months to develop.
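
The front-office comparison comes down to simple arithmetic: basis points earned on a trade, converted to currency, set against the cost of processing it. The sketch below uses invented figures purely for illustration.

```python
def trade_net_margin(notional: float, gross_bps: float, processing_cost: float) -> float:
    """Net result of a trade: gross profit in basis points minus processing cost."""
    gross_profit = notional * gross_bps / 10_000  # 1 basis point = 0.01% of notional
    return gross_profit - processing_cost

# Hypothetical trade in a Greek security: 5 bps on a 100,000 notional
# earns 50 gross; 40 of processing leaves only 10 net, so a single
# reference data error can easily turn the trade loss-making.
print(trade_net_margin(notional=100_000, gross_bps=5, processing_cost=40.0))
```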

This is not always easy for firms, particularly buy-side institutions, to put in place. Terri Humphreys, head of market activities at Baring Asset Management, responded by saying, “I don’t have the luxury of having the same level of resources as many brokers do to spend on such efforts. My real driver is the day-to-day processes that have to happen. I think the challenge is to get the information right beforehand so you don’t have to spend time afterwards analysing what went wrong.”

Merrill Lynch’s director of client services Geoff Mizen said his company had tried to do the same thing, but “it was horribly complicated so we gave up. I guess the key is getting the right data model.”

An audience member suggested an approach that analyses hard-core examples. It took his team several weeks, he said, but they collected exposure data and conducted real “root cause” analysis. Leighton-Porter suggested that while this is good analysis for big events, such as the collapse of Enron, “it is the multitude of little failures that is killing us.”

David Hirschfeld, most recently with hedge fund Citadel Associates before joining Asset Control (RDR, May 2005), said metrics are very important. “If you can’t measure it, you can’t improve it.”

With or without measurements of data quality and its impact on the business, the importance of reference data is still clear. As Humphreys said, “Using bad data is like running an unleaded car on leaded fuel.” Leighton-Porter said, “Data has the potential to be a competitive differentiator. If it’s not right, you’re doomed to failure.”

The rise of derivatives and other structured products is going to cause more of a problem, as the related data is more complex than the data most firms are still struggling with now, suggested Humphreys.

Much of the challenge of getting the data right could be addressed if financial institutions worked together, she continued. “We could share the public domain information, particularly for KYC as there’s little competitive advantage in doing it ourselves, and it could save us paying lots of money to data vendors.”

Discussions on third-party data raised the issue of liability. Put bluntly, Juergen Stahl, head of business support investment at German asset manager Union IT Services, said, “If the data from a vendor is wrong then I want my money back.” Most audience members in a roundtable discussion, however, agreed there was no possibility of getting adequate liability coverage from the vendors. Responsibility will ultimately remain with the financial institution.
