Osney Event: Buy-Side Discuss Reference Data Resource, ROI

The resources of the buy-side often can’t compare with those of the sell-side when it comes to reference data management and measuring the cost impact of bad data in order to justify data management projects. This was one of the messages from a recent Technology Solutions for Asset Management (TSAM) conference organized by Osney Media.

Also discussed by the user panel at the data management stream of the event were the benefits of using third-party data from vendors compared with internal custom data and the related issue of who should accept liability for bad data.

Simon Leighton-Porter, formerly of Citigroup, outlined how the bank had built a model to generate cost figures related to data management down to the instrument level. The model took into account business workflow, for example tracking a fax sent to a person who then had to re-key the data, and all other activities where bad reference data could affect the processing of securities transactions. It was then capable of generating ‘guesstimates’ of the cost associated with any errors.

This model was then extended into the front office, where a comparison could be made between the number of basis points gained on, say, an individual Greek trade, and the cost of processing that trade to highlight where money is really being made and lost. Said Leighton-Porter, “This helps when you’re going to talk to the boss about projects.” The model, however, took between four and five months to develop.
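A minimal sketch of the kind of per-trade comparison described above, assuming illustrative field names, cost drivers and figures; none of these details are taken from the Citigroup model itself:

```python
# Hypothetical sketch of comparing front-office revenue on a trade with the
# estimated cost of processing it. All names and numbers are illustrative
# assumptions, not details of the model discussed at the event.

from dataclasses import dataclass

@dataclass
class Trade:
    instrument: str
    notional: float              # trade size in the trade currency
    basis_points_gained: float   # front-office margin on the trade, in bp
    processing_events: int       # manual touches, e.g. a fax re-keyed by hand
    cost_per_event: float        # assumed average cost of one manual fix

def net_contribution(trade: Trade) -> float:
    """Revenue earned on the trade minus the estimated cost of processing it."""
    revenue = trade.notional * trade.basis_points_gained / 10_000
    processing_cost = trade.processing_events * trade.cost_per_event
    return revenue - processing_cost

# Example: a small Greek equity trade whose margin is wiped out by manual repairs.
trade = Trade("GR equity", notional=250_000, basis_points_gained=3.0,
              processing_events=4, cost_per_event=60.0)
print(f"Net contribution: {net_contribution(trade):,.2f}")  # 75.00 - 240.00 = -165.00
```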

This is not always easy for firms, particularly buy-side institutions, to put in place. Terri Humphreys, head of market activities at Baring Asset Management, responded by saying, “I don’t have the luxury of having the same level of resources as many brokers do to spend on such efforts. My real driver is the day-to-day processes that have to happen. I think the challenge is to get the information right beforehand so you don’t have to spend time afterwards analysing what went wrong.”

Merrill Lynch’s director of client services Geoff Mizen said his company had tried to do the same thing, but “it was horribly complicated so we gave up. I guess the key is getting the right data model.”

An audience member suggested an approach based on analysing hard-core examples: it took his team several weeks, he said, but they collected exposure data and conducted a genuine “root cause” analysis. Leighton-Porter responded that while such analysis works well for big events, such as the collapse of Enron, “it is the multitude of little failures that is killing us.”

David Hirschfeld, most recently with hedge fund Citadel Associates before joining Asset Control (RDR, May 2005), said metrics are very important. “If you can’t measure it, you can’t improve it.”

With or without measurements of data quality and its impact on the business, the importance of reference data is still clear. As Humphreys said, “Using bad data is like running an unleaded car on leaded fuel.” Leighton-Porter said, “Data has the potential to be a competitive differentiator. If it’s not right, you’re doomed to failure.”

The rise of derivatives and other structured products is going to cause more of a problem, as the related data is more complex than the data most are still struggling with now, suggested Humphreys.

Getting a lot of the data right could be addressed if the financial institutions worked together, she continued. “We could share the public domain information, particularly for KYC as there’s little competitive advantage in doing it ourselves, and it could save us paying lots of money to data vendors.”

Discussions on third-party data raised the issue of liability. Put bluntly, Juergen Stahl, head of business support investment at German asset manager Union IT Services, said, “If the data from a vendor is wrong then I want my money back.” Most audience members in a roundtable discussion, however, agreed there was no possibility of getting liability coverage to any adequate level from the vendors. The responsibility will ultimately remain with the financial institution.
