About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Osney Event: Buy-Side Discuss Reference Data Resource, ROI

The resources of the buy-side often can’t compare with those of the sell-side when it comes to reference data management and to measuring the cost impact of bad data to justify data management projects. This was one of the messages from a recent Technology Solutions for Asset Management (TSAM) conference organized by Osney Media.

Also discussed by the user panel at the data management stream of the event were the benefits of using third-party data from vendors compared with internal custom data and the related issue of who should accept liability for bad data.

Simon Leighton-Porter, formerly of Citigroup, outlined how Citigroup had built a model to generate cost figures related to data management down to the instrument level. This model took into account business workflow, for example, tracking a fax sent to a person, who then had to re-key the data, and all other activities where bad reference data could impact the processing of securities transactions. The model was then capable of generating ‘guesstimates’ of the cost associated with any errors.

This model was then extended into the front office, where a comparison could be made between the number of basis points gained on, say, an individual Greek trade, and the cost of processing that trade to highlight where money is really being made and lost. Said Leighton-Porter, “This helps when you’re going to talk to the boss about projects.” The model, however, took between four and five months to develop.
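As a rough illustration of the front-office comparison Leighton-Porter describes, the per-trade check might look like the sketch below. The figures and function name are hypothetical; the article does not detail how Citigroup's actual model was built.

```python
# Hypothetical sketch: compare the revenue earned on a trade (in basis
# points on notional) against the estimated cost of processing it,
# including the 'guesstimated' cost of any reference-data errors.
# All numbers are illustrative, not Citigroup's.

def trade_net_pnl(notional, gained_bps, processing_cost):
    """Return net profit: basis-point revenue minus processing cost."""
    revenue = notional * gained_bps / 10_000  # 1 bp = 0.01% of notional
    return revenue - processing_cost

# A trade earning 2 bps on a 1,000,000 notional yields 200 in revenue;
# if bad reference data pushes processing cost to 250, the trade loses money.
net = trade_net_pnl(1_000_000, 2, 250)
print(net)  # -50.0
```

Aggregated across a book, a check like this is what lets a firm "highlight where money is really being made and lost" at the level of individual trades.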

This is not always easy for firms, particularly buy-side institutions, to put in place. Terri Humphreys, head of market activities at Baring Asset Management, responded by saying, “I don’t have the luxury of having the same level of resources as many brokers do to spend on such efforts. My real driver is the day-to-day processes that have to happen. I think the challenge is to get the information right beforehand so you don’t have to spend time afterwards analysing what went wrong.”

Merrill Lynch’s director of client services Geoff Mizen said his company had tried to do the same thing, but “it was horribly complicated so we gave up. I guess the key is getting the right data model.”

An audience member suggested an approach that analyses hard-core examples. It took his team several weeks, he said, but they collected exposure data and conducted real “root cause” analysis. Leighton-Porter countered that while such analysis works well for big events, such as the collapse of Enron, “it is the multitude of little failures that is killing us.”

David Hirschfeld, most recently with hedge fund Citadel Associates before joining Asset Control (RDR, May 2005), said metrics are very important. “If you can’t measure it, you can’t improve it.”

With or without measurements of data quality and its impact on the business, the importance of reference data is still clear. As Humphreys said, “Using bad data is like running an unleaded car on leaded fuel.” Leighton-Porter said, “Data has the potential to be a competitive differentiator. If it’s not right, you’re doomed to failure.”

The rise of derivatives and other structured products will cause further problems, as the related data is more complex than the data most firms are still struggling with now, suggested Humphreys.

Getting a lot of the data right could be addressed if the financial institutions worked together, she continued. “We could share the public domain information, particularly for KYC as there’s little competitive advantage in doing it ourselves, and it could save us paying lots of money to data vendors.”

Discussions on third-party data raised the issue of liability. Put bluntly, Juergen Stahl, head of business support investment at German asset manager Union IT Services, said, “If the data from a vendor is wrong then I want my money back.” Most audience members in a roundtable discussion, however, agreed there was no possibility of getting liability coverage to any adequate level from the vendors. The responsibility will remain ultimately with the financial institution.
