Osney Event: Buy-Side Discuss Reference Data Resource, ROI

The resources of the buy-side often can’t compare with those of the sell-side when it comes to reference data management and measuring the cost impact of bad data in order to justify data management projects. This was one of the messages from a recent Technology Solutions for Asset Management (TSAM) conference organized by Osney Media.

Also discussed by the user panel at the data management stream of the event were the benefits of using third-party data from vendors compared with internal custom data and the related issue of who should accept liability for bad data.

Simon Leighton-Porter, formerly of Citigroup, outlined how Citigroup had built a model to generate cost figures related to data management down to the instrument level. This model took into account business workflow, for example, tracking a fax sent to a person, who then had to re-key the data, and all other activities where bad reference data could impact the processing of securities transactions. The model was then capable of generating ‘guesstimates’ of the cost associated with any errors.
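
As a purely illustrative sketch of that kind of model (the event types, cost figures and field names below are hypothetical assumptions, not Citigroup’s actual data), the ‘guesstimated’ cost of each manual touch caused by bad reference data can be rolled up to the instrument level from logged workflow events:

    from collections import defaultdict
    from dataclasses import dataclass

    # Hypothetical guesstimates of the cost of each manual touch caused by
    # bad reference data (reading a fax, re-keying data, repairing a trade).
    EVENT_COST_GUESSTIMATE = {
        "fax_received": 5.00,
        "manual_rekey": 12.50,
        "trade_repair": 40.00,
    }

    @dataclass
    class WorkflowEvent:
        instrument_id: str   # the instrument whose bad data triggered the event
        event_type: str      # which manual step was needed

    def cost_per_instrument(events):
        """Roll guesstimated error costs up to the instrument level."""
        totals = defaultdict(float)
        for e in events:
            totals[e.instrument_id] += EVENT_COST_GUESSTIMATE.get(e.event_type, 0.0)
        return dict(totals)

    events = [
        WorkflowEvent("XS0123456789", "fax_received"),
        WorkflowEvent("XS0123456789", "manual_rekey"),
        WorkflowEvent("DE0005557508", "trade_repair"),
    ]
    print(cost_per_instrument(events))
    # {'XS0123456789': 17.5, 'DE0005557508': 40.0}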

This model was then extended into the front office, where a comparison could be made between the number of basis points gained on, say, an individual Greek trade, and the cost of processing that trade to highlight where money is really being made and lost. Said Leighton-Porter, “This helps when you’re going to talk to the boss about projects.” The model, however, took between four and five months to develop.
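
Again as a hedged sketch with hypothetical numbers rather than the firm’s own figures, the front-office comparison amounts to converting the basis points earned on a trade into currency terms and setting that against the guesstimated cost of processing it:

    def net_trade_result(notional: float, bps_gained: float, processing_cost: float) -> float:
        """Gross gain (basis points on notional) minus the guesstimated processing cost."""
        gross = notional * bps_gained / 10_000   # 1 basis point = 0.01% of notional
        return gross - processing_cost

    # Hypothetical Greek trade: 1m notional earning 3 bps, costing 450 to process.
    print(net_trade_result(1_000_000, 3, 450))   # -150.0, i.e. money is being lost here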

This is not always easy for firms, particularly buy-side institutions, to put in place. Terri Humphreys, head of market activities at Baring Asset Management, responded by saying, “I don’t have the luxury of having the same level of resources as many brokers do to spend on such efforts. My real driver is the day-to-day processes that have to happen. I think the challenge is to get the information right beforehand so you don’t have to spend time afterwards analysing what went wrong.”

Merrill Lynch’s director of client services Geoff Mizen said his company had tried to do the same thing, but “it was horribly complicated so we gave up. I guess the key is getting the right data model.”

An audience member suggested an approach that analyses hard-core examples. It took his team several weeks, he said, but they collected exposure data and conducted real “root cause” analysis. Leighton-Porter countered that while such analysis is good for big events, such as the collapse of Enron, “it is the multitude of little failures that is killing us.”

David Hirschfeld, most recently with hedge fund Citadel Associates before joining Asset Control (RDR, May 2005), said metrics are very important. “If you can’t measure it, you can’t improve it.”

With or without measurements of data quality and its impact on the business, the importance of reference data is still clear. As Humphreys said, “Using bad data is like running an unleaded car on leaded fuel.” Leighton-Porter said, “Data has the potential to be a competitive differentiator. If it’s not right, you’re doomed to failure.”

The rise of derivatives and other structured products is going to cause more of a problem, as the related data is more complex than the data most firms are still struggling with now, suggested Humphreys.

Getting a lot of the data right could be addressed if the financial institutions worked together, she continued. “We could share the public domain information, particularly for KYC as there’s little competitive advantage in doing it ourselves, and it could save us paying lots of money to data vendors.”

Discussions on third-party data raised the issue of liability. Put bluntly, Juergen Stahl, head of business support investment at German asset manager Union IT Services, said, “If the data from a vendor is wrong then I want my money back.” Most audience members in a roundtable discussion, however, agreed there was no realistic prospect of obtaining adequate liability coverage from the vendors; responsibility will ultimately remain with the financial institution.
