About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Understanding Risk is Key to Effective Data Management, Says BNY Mellon’s Cox


Institutions need to invest in automated processes and good quality staff in order to adequately assess their risk exposure within the data management process, said BNY Mellon’s Matthew Cox, head of securities data management, EMEA. They also need to understand the internal risks posed by poor quality data and examine what vendors can and can’t provide in terms of data accuracy, he told delegates at FIMA 2008.

“Identifying the source of inaccurate data is key to mitigating risk in the current environment,” he explained. “BNY Mellon does this by dual sourcing data and producing golden copy.”
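The dual-sourcing approach Cox describes can be sketched in a few lines: two vendor feeds for the same instrument are compared field by field, agreeing values go into the golden record, and disagreements are routed to a data steward. The field names and feed layout below are illustrative assumptions, not BNY Mellon’s actual model.

```python
# Hypothetical sketch of dual-source "golden copy" creation. Where the two
# vendor feeds agree, the value is trusted; where they differ, the field is
# flagged for manual review rather than silently picked from one side.

def build_golden_copy(feed_a: dict, feed_b: dict) -> tuple[dict, list[str]]:
    golden = {}
    discrepancies = []
    for field in feed_a.keys() & feed_b.keys():
        if feed_a[field] == feed_b[field]:
            golden[field] = feed_a[field]   # both vendors agree
        else:
            discrepancies.append(field)     # route to a data steward
    return golden, discrepancies

vendor_a = {"isin": "US0378331005", "coupon": 0.0, "currency": "USD"}
vendor_b = {"isin": "US0378331005", "coupon": 0.0, "currency": "US"}
golden, issues = build_golden_copy(vendor_a, vendor_b)
# golden holds the agreed fields; issues == ["currency"]
```

In practice a golden-copy process would also weight sources by reliability per field, but the agree/flag split above is the core of the risk-mitigation idea.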

All models have their issues, but applying “common sense” and stepping back from the problem, rather than merely reacting to issues as they arise, is the best approach, he said. “We need to understand what the end client wants to use the data for and whether they pass that data on to another party,” Cox explained. “This is in order to more accurately measure client and financial risk, which must both be taken into account at the end of the day.”

When an institution services multiple clients, data quality has a significant impact on reputational risk, he continued. In the custody business, for example, client service is of critical importance and controls must be built in to account for data discrepancies and increased risk exposure.

There is obviously more risk involved in manual processes, so automation and straight through processing (STP) are one way of dealing with these risks, said Cox. “There is also a need for good quality people to be involved in the data management process, but these people should not be put in a position where they often have to make judgement calls about data,” he explained.

He warned delegates that there is “no hiding place” from problems caused by data, as institutions cannot simply blame vendors: the institution must take responsibility for its services. To make sure the data is correct, tolerance levels must be set and data must be checked regularly, he explained. “Checks and controls must be built around the areas where the most problems occur and where the risks are greatest. Keeping a finger on the pulse, rather than taking single snapshots, allows institutions to react in a more timely manner.”
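A tolerance check of the kind Cox mentions can be as simple as comparing each monitored field’s day-on-day movement against a threshold. The fields and thresholds below are assumed for illustration; the article does not specify BNY Mellon’s actual tolerances.

```python
# Illustrative day-on-day tolerance check: moves outside the per-field
# tolerance are flagged for review rather than silently accepted.

TOLERANCES = {"price": 0.05, "yield": 0.10}  # max allowed relative change

def breaches(previous: dict, current: dict) -> list[str]:
    flagged = []
    for field, tol in TOLERANCES.items():
        old, new = previous[field], current[field]
        if old and abs(new - old) / abs(old) > tol:
            flagged.append(field)
    return flagged

flagged = breaches({"price": 100.0, "yield": 2.0},
                   {"price": 108.0, "yield": 2.05})
# price moved 8% (beyond the 5% tolerance) -> flagged == ["price"]
```

Running such a check on every feed delivery, rather than as an occasional snapshot, is the “finger on the pulse” monitoring the quote describes.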

It is also unrealistic to expect 100% integrity of data from vendors, he contended, as inaccuracies can be down to issues with the underlying source data.

BNY Mellon uses a data vendor scorecard with red, amber and green scores to measure the metrics being met (or not) by its vendors. “The facts speak for themselves in this way and we have control over our vendor relationships – we can prove that they need to improve in certain areas with hard evidence,” Cox explained.
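A red/amber/green scorecard of the sort described can be modelled minimally as metrics with targets and banded scores. The metric names, targets and band widths below are assumptions for illustration; the article does not describe BNY Mellon’s actual criteria.

```python
# Minimal RAG (red/amber/green) vendor scorecard sketch with assumed bands:
# green at or above target, amber within 5 points of target, red below that.

def rag_score(actual: float, target: float) -> str:
    if actual >= target:
        return "green"
    if actual >= target - 5:
        return "amber"
    return "red"

metrics = {
    # metric name: (target, actual this period) -- illustrative values
    "timeliness_pct": (99.0, 97.5),
    "accuracy_pct":   (99.5, 99.7),
    "coverage_pct":   (98.0, 90.0),
}

scorecard = {name: rag_score(actual, target)
             for name, (target, actual) in metrics.items()}
# timeliness is amber, accuracy green, coverage red
```

The point of the scorecard, as Cox notes, is that vendor conversations can then be grounded in recorded metric history rather than impressions.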

Reprising Largier’s earlier point, Cox also discussed the benefits of working in partnership with the vendor community and producing detailed service level agreements to adequately measure performance at a basic level.
