
Understanding Risk is Key to Effective Data Management, Says BNY Mellon’s Cox


Institutions need to invest in automated processes and good quality staff in order to adequately assess their risk exposure within the data management process, said BNY Mellon’s Matthew Cox, head of securities data management, EMEA. They also need to understand the internal risks posed by poor quality data and examine what vendors can and can’t provide in terms of data accuracy, he told delegates at FIMA 2008.

“Identifying the source of inaccurate data is key to mitigating risk in the current environment,” he explained. “BNY Mellon does this by dual sourcing data and producing golden copy.”
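In practice, dual sourcing of this kind means comparing records from two vendors field by field, promoting agreed values to the golden copy and routing mismatches to an analyst. The following is a minimal Python sketch of the pattern, assuming hypothetical field names and a simple primary-source precedence rule; it is not BNY Mellon’s actual model.

```python
# Illustrative sketch only: field-by-field comparison of two hypothetical
# vendor records, promoting agreed values to a "golden copy" and flagging
# discrepancies for manual review. Field names and the primary-source
# precedence rule are assumptions, not BNY Mellon's actual model.

def build_golden_copy(primary: dict, secondary: dict, fields: list[str]):
    golden, discrepancies = {}, []
    for field in fields:
        a, b = primary.get(field), secondary.get(field)
        if a == b:
            golden[field] = a   # both sources agree: promote to golden copy
        else:
            golden[field] = a   # default to primary source pending review
            discrepancies.append((field, a, b))
    return golden, discrepancies

vendor_a = {"isin": "US0378331005", "coupon": 0.0, "currency": "USD"}
vendor_b = {"isin": "US0378331005", "coupon": 0.0, "currency": "usd"}

golden, issues = build_golden_copy(vendor_a, vendor_b, ["isin", "coupon", "currency"])
print(golden)   # {'isin': 'US0378331005', 'coupon': 0.0, 'currency': 'USD'}
print(issues)   # [('currency', 'USD', 'usd')] -> routed to an analyst
```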

All models have their issues, but “common sense” and stepping back from the problem rather than merely reacting to issues as they arise is the best approach, he said. “We need to understand what the end client wants to use the data for and whether they pass that data on to another party,” Cox explained. “This is in order to more accurately measure client and financial risk, which must both be taken into account at the end of the day.”

When an institution services multiple clients, data problems can have a significant impact on reputational risk, he continued. In the custody business, for example, client service is of critical importance and controls must be built in to account for data discrepancies and increased risk exposure.

There is obviously more risk involved in manual processes, so automation and straight through processing (STP) are one way of dealing with these risks, said Cox. “There is also a need for good quality people to be involved in the data management process, but these people should not be put in a position where they often have to make judgement calls about data,” he explained.

He warned delegates that there is “no hiding place” from problems caused by data, as institutions cannot simply blame vendors: the institution must take responsibility for its services. To make sure the data is correct, tolerance levels must be set and data must be regularly checked, he explained. “Checks and controls must be built around the areas where the most problems occur and where the risks are greatest. A finger on the pulse rather than single snapshots allows institutions to react in a more timely manner.”
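A tolerance check of the kind Cox describes can be as simple as flagging any value that moves more than a set percentage from its previous level, so that staff review exceptions rather than every record. The sketch below assumes a hypothetical 5% day-on-day price tolerance purely for illustration:

```python
# Illustrative sketch only: flag records whose day-on-day price move exceeds
# a set tolerance, so staff review the exceptions rather than every record.
# The 5% level and the sample data are assumptions for the example.

TOLERANCE = 0.05  # flag moves greater than 5%

def exceeds_tolerance(previous_price: float, current_price: float) -> bool:
    if previous_price == 0:
        return True  # cannot compute a move; always route to review
    return abs(current_price - previous_price) / abs(previous_price) > TOLERANCE

prices = [("US0378331005", 100.0, 101.2), ("GB0002634946", 50.0, 61.0)]
exceptions = [isin for isin, prev, curr in prices if exceeds_tolerance(prev, curr)]
print(exceptions)  # ['GB0002634946'] -> checked before release downstream
```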

It is also unrealistic to expect 100% integrity of data from vendors, he contended, as inaccuracies can be down to issues with the underlying source data.

BNY Mellon uses a data vendor scorecard with red, amber and green scores to track which metrics are being met (or missed) by its vendors. “The facts speak for themselves in this way and we have control over our vendor relationships – we can prove that they need to improve in certain areas with hard evidence,” Cox explained.
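A red/amber/green (RAG) scorecard of this sort typically maps each vendor metric against an agreed target, with an amber band for near misses. The sketch below uses assumed metrics, targets and thresholds purely for illustration; it is not BNY Mellon’s actual scorecard:

```python
# Illustrative sketch only: a red/amber/green vendor scorecard of the kind
# Cox describes. Metrics, targets and the amber margin are assumptions.

def rag_score(actual: float, target: float, amber_margin: float = 0.02) -> str:
    if actual >= target:
        return "green"                      # metric met
    if actual >= target - amber_margin:
        return "amber"                      # near miss: monitor
    return "red"                            # metric missed: escalate

# Hypothetical monthly metrics: (name, actual, target)
vendor_metrics = [
    ("timeliness", 0.99, 0.98),
    ("accuracy", 0.97, 0.995),
    ("coverage", 0.985, 0.99),
]

scorecard = {name: rag_score(actual, target) for name, actual, target in vendor_metrics}
print(scorecard)  # {'timeliness': 'green', 'accuracy': 'red', 'coverage': 'amber'}
```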

Reprising Largier’s earlier point, Cox also discussed the benefits of working in partnership with the vendor community and producing detailed service level agreements to adequately measure performance at a basic level.
