
Understanding Risk is Key to Effective Data Management, Says BNY Mellon’s Cox

Institutions need to invest in automated processes and good quality staff in order to adequately assess their risk exposure within the data management process, said BNY Mellon’s Matthew Cox, head of securities data management, EMEA. They also need to understand the internal risks posed by poor quality data and examine what vendors can and cannot provide in terms of data accuracy, he told delegates at FIMA 2008.

“Identifying the source of inaccurate data is key to mitigating risk in the current environment,” he explained. “BNY Mellon does this by dual sourcing data and producing golden copy.”
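To make the mechanics concrete, a minimal sketch of the kind of dual-sourcing comparison Cox describes is shown below: two vendor feeds keyed by ISIN are compared, agreeing records are promoted to golden copy, and mismatches are queued for review. The data, field names and exception handling are illustrative assumptions, not BNY Mellon’s actual implementation.

```python
# Minimal dual-sourcing sketch: compare two vendor feeds keyed by ISIN,
# promote agreeing records to golden copy, and queue mismatches for
# human review. All data and field names are hypothetical.

def build_golden_copy(feed_a: dict, feed_b: dict) -> tuple[dict, list]:
    """Return (golden_copy, exceptions) from two vendor feeds."""
    golden, exceptions = {}, []
    for isin in feed_a.keys() & feed_b.keys():  # instruments both vendors cover
        if feed_a[isin] == feed_b[isin]:
            golden[isin] = feed_a[isin]  # vendors agree: promote to golden copy
        else:
            exceptions.append((isin, feed_a[isin], feed_b[isin]))  # review queue
    return golden, exceptions

feed_a = {"US0378331005": {"currency": "USD", "price": 102.5}}
feed_b = {"US0378331005": {"currency": "USD", "price": 102.5}}
golden, exceptions = build_golden_copy(feed_a, feed_b)
print(golden)      # {'US0378331005': {'currency': 'USD', 'price': 102.5}}
print(exceptions)  # []
```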

All models have their issues, but “common sense” and stepping back from the problem, rather than merely reacting to issues as they arise, is the best approach, he said. “We need to understand what the end client wants to use the data for and whether they pass that data on to another party,” Cox explained. “This is in order to more accurately measure client and financial risk, which must both be taken into account at the end of the day.”

When an institution services multiple clients, data errors can have a significant impact on reputational risk, he continued. In the custody business, for example, client service is of critical importance and controls must be built in to account for data discrepancies and increased risk exposure.

There is obviously more risk involved in manual processes, and automation and straight through processing (STP) are therefore one way of dealing with these risks, said Cox. “There is also a need for good quality people to be involved in the data management process, but these people should not be put in a position where they often have to make judgement calls about data,” he explained.

He warned delegates that there is “no hiding place” from problems caused by data, as institutions cannot simply blame vendors: the institution must take responsibility for its services. To make sure the data is correct, tolerance levels must be set and data must be checked regularly, he explained. “Checks and controls must be built around the areas where the most problems occur and where the risks are greatest. A finger on the pulse, rather than single snapshots, allows institutions to react in a more timely manner.”
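As a rough illustration of the tolerance checking Cox refers to, the sketch below flags day-on-day price moves that breach a configured threshold so they can be reviewed before reaching clients. The 5% tolerance and the sample records are assumptions for the example, not figures from the talk.

```python
# Illustrative tolerance check: flag day-on-day price moves that exceed
# a configured threshold so they are reviewed before reaching clients.
# The 5% tolerance and the sample data are hypothetical.

TOLERANCE = 0.05  # maximum acceptable relative day-on-day move

def breaches_tolerance(previous: float, current: float) -> bool:
    """True when the relative move exceeds the tolerance."""
    if previous == 0:
        return True  # relative move undefined: escalate for review
    return abs(current - previous) / abs(previous) > TOLERANCE

prices = {"US0378331005": (100.0, 92.0), "GB0002634946": (50.0, 50.4)}
suspect = [isin for isin, (prev, curr) in prices.items()
           if breaches_tolerance(prev, curr)]
print(suspect)  # ['US0378331005'] (an 8% move breaches the 5% tolerance)
```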

It is also unrealistic to expect 100% data integrity from vendors, he contended, as inaccuracies can stem from issues with the underlying source data.

BNY Mellon uses a data vendor scorecard with red, amber and green ratings to track whether its vendors are meeting agreed metrics. “The facts speak for themselves in this way and we have control over our vendor relationships – we can prove that they need to improve in certain areas with hard evidence,” Cox explained.
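A scorecard of this kind might be modelled as in the sketch below, which maps a measured service metric to a red, amber or green rating. The metric names and thresholds are assumptions chosen for illustration, not BNY Mellon’s actual scheme.

```python
# Illustrative red/amber/green vendor scorecard. Thresholds and metric
# names are hypothetical, chosen only to show the shape of the approach.

def rag_rating(met_pct: float, green_at: float = 99.0, amber_at: float = 95.0) -> str:
    """Rate a metric (e.g. % of records delivered on time and correct)."""
    if met_pct >= green_at:
        return "green"
    if met_pct >= amber_at:
        return "amber"
    return "red"

scorecard = {
    "timeliness": rag_rating(99.4),    # "green"
    "accuracy": rag_rating(96.2),      # "amber"
    "completeness": rag_rating(91.0),  # "red"
}
print(scorecard)
```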

Echoing Largier’s earlier point, Cox also discussed the benefits of working in partnership with the vendor community and of producing detailed service level agreements to measure performance at a basic level.
