
Engineering Approach to Data Management Challenges of New Regulation is Best, Says HSBC’s Johnson

The deluge of regulatory change facing the data management community is best tackled by adopting an engineering approach, according to Chris Johnson, head of product management, market data services at HSBC Securities Services. Speaking at a recent Thomson Reuters pricing event as a representative of the data management community rather than on behalf of HSBC, Johnson explained that data managers should pre-empt regulatory intervention by examining the real changes required under new regulations and planning accordingly.

“Forthcoming regulatory initiatives are best tackled using an ‘engineering’ approach to establish what data changes are needed, if any, so that planning can take place in advance before it is too late to implement the changes properly,” he said. That way, firms can adapt their data systems to meet the requirements, rather than being overwhelmed by the potential repercussions. For example, examining the data management impacts of a regulation such as MiFID indicates that the biggest change required for fund managers is the alteration of a reporting data field.

Johnson added that the recent fines handed out to firms are the result of operational issues rather than of fundamentally incorrect reference data. “In my view it is wrong to take a purely binary view that data is either right or wrong, or high or low quality. In practice it is most likely just different. In my experience the number of actual data quality issues from vendors is extremely low, but apparently similar fields can actually serve different business purposes, therefore giving the appearance of being erroneous,” he elaborated.

“I have taken a keen interest in the root causes of fines that have been imposed on banks by regulators for transaction reporting errors,” he continued. “None that I have read about were caused by poor quality data from data vendors; all those instances I have seen were caused instead by operational errors.”

It is therefore rare to find fundamental data quality issues, but the siloed nature of financial institutions’ systems is often the culprit when it comes to regulatory data infractions. Gaps in the overall data architecture, created by workarounds and tactical approaches to new regulatory requirements, exacerbate the problem. By this logic, in order to stay out of the regulatory spotlight, firms should focus on tackling these operational issues and on understanding the required output in terms of data items.
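To make the “different, not wrong” point concrete, here is a minimal sketch using entirely invented vendor records: two apparently identical “price” fields are defined differently, so a naive cross-vendor comparison reports a discrepancy where none exists.

```python
# Entirely hypothetical feeds: both expose a "price" field for the same
# instrument, but each field serves a different business purpose.
vendor_a = {"isin": "GB0000000000", "price": 101.42}  # mid price at snapshot
vendor_b = {"isin": "GB0000000000", "price": 101.55}  # official closing price

# A naive cross-vendor comparison flags a "data quality issue"...
if vendor_a["price"] != vendor_b["price"]:
    print("Apparent discrepancy:", vendor_a["price"], "vs", vendor_b["price"])

# ...but recording each field's business meaning shows the values are not
# directly comparable, so the difference is expected rather than erroneous.
field_semantics = {
    ("vendor_a", "price"): "mid price at snapshot",
    ("vendor_b", "price"): "official closing price",
}
comparable = (field_semantics[("vendor_a", "price")]
              == field_semantics[("vendor_b", "price")])
print("Directly comparable:", comparable)  # False
```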

Attendees at the Thomson Reuters event were seemingly in agreement that accuracy of data is paramount going forward, with 60% of respondents to the interactive poll citing it as the most important criterion for judging quality. As for the other answers: completeness was cited by 17%, timeliness by 4%, consistency by 15% and provenance by 4%.

As previously noted by Reference Data Review, regulators are taking a keen interest in data quality, as evidenced by the Office of Financial Research proposals in the US financial services bill and the European Central Bank’s (ECB) reference data utility proposals.

However, many attendees at the event were not hopeful about the potentially positive impact of regulatory intervention in the space. A total of 46% of respondents to the interactive poll said they did not “expect much” from the regulatory changes going on across the globe. The majority, at 53%, were hopeful that regulation would have some impact in the future, but only 1% said it had already had a positive influence.

Johnson thinks regulators should take a different tack from the proposed utilities: “In response to the ECB proposal for a new instrument data utility the best suggestion, in my view, was made by Angela Knight (the chair of the British Bankers Association) at an Xtrakter conference in the second quarter of 2009. When asked whether the credit crunch would lead to investment in new data infrastructure, she replied that it would instead make more sense to concentrate on linking up existing databases rather than building new ones.”

He contended that one of the “quick wins” that could help resolve consistency and compatibility issues is unique asset identification. “Neither ISINs nor Sedols are sufficiently granular to act as unique asset IDs; however, both Reuters and Bloomberg support ready-made and fit-for-purpose unique asset identifiers,” he told attendees. “But these cannot be used freely for data linkage purposes purely due to their proprietary vendor licensing restrictions. If these identifiers could be made open to all financial institutions, many of the perceived data quality issues across the industry could disappear very quickly.”
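As an illustration of the granularity problem (the ISIN, venue codes and currencies below are fabricated for the example), a single ISIN can resolve to several distinct venue listings, and uniqueness only returns once the key is widened to something like ISIN plus MIC plus currency:

```python
from collections import defaultdict

# Fabricated records for illustration only: one ISIN, three venue listings.
listings = [
    {"isin": "GB0000000000", "mic": "XLON", "currency": "GBP"},
    {"isin": "GB0000000000", "mic": "XHKG", "currency": "HKD"},
    {"isin": "GB0000000000", "mic": "XPAR", "currency": "EUR"},
]

by_isin = defaultdict(list)
for rec in listings:
    by_isin[rec["isin"]].append(rec)

for isin, recs in by_isin.items():
    if len(recs) > 1:
        # Keying on ISIN alone would conflate these distinct listings.
        print(f"{isin} resolves to {len(recs)} listings:",
              [(r["mic"], r["currency"]) for r in recs])

def listing_key(rec):
    """Composite key that restores uniqueness at the listing level."""
    return (rec["isin"], rec["mic"], rec["currency"])

assert len({listing_key(r) for r in listings}) == len(listings)
```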

This is not a new argument; many other industry participants have spoken out about proprietary data formats and vendors’ pricing practices in this regard. Johnson added: “Once asset identification has been tackled other fields, such as dates, rates and currencies, could then be addressed to drive out standards that can streamline and simplify the business processes, reduce the industry cost base and increase consistency and accuracy.”
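A minimal sketch of what such field-level standardisation might look like, assuming two invented vendor conventions and mapping both onto decimal rates, ISO 8601 dates and ISO 4217 currency codes:

```python
from datetime import datetime

# Hypothetical vendor records: the same facts in different conventions.
vendor_a = {"coupon_rate": "5.25%", "maturity": "01/07/2030", "ccy": "GBp"}
vendor_b = {"coupon_rate": "0.0525", "maturity": "2030-07-01", "ccy": "GBP"}

def normalise(record):
    """Map assumed vendor conventions onto common standards:
    decimal rates, ISO 8601 dates and ISO 4217 currency codes."""
    rate = record["coupon_rate"]
    rate = float(rate.rstrip("%")) / 100 if rate.endswith("%") else float(rate)

    raw = record["maturity"]
    for fmt in ("%Y-%m-%d", "%d/%m/%Y"):  # try ISO first, then UK day-first
        try:
            maturity = datetime.strptime(raw, fmt).date().isoformat()
            break
        except ValueError:
            continue
    else:
        raise ValueError(f"unrecognised date format: {raw!r}")

    # "GBp" conventionally denotes pence-quoted sterling; relabel as GBP
    # (any price fields would also be scaled by 1/100 at this point).
    ccy = "GBP" if record["ccy"] == "GBp" else record["ccy"]

    return {"coupon_rate": rate, "maturity": maturity, "currency": ccy}

# Both vendors' records converge on one canonical representation.
assert normalise(vendor_a) == normalise(vendor_b)
```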

The utility proposals have achieved something thus far, however, according to Johnson: “The ECB proposal for a utility has helped a great deal over the last two years by galvanising intelligent thought as to how the data standards issue can be addressed using existing technology through mandating, defining and instilling standards through existing systems and providers. The alternative would be to build expensive systems that do not necessarily solve the genuine underlying issues.”

After all, getting the industry’s issues talked about and elaborated upon is half the battle.
