
Engineering Approach to Data Management Challenges of New Regulation is Best, Says HSBC’s Johnson

The deluge of regulatory change facing the data management community is best tackled by adopting an engineering approach, according to Chris Johnson, head of product management, market data services at HSBC Securities Services. Speaking at a recent Thomson Reuters pricing event as a data management representative rather than on behalf of HSBC, Johnson explained that data managers should pre-empt regulatory intervention by examining the real changes required as part of new regulations and planning accordingly.

“Forthcoming regulatory initiatives are best tackled using an ‘engineering’ approach to establish what data changes are needed, if any, so that planning can take place in advance before it is too late to implement the changes properly,” he said. That way, firms can adapt their data systems to meet the requirements, rather than being overwhelmed by the potential repercussions. For example, examining the data management impacts of a regulation such as MiFID indicates that the biggest change required for fund managers is the alteration of a reporting data field.

Johnson added that the recent fines handed out to firms are the result of operational issues rather than fundamentally incorrect reference data. “In my view it is wrong to take a purely binary view that data is either right or wrong, or high or low quality. In practice it is most likely just different. In my experience the number of actual data quality issues from vendors is extremely low, but apparently similar fields can actually serve different business purposes, therefore giving the appearance of being erroneous,” he elaborated.

“I have taken a keen interest in the root causes of fines that have been imposed on banks by regulators for transaction reporting errors,” he continued. “None that I have read about were caused by poor quality data from data vendors; all those instances I have seen were caused instead by operational errors.”

It is therefore rare to find fundamental data quality issues, but the siloed nature of financial institutions’ systems is often the culprit when it comes to regulatory data infractions. Gaps in the overall data architecture, created by workarounds and tactical approaches to new regulatory requirements, exacerbate the problem. By this logic, to stay out of the regulatory spotlight, firms should focus on tackling these operational issues and understanding the required output in terms of data items.

Attendees at the Thomson Reuters event were seemingly in agreement that accuracy of data was paramount going forward, with 60% of respondents to the interactive poll citing this as the most important criterion for judging quality. As for the other answers: completeness was cited by 17%, timeliness by 4%, consistency by 15% and provenance by 4%.

As previously noted by Reference Data Review, regulators are taking a keen interest in data quality, as evidenced by the Office of Financial Research proposals in the US financial services bill and the European Central Bank’s (ECB) reference data utility proposals.

However, many attendees at the event were not hopeful about the potentially positive impact of regulatory intervention in the space. A total of 46% of respondents to the interactive poll said they did not “expect much” from the regulatory changes going on across the globe. The majority, at 53%, were hopeful that regulation would have some impact in the future, but only 1% said it had already had a positive influence.

Johnson thinks regulators should take a different tack than the proposed utilities: “In response to the ECB proposal for a new instrument data utility, the best suggestion, in my view, was made by Angela Knight (the chief executive of the British Bankers’ Association) at an Xtrakter conference in the second quarter of 2009. When asked whether the credit crunch would lead to investment in new data infrastructure, she replied that it would instead make more sense to concentrate on linking up existing databases rather than building new ones.”

He contended that one of the “quick wins” that could help resolve consistency and compatibility issues is unique asset identification. “Neither ISINs nor Sedols are sufficiently granular to act as unique asset IDs; however, both Reuters and Bloomberg support ready-made and fit-for-purpose unique asset identifiers,” he told attendees. “But these cannot be used freely for data linkage purposes purely due to their proprietary vendor licensing restrictions. If these identifiers could be made open to all financial institutions, many of the perceived data quality issues across the industry could disappear very quickly.”
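
Johnson’s granularity point lends itself to a brief sketch: because an ISIN identifies a security rather than a specific listing, linking records across databases typically falls back on a composite key. The Python fragment below is purely illustrative, using hypothetical field names, placeholder records and a made-up ISIN rather than any vendor’s actual schema or licensing arrangement.

# A minimal, purely illustrative sketch of the identifier-granularity problem
# Johnson describes. Field names, records and the ISIN are hypothetical and
# do not reflect any vendor's actual schema.

from dataclasses import dataclass

@dataclass(frozen=True)
class Listing:
    isin: str       # security-level identifier (not listing-specific)
    mic: str        # exchange Market Identifier Code
    currency: str   # trading currency
    price: float

def listing_key(rec: Listing) -> tuple:
    # An ISIN alone cannot distinguish the same security listed on several
    # venues, so linkage falls back on a composite key.
    return (rec.isin, rec.mic, rec.currency)

# Two hypothetical sources holding the same (placeholder) security.
source_a = [Listing("GB00EXAMPLE0", "XLON", "GBP", 512.00),
            Listing("GB00EXAMPLE0", "XETR", "EUR", 6.05)]
source_b = [Listing("GB00EXAMPLE0", "XLON", "GBP", 511.50)]

index_b = {listing_key(r): r for r in source_b}
for rec in source_a:
    counterpart = index_b.get(listing_key(rec))
    print(listing_key(rec), "linked" if counterpart else "no counterpart")

An openly licensed, listing-level identifier of the kind Johnson suggests would remove much of the need for such composite-key workarounds.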

This is not a new argument; many other industry participants have spoken out about proprietary data formats and vendors’ pricing practices in this regard. Johnson added: “Once asset identification has been tackled other fields, such as dates, rates and currencies, could then be addressed to drive out standards that can streamline and simplify the business processes, reduce the industry cost base and increase consistency and accuracy.”
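
What “driving out standards” for those fields might look like in practice can be sketched briefly. The legacy date formats, currency aliases and function names below are assumptions for illustration only, not an actual industry mapping.

# A minimal sketch of normalising dates and currencies onto ISO standards
# (ISO 8601 and ISO 4217). The legacy formats and aliases are assumed.

from datetime import datetime

DATE_FORMATS = ("%d/%m/%Y", "%Y%m%d", "%d-%b-%Y")               # assumed legacy formats
CURRENCY_ALIASES = {"STG": "GBP", "UKL": "GBP", "EURO": "EUR"}  # assumed aliases

def to_iso_date(raw: str) -> str:
    # Try each known legacy format and return an ISO 8601 date string.
    for fmt in DATE_FORMATS:
        try:
            return datetime.strptime(raw, fmt).date().isoformat()
        except ValueError:
            continue
    raise ValueError(f"Unrecognised date format: {raw!r}")

def to_iso_currency(raw: str) -> str:
    # Map vendor- or market-specific aliases onto ISO 4217 codes.
    code = raw.strip().upper()
    return CURRENCY_ALIASES.get(code, code)

print(to_iso_date("31/12/2010"), to_iso_currency("STG"))  # -> 2010-12-31 GBP

Agreeing such mappings once, industry-wide, is the kind of standardisation Johnson argues would reduce the cost base and improve consistency.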

The utility proposals have achieved something thus far, however, according to Johnson: “The ECB proposal for a utility has helped a great deal over the last two years by galvanising intelligent thought as to how the data standards issue can be addressed using existing technology through mandating, defining and instilling standards through existing systems and providers. The alternative would be to build expensive systems that do not necessarily solve the genuine underlying issues.”

After all, getting the industry’s issues talked about and elaborated upon is half the battle.
