About A-Team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Engineering Approach to Data Management Challenges of New Regulation is Best, Says HSBC’s Johnson


The deluge of regulatory change that is facing the data management community is best tackled by adopting an engineering approach, according to Chris Johnson, head of product management, market data services at HSBC Securities Services. Speaking at a recent Thomson Reuters pricing event as a data management representative rather than on behalf of HSBC, Johnson explained that data managers should pre-empt regulatory intervention by examining the real changes that are required as a part of new regulations and plan accordingly.

“Forthcoming regulatory initiatives are best tackled using an ‘engineering’ approach to establish what data changes are needed, if any, so that planning can take place in advance before it is too late to implement the changes properly,” he said. That way, firms can adapt their data systems to meet the requirements, rather than being overwhelmed by the potential repercussions. For example, examining the data management impacts of a regulation such as MiFID indicates that the biggest change required for fund managers is the alteration of a reporting data field.

Johnson added that the recent fines handed out to firms are the result of operational issues rather than of fundamentally incorrect reference data. “In my view it is wrong to take a purely binary view that data is either right or wrong, or high or low quality. In practice it is most likely just different. In my experience the number of actual data quality issues from vendors is extremely low, but apparently similar fields can actually serve different business purposes, therefore giving the appearance of being erroneous,” he elaborated.

“I have taken a keen interest in the root causes of fines that have been imposed on banks by regulators for transaction reporting errors,” he continued. “None that I have read about were caused by poor quality data from data vendors; all those instances I have seen were caused instead by operational errors.”

It is therefore rare to find fundamental data quality issues, but the siloed nature of financial institutions’ systems is often the culprit when it comes to regulatory data infractions. Gaps within the overall data architecture, caused by workarounds and tactical approaches to new regulatory requirements, exacerbate the problem. By this logic, in order to stay out of the regulatory spotlight, firms should focus on tackling these operational issues and understanding the required output in terms of data items.

Attendees at the Thomson Reuters event were seemingly in agreement that accuracy of data is paramount going forward, with 60% of respondents to the interactive poll citing this as the most important criterion for judging quality. As for the other answers: completeness was cited by 17%, timeliness by 4%, consistency by 15% and provenance by 4%.

As previously noted by Reference Data Review, regulators are taking a keen interest in data quality, as evidenced by the Office of Financial Research proposals in the US financial services bill and the European Central Bank’s (ECB) reference data utility proposals.

However, many attendees at the event were not hopeful about the potentially positive impact of regulatory intervention in the space. A total of 46% of respondents to the interactive poll said they did not “expect much” from the regulatory changes going on across the globe. The majority, at 53%, were hopeful that these would have some impact in the future, but only 1% said they had already had a positive influence thus far.

Johnson thinks regulators should take a different tack than the proposed utilities: “In response to the ECB proposal for a new instrument data utility the best suggestion, in my view, was made by Angela Knight (the chair of the British Bankers Association) at an Xtrakter conference in the second quarter of 2009. When asked whether the credit crunch would lead to investment in new data infrastructure, she replied that instead it would make more sense to concentrate on linking up existing databases rather than building new ones.”

He contended that one of the “quick wins” that could help resolve consistency and compatibility issues could be unique asset identification. “Neither ISINs nor SEDOLs are sufficiently granular to act as unique asset IDs; however, both Reuters and Bloomberg support ready-made and fit-for-purpose unique asset identifiers,” he told attendees. “But these cannot be used freely for data linkage purposes purely due to their proprietary vendor licensing restrictions. If these identifiers could be made open to all financial institutions, many of the perceived data quality issues across the industry could disappear very quickly.”

This is not a new argument; many other industry participants have spoken out about proprietary data formats and vendors’ pricing practices in this regard. Johnson added: “Once asset identification has been tackled other fields, such as dates, rates and currencies, could then be addressed to drive out standards that can streamline and simplify the business processes, reduce the industry cost base and increase consistency and accuracy.”

The utility proposals have achieved something thus far, however, according to Johnson: “The ECB proposal for a utility has helped a great deal over the last two years by galvanising intelligent thought as to how the data standards issue can be addressed using existing technology through mandating, defining and instilling standards through existing systems and providers. The alternative would be to build expensive systems that do not necessarily solve the genuine underlying issues.”

After all, getting these industry issues aired and elaborated upon is half the battle.

