The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

DMS Review: Regulation, Data Complexity Seen as Key Drivers in Coming 18 Months

Regulation will be the main challenge for data managers over the next two years, but it won't be the only one: data complexity and volumes are increasing, business users are demanding better data, and unstructured data is becoming part of the big picture. What's more, these challenges must be tackled in an environment of cost constraint.

That was the message delegates heard from a panel of experts discussing key drivers for the data management segment over the next 18 months at the A-Team Data Management Summit (DMS) last week.

Chris Johnson, head of product management for Market Data Services at HSBC Securities Services, set the scene for the panel discussion with a quick review of the challenges and opportunities of key issues in the data management landscape, starting with data utilities and managed services. With regard to these, he noted benefits of data quality, compliance and efficiency, but also the challenges of collaboration between banks, which doesn't come naturally, different levels of readiness among data utilities, and a lack of 'pure' data sources available for all to consume. He also commented on the liability of any utility and questioned whether and how it would be assigned.

Turning to data governance, Johnson said its benefits include oversight, efficiency and a better grasp of risk. In terms of challenges, he pointed to the need to align people and processes, establish standards, and manage content and control. On regulatory data, he said: “Industry investment to achieve complete and accurate reference data is badly needed and could provide benefits including long-term efficiencies, but we are challenged by a high level of regulatory expectation for data quality and a lack of time to comply with regulations.”

With the scene set, Andrew Delaney, editor-in-chief at A-Team Group, stepped up to moderate the panel discussion on the drivers and challenges of data management. Responding to a question from Delaney about recent data management developments, Sally Hinds, director and co-founder of Data Management Consultancy Services, said: “Two years ago, we had no chief data officers and no utilities. Now, we have many and data management is taken seriously.”

Matt Cox, director at DenverPerry, added: “Looking back, there has been progress, but data management is still underdeveloped and there is a lag in our sector of industry. Everyone wants to do more, but they are constrained by cost and progress is slow.”

From a bank’s perspective, Robert Hofstetter, director and head of Securities Markets, Data and Control at Bank J. Safra Sarasin, said: “We face data complexity resulting from regulation, increasing volumes of data and reporting challenges. Over the past couple of years, the main challenge has been systems without architecture. We have built architecture, which has been a significant effort, and we have worked to increase our automation rate.”

Colin Gibson, head of data architecture in the Markets Division of Royal Bank of Scotland, added: “The most hassle we have had is consolidating trade transactions from multiple systems so that everyone can use them. There is demand for a central data solution, but it is a big challenge and is work in progress at the bank.”

Moving on to the issue of regulation, Delaney asked panel participants which regulatory requirements are making a big impact on data management. Hinds mentioned the volume of work and large outreach programmes needed to collect the data necessary to comply with the US Foreign Account Tax Compliance Act (Fatca). Staying with Fatca, Gibson noted the need for a good architecture in time for the sons of Fatca that are expected to emerge in countries outside the US.

Johnson cited Solvency II, the deadline for which is now expected to be January 2016 and which still requires a lot of work. He explained: “The themes and content of Solvency II have similarities and crossovers with other regulations, and there are several new data requirements that are still being resolved. Managing data internally within each organisation can be done well, but external requirements remain outside our control and can only be resolved through active collaboration between financial institutions. For example, we may need a specialist trade association to deal with and nail down data standards for regulatory reporting. Data licensing is also a problem as it is often geared to one point of use. We need to get to common data for regulatory reporting that is available without restriction to all relevant parties. Until then, we will be shackled in what can be achieved.”

Considering whether there is a silver bullet in terms of one set of requirements to deliver successful data management, the panellists agreed that the basic need is to understand data, agree what it will be used for and build controls around the use case. Here, Johnson gave the example of the commercial importance of high-quality reference data in the fields used for capital calculations under Solvency II: if data quality is poor, an insurance firm could end up with an erroneous capital charge.

The challenge of unstructured data may be small, but it is growing. Hinds said there are tools in the market to pool data, while Gibson noted the need to tag unstructured data in a similar way to other data so that it can be used for multiple purposes in downstream systems.

Posing a final question, Delaney asked the panel to identify additional data management challenges and to explain how they are being addressed. Among the worries mentioned were data quality, complexity, transparency, timeliness, efficiency, automation, investment, transition from information management to business intelligence, and the need to get basics such as counterparty and product identifiers right. No single solution can allay these worries, but panellists agreed that understanding data ownership is a good start in terms of responding to business needs, identifying decision makers, seeking investment when required and communicating with senior management.

On cost, Gibson concluded: “This is a different business compared to two years ago. We can no longer afford the technology estate we built up in the good times. Change could be an opportunity to make data quality and management better.”
