The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

DMS Review: Regulation, Data Complexity Seen as Key Drivers in Coming 18 Months

Regulation will be the main challenge for data managers over the next two years, but it won’t be the only challenge as data complexity and volumes increase, business users demand better data, and unstructured data becomes part of the big picture. Still more, these challenges must be tackled in an environment of cost constraint.

That was the message delegates heard from a panel of experts discussing key drivers for the data management segment over the next 18 months at the A-Team Data Management Summit (DMS) last week.

Chris Johnson, head of product management for Market Data Services at HSBC Securities Services, set the scene for the panel discussion with a quick review of the challenges and opportunities presented by key issues in the data management landscape, starting with data utilities and managed services. Here, he noted the benefits of data quality, compliance and efficiency, but also the challenges of collaboration between banks, which doesn’t come naturally, varying levels of readiness among data utilities, and a lack of ‘pure’ data sources available for all to consume. He also commented on the liability of any utility and questioned whether and how it would be assigned.

Turning to data governance, Johnson said its benefits include oversight, efficiency and a better grasp of risk. In terms of challenges, he pointed to the need to align people and processes, establish standards, and manage content and control. On regulatory data, he said: “Industry investment to achieve complete and accurate reference data is badly needed and could provide benefits including long-term efficiencies, but we are challenged by a high level of regulatory expectation for data quality and a lack of time to comply with regulations.”

With the scene set, Andrew Delaney, editor-in-chief at A-Team Group, stepped up to moderate a panel discussion on the drivers and challenges of data management. Responding to a question from Delaney about recent data management developments, Sally Hinds, director and co-founder of Data Management Consultancy Services, said: “Two years ago, we had no chief data officers and no utilities. Now, we have many and data management is taken seriously.”

Matt Cox, director at DenverPerry, added: “Looking back, there has been progress, but data management is still underdeveloped and there is a lag in our sector of industry. Everyone wants to do more, but they are constrained by cost and progress is slow.”

From a bank’s perspective, Robert Hofstetter, director and head of Securities Markets, Data and Control at Bank J. Safra Sarasin, said: “We face data complexity resulting from regulation, increasing volumes of data and reporting challenges. Over the past couple of years, the main challenge has been systems without architecture. We have built architecture, which has been a significant effort, and we have worked to increase our automation rate.”

Colin Gibson, head of data architecture in the Markets Division of Royal Bank of Scotland, added: “The most hassle we have had is consolidating trade transactions from multiple systems so that everyone can use them. There is demand for a central data solution, but it is a big challenge and is work in progress at the bank.”

Moving on to the issue of regulation, Delaney asked panel participants which regulatory requirements are making a big impact on data management. Hinds mentioned the volume of work and large outreach programmes needed to collect the data necessary to achieve compliance with the US Foreign Account Tax Compliance Act (Fatca). Staying with Fatca, Gibson noted the need for a good architecture in time for the ‘sons of Fatca’ that are expected to emerge in countries outside the US.

Johnson cited Solvency II, the deadline for which is now expected to be January 2016 and which still requires a lot of work. He explained: “The themes and content of Solvency II have similarities and crossovers with other regulations, and there are several new data requirements that are still being resolved. Managing data internally within each organisation can be done well, but external requirements remain outside our control and can only be resolved through active collaboration between financial institutions. For example, we may need a specialist trade association to deal with and nail down data standards for regulatory reporting. Data licensing is also a problem as it is often geared to one point of use. We need to get to common data for regulatory reporting that is available without restriction to all relevant parties. Until then, we will be shackled in what can be achieved.”

Considering whether there is a silver bullet in terms of one set of requirements to deliver successful data management, the panellists agreed that the basic need is to understand data, agree what it will be used for and build controls around the use case. Here, Johnson cited the example of the commercial importance of high-quality reference data in the fields used for capital calculations under Solvency II: if data quality is poor, an insurance firm could end up with an erroneous capital charge.

The challenge of unstructured data may be small, but it is growing. Hinds said there are tools in the market to pool data, while Gibson noted the need to tag unstructured data in a similar way to other data so that it can be used for multiple purposes in downstream systems.

Posing a final question, Delaney asked the panel to identify additional data management challenges and to explain how they are being addressed. Among the worries mentioned were data quality, complexity, transparency, timeliness, efficiency, automation, investment, transition from information management to business intelligence, and the need to get basics such as counterparty and product identifiers right. No single solution can allay these worries, but panellists agreed that understanding data ownership is a good start in terms of responding to business needs, identifying decision makers, seeking investment when required and communicating with senior management.

On cost, Gibson concluded: “This is a different business compared to two years ago. We can no longer afford the technology estate we built up in the good times. Change could be an opportunity to make data quality and management better.”
