
Paying Lip Service to Data Quality Is on Its Way Out, Says Deutsche Bank’s Fletcher


The days of paying mere lip service to the goal of improving data quality and living with “spaghetti-like” data infrastructures are soon to be gone, Deutsche Bank’s head of reference data services, Neil Fletcher, told attendees at last week’s Thomson Reuters event. His own firm is now much more aware of the importance of data quality at a senior level and has therefore been compelled to embark on a long-term project to ensure that all of its downstream systems get data from the same logical place on a global basis, he explained.

To help it understand “how all of the data fits together”, the bank is investing in a data virtualisation layer, which will also shield end users from the “chaos” of integration, he said. Deutsche Bank is beginning the project by focusing on its reference data, but will later tackle its transactional, positional and other data types. For each data category, the bank will assign ownership across all systems in the form of data stewards, whose job will be to police data quality.
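In general terms, a data virtualisation layer presents consumers with a single logical view while hiding the physical sources behind it. As a rough sketch only (the article gives no detail of Deutsche Bank’s implementation, and every name below is invented), the pattern looks something like this:

```python
from __future__ import annotations

from dataclasses import dataclass
from typing import Callable, Optional

# Illustrative sketch of the general data virtualisation pattern, not
# Deutsche Bank's actual system. Downstream consumers query one logical
# facade; the physical "spaghetti" systems sit behind registered adapters.

@dataclass
class InstrumentRecord:
    isin: str
    name: str
    currency: str
    source: str  # which physical system supplied the record

class VirtualisationLayer:
    """Single logical access point over many physical reference data stores."""

    def __init__(self) -> None:
        # adapter name -> lookup function into that physical source
        self._adapters: dict[str, Callable[[str], Optional[InstrumentRecord]]] = {}

    def register_source(self, name: str,
                        lookup: Callable[[str], Optional[InstrumentRecord]]) -> None:
        self._adapters[name] = lookup

    def get_instrument(self, isin: str) -> InstrumentRecord:
        # Consumers never touch the underlying systems directly; the layer
        # walks its registered sources and returns the first hit.
        for lookup in self._adapters.values():
            record = lookup(isin)
            if record is not None:
                return record
        raise KeyError(f"no registered source holds instrument {isin}")

# Hypothetical usage with an invented in-memory "legacy" store:
legacy = {"DE0005140008": InstrumentRecord("DE0005140008", "Deutsche Bank AG",
                                           "EUR", "legacy")}
layer = VirtualisationLayer()
layer.register_source("legacy", legacy.get)
print(layer.get_instrument("DE0005140008"))
```

The point of such a facade is that the “chaos” of integration lives inside the adapters rather than in the downstream consumers, which only ever see the single logical view.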

Fletcher, who has previously worked for Goldman Sachs and Citi, said that until now the industry has tended to be reactive to data management challenges. “There may have been a degree of lip service paid to data management but it was not considered a true corporate asset,” he said.

This has all changed as a result of the financial crisis, however, and rather than dealing with data on an ad hoc basis, firms are now taking a more strategic approach. The driver is not directly ROI or lower costs either, although these remain important factors, according to Fletcher. A more holistic approach centres on improving data quality for the good of the firm as a whole, he said. “Business processes and risk management concerns are driving the change in the data mindset,” he added.

Fletcher pointed to the events following the collapse of Lehman Brothers and the struggle to get important counterparty data from the “spaghetti” of data systems as an example of why change is needed. “We got sponsorship from senior management because they now see data as a corporate asset,” he said.

It is hoped that Deutsche’s new system will enable real-time transformation of data from the centralised hub into whatever format is needed downstream. The virtualisation process and an enterprise data model should enable this, he contended.
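In practice, transforming a canonical hub record into “whatever format is needed downstream” usually means pluggable per-consumer formatters over a shared data model. The sketch below is illustrative only, with hypothetical field names and downstream systems:

```python
import json
from typing import Any, Callable

# One canonical record held in the hub, rendered on demand into the
# formats that invented downstream systems might expect.

CANONICAL_RECORD: dict[str, Any] = {
    "isin": "DE0005140008",
    "issuer": "Deutsche Bank AG",
    "currency": "EUR",
}

def to_risk_feed(record: dict[str, Any]) -> str:
    # e.g. a pipe-delimited line for a legacy risk engine
    return "|".join([record["isin"], record["currency"]])

def to_settlement_msg(record: dict[str, Any]) -> str:
    # e.g. a JSON message for a newer settlement platform
    return json.dumps({"id": record["isin"], "issuerName": record["issuer"]})

FORMATTERS: dict[str, Callable[[dict[str, Any]], str]] = {
    "risk": to_risk_feed,
    "settlement": to_settlement_msg,
}

for system, fmt in FORMATTERS.items():
    print(f"{system}: {fmt(CANONICAL_RECORD)}")
```

Keeping the transformations at the edge, rather than storing per-system copies, is what allows every consumer to draw from the same logical place while still receiving its own format.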

Fletcher certainly has high hopes for the firm’s project, which seems to be in a similar vein to the large-scale data warehousing projects of old, but tackled in a phased manner. The structure is that of an internal data cloud that sits separately from the downstream systems yet feeds into them. He noted that the firm would also contemplate using an external cloud in the future, but would be very cautious about the data stored in such a structure. Cloud computing has been the cause of some debate in the past with regard to reference data, and this implementation could potentially serve as something of a proving ground for its wider adoption.

He also reckons the climate for data management investment will persist for some time to come: “I think data quality will remain a driver for investment for the next two to three years.” Fletcher noted that his firm is unlikely to need a chief data officer (CDO) any time soon, as it is at the start of its data management journey, but that it might re-evaluate this further down the line.

