The knowledge platform for the financial technology industry

A-Team Insight Blogs

Paying Lip Service to Data Quality is on its Way Out, Says Deutsche Bank’s Fletcher


The days of paying mere lip service to the goal of improving data quality and living with “spaghetti-like” data infrastructures are soon to be gone, Deutsche Bank’s head of reference data services, Neil Fletcher, told attendees at last week’s Thomson Reuters event. His own firm is now much more aware of the importance of data quality at a senior level and has therefore embarked on a long-term project to ensure that all of its downstream systems get data from the same logical place on a global basis, he explained.

To help it understand “how all of the data fits together”, the bank is investing in a data virtualisation layer, which will also shield end users from the “chaos” of integration, he said. Deutsche Bank is beginning the project by focusing on its reference data, but will later tackle its transactional, positional and other data types. For each data category, the bank will assign ownership across all systems in the form of data stewards charged with policing data quality.
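The pattern Fletcher describes can be illustrated with a minimal sketch: a single logical access point per data category, with a named steward recorded for each. All class and field names here are hypothetical illustrations of the general approach, not Deutsche Bank’s actual design.

```python
# Hypothetical sketch of a virtualisation layer: downstream consumers
# query one logical place per data category, never the underlying
# "spaghetti" of source systems. Names are illustrative only.
from dataclasses import dataclass, field

@dataclass
class DataCategory:
    name: str          # e.g. "reference", "transactional", "positional"
    steward: str       # owner responsible for policing data quality
    sources: dict = field(default_factory=dict)  # source system -> records

class VirtualisationLayer:
    """Single logical access point that hides which source system
    a record actually came from."""
    def __init__(self):
        self._categories = {}

    def register(self, category: DataCategory):
        self._categories[category.name] = category

    def get(self, category_name: str, key: str):
        # Consumers see one interface, regardless of the source system.
        category = self._categories[category_name]
        for records in category.sources.values():
            if key in records:
                return records[key]
        raise KeyError(key)

layer = VirtualisationLayer()
ref = DataCategory(name="reference", steward="ref-data-stewards")
ref.sources["legacy_system_a"] = {"ISIN:DE0005140008": {"issuer": "Deutsche Bank AG"}}
layer.register(ref)
print(layer.get("reference", "ISIN:DE0005140008")["issuer"])  # Deutsche Bank AG
```

The steward attached to each category gives quality issues a clear owner, which is the governance point Fletcher emphasises.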

Fletcher, who has previously worked for Goldman Sachs and Citi, said that until now the industry has tended to be reactive to data management challenges. “There may have been a degree of lip service paid to data management but it was not considered a true corporate asset,” he said.

This has all changed as a result of the financial crisis: rather than dealing with data on an ad hoc basis, firms are now taking a more strategic approach. The driver is not directly ROI or lower costs, although these remain important factors, according to Fletcher. A more holistic approach centres on improving data quality for the good of the firm as a whole, he said. “Business processes and risk management concerns are driving the change in the data mindset,” he said.

Fletcher pointed to the events following the collapse of Lehman Brothers and the struggle to get important counterparty data from the “spaghetti” of data systems as an example of why change is needed. “We got sponsorship from senior management because they now see data as a corporate asset,” he said.

It is hoped that Deutsche’s new system will enable real-time transformation of data from the centralised hub into whatever format is needed downstream. The virtualisation process and an enterprise data model should make this possible, he contended.
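The hub-and-transform idea can be sketched in a few lines: records live in one canonical shape defined by the enterprise data model, and each downstream system registers a transform that reshapes a record on demand. The field names and output formats below are assumptions for illustration, not the bank’s actual schema.

```python
# Illustrative sketch only: one canonical record in the central hub,
# transformed at read time into each consumer's expected layout.
# Field names and target formats are hypothetical.

CANONICAL = {"id": "ISIN:DE0005140008", "issuer": "Deutsche Bank AG", "currency": "EUR"}

# Each downstream system registers a transform from the canonical
# enterprise model into its own format.
TRANSFORMS = {
    "risk_engine": lambda r: {"instrument": r["id"], "ccy": r["currency"]},
    "settlement": lambda r: f'{r["id"]}|{r["issuer"]}|{r["currency"]}',
}

def deliver(record: dict, consumer: str):
    """Transform the canonical record into the consumer's format on demand."""
    return TRANSFORMS[consumer](record)

print(deliver(CANONICAL, "risk_engine"))  # dict shaped for the risk engine
print(deliver(CANONICAL, "settlement"))   # pipe-delimited string for settlement
```

Because every consumer starts from the same canonical record, a quality fix made once in the hub propagates to all downstream formats, which is the benefit of feeding every system from “the same logical place”.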

Fletcher certainly has high hopes for the firm’s own project, which seems to be in a similar vein to the large-scale data warehousing projects of old, but tackled in a phased manner. The structure is that of an internal data cloud that sits separately from the downstream systems yet feeds into them. He noted that the firm would also contemplate using an external cloud in the future, but would be very cautious about what data is stored in such a structure. Cloud computing has been the subject of some debate in the past with regard to reference data, and this implementation could serve as something of a proving ground for its wider adoption.

He also reckons the climate for data management investment will persist for some time to come: “I think data quality will remain a driver for investment for the next two to three years.” Fletcher also noted that his firm would not likely need a chief data officer (CDO) any time soon, as it is at the start of the data management journey, but might re-evaluate this further down the line.
