A-Team Insight Blogs

Paying Lip Service to Data Quality is on its Way Out, Says Deutsche Bank’s Fletcher

The days of paying mere lip service to improving data quality and living with “spaghetti-like” data infrastructures are numbered, Deutsche Bank’s head of reference data services, Neil Fletcher, told attendees at last week’s Thomson Reuters event. His own firm is now much more aware of the importance of data quality at a senior level and has embarked on a long-term project to ensure that all of its downstream systems receive data from the same logical place on a global basis, he explained.

To help it understand “how all of the data fits together”, the bank is investing in a data virtualisation layer, which will also shield end users from the “chaos” of integration, he said. Deutsche Bank is beginning the project by focusing on its reference data, but will later tackle its transactional and positional data and other data types. For each data category, the bank will assign ownership across all systems by appointing data stewards to police data quality.

Fletcher, who has previously worked for Goldman Sachs and Citi, said that until now the industry has tended to be reactive to data management challenges. “There may have been a degree of lip service paid to data management but it was not considered a true corporate asset,” he said.

This has all changed, however, as a result of the financial crisis: rather than dealing with data on an ad hoc basis, firms are now taking a more strategic approach. The driver is not primarily ROI or cost reduction, although these remain important factors, according to Fletcher. Instead, the more holistic approach centres on improving data quality for the good of the firm as a whole. “Business processes and risk management concerns are driving the change in the data mindset,” he said.

Fletcher pointed to the events following the collapse of Lehman Brothers and the struggle to get important counterparty data from the “spaghetti” of data systems as an example of why change is needed. “We got sponsorship from senior management because they now see data as a corporate asset,” he said.

It is hoped that Deutsche’s new system will enable real-time transformation of data from the centralised hub into whatever format is needed downstream. The virtualisation process and an enterprise data model should enable this, he contended.

Fletcher certainly has high hopes for the firm’s project, which seems to be in a similar vein to the large-scale data warehousing projects of old, but tackled in a phased manner. The structure is that of an internal data cloud that sits separately from the downstream systems yet feeds into them. He noted that the firm would also contemplate using an external cloud in future, but would be very cautious about what data is stored in such a structure. Cloud computing has been the subject of some debate in the past with regard to reference data, and this implementation could potentially serve as a proving ground for its wider adoption.

He also reckons the climate for data management investment will persist for some time to come: “I think data quality will remain a driver for investment for the next two to three years.” Fletcher also noted that his firm would not likely need a chief data officer (CDO) any time soon, as it is at the start of the data management journey, but might re-evaluate this further down the line.