The knowledge platform for the financial technology industry

A-Team Insight Blogs

Paying Lip Service to Data Quality is on its Way Out, Says Deutsche Bank’s Fletcher


The days of paying mere lip service to the goal of improving data quality and living with “spaghetti-like” data infrastructures are soon to be gone, said Deutsche Bank’s head of reference data services, Neil Fletcher, to those in attendance at last week’s Thomson Reuters event. His own firm is now much more aware of the importance of data quality at a senior level and has therefore embarked on a long-term project to ensure that all of its downstream systems get data from the same logical place on a global basis, he explained.

To help it understand “how all of the data fits together”, the bank is investing in a data virtualisation layer, which will also shield end users from the “chaos” of integration, he said. Deutsche Bank is beginning the project by focusing on its reference data, but will later tackle its transactional and positional data and other data types. For each data category, the bank will assign ownership across all systems in the form of data stewards, who will police data quality.

Fletcher, who has previously worked for Goldman Sachs and Citi, said that up until now the industry has tended to be reactive to data management challenges. “There may have been a degree of lip service paid to data management but it was not considered a true corporate asset,” he said.

This has all changed, however, as a result of the financial crisis: rather than dealing with data on an ad hoc basis, firms are now taking a more strategic approach. The driver is no longer ROI or cost reduction directly, although these remain important factors, according to Fletcher. A more holistic approach centres specifically on improving data quality for the good of the firm, he said. “Business processes and risk management concerns are driving the change in the data mindset,” he said.

Fletcher pointed to the events following the collapse of Lehman Brothers and the struggle to get important counterparty data from the “spaghetti” of data systems as an example of why change is needed. “We got sponsorship from senior management because they now see data as a corporate asset,” he said.

It is hoped that Deutsche’s new system will enable real-time transformation of data from the centralised hub into whatever format is needed downstream. The virtualisation process and an enterprise data model should enable this, he contended.
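The hub-and-transform pattern Fletcher describes can be sketched in a few lines. The code below is purely illustrative and makes no claim about Deutsche Bank’s actual implementation: all class, consumer, and field names are hypothetical. It shows the core idea of one canonical record held in a central hub, with per-consumer transforms applied at read time so each downstream system receives its own format.

```python
# Illustrative sketch only: one canonical record in a central hub,
# transformed on demand into each downstream system's format.
# All names (VirtualisationLayer, "risk", "settlement", field names)
# are hypothetical, not drawn from the article.

from typing import Any, Callable, Dict
import json

class VirtualisationLayer:
    """Central hub: one canonical record, many downstream views."""

    def __init__(self) -> None:
        self._records: Dict[str, Dict[str, Any]] = {}
        self._transformers: Dict[str, Callable[[Dict[str, Any]], str]] = {}

    def publish(self, record_id: str, record: Dict[str, Any]) -> None:
        # The "same logical place" every downstream system reads from.
        self._records[record_id] = record

    def register_consumer(self, name: str,
                          transform: Callable[[Dict[str, Any]], str]) -> None:
        # Each downstream system supplies its own format transform.
        self._transformers[name] = transform

    def fetch(self, consumer: str, record_id: str) -> str:
        # Transformation happens at read time, so consumers never see
        # the integration "chaos" behind the hub.
        return self._transformers[consumer](self._records[record_id])

hub = VirtualisationLayer()
hub.publish("ISIN:DE0005140008", {"name": "Deutsche Bank AG", "ccy": "EUR"})
hub.register_consumer("risk", lambda r: json.dumps(r, sort_keys=True))
hub.register_consumer("settlement", lambda r: f"{r['name']}|{r['ccy']}")

print(hub.fetch("settlement", "ISIN:DE0005140008"))  # Deutsche Bank AG|EUR
```

The design choice to keep transforms at the edges, rather than storing one copy per format, is what distinguishes virtualisation from the old-style data warehouse copies the article contrasts it with.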

Fletcher certainly has high hopes for the firm’s own project, which seems to be in a similar vein to the large-scale data warehousing projects of old but tackled in a phased manner. The structure is that of an internal data cloud that sits separately from the downstream systems yet feeds into them. He noted that the firm would also contemplate using an external cloud in the future, but would be very cautious about the data stored in such a structure. Cloud computing has been the cause of some debate in the past with regard to reference data, and this implementation could potentially serve as something of a proving ground for its wider adoption.

He also reckons the climate for data management investment will persist for some time to come: “I think data quality will remain a driver for investment for the next two to three years.” Fletcher also noted that his firm would not likely need a chief data officer (CDO) any time soon, as it is at the start of the data management journey, but might re-evaluate this further down the line.

