The days of paying mere lip service to the goal of improving data quality and living with “spaghetti-like” data infrastructures are soon to be gone, said Deutsche Bank’s head of reference data services, Neil Fletcher, at last week’s Thomson Reuters event. His own firm is now much more aware of the importance of data quality at a senior level and has therefore been compelled to embark on a long-term project to ensure that all of its downstream systems get data from the same logical place on a global basis, he explained.
To help it understand “how all of the data fits together”, the bank is investing in a data virtualisation layer, which will also shield end users from the “chaos” of integration, he said. Deutsche Bank is starting the project with its reference data, but will later tackle its transactional and positional data and other data types. For each data category, the bank will assign ownership across all systems in the form of data stewards to police data quality.
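Fletcher did not describe the implementation, but the basic idea behind such a virtualisation layer can be sketched: downstream consumers query one logical interface, which resolves each request against whichever source system actually holds the data, so users never deal with the underlying integration. The Python snippet below is a minimal, hypothetical illustration only; the class names, adapters and steward mapping are assumptions, not Deutsche Bank’s design.

```python
# Hypothetical sketch of a data virtualisation layer: downstream systems
# query one logical interface and never see the underlying source systems.

class SourceAdapter:
    """Wraps one physical source system (names are illustrative)."""
    def __init__(self, name, records):
        self.name = name
        self._records = records  # e.g. {"LEI123": {"legal_name": "..."}}

    def lookup(self, key):
        return self._records.get(key)


class VirtualisationLayer:
    """Presents a single logical view; hides which source satisfied the query."""
    def __init__(self, adapters, stewards):
        self.adapters = adapters    # source systems, ordered by preference
        self.stewards = stewards    # data category -> responsible data steward

    def get_reference_data(self, key):
        for adapter in self.adapters:
            record = adapter.lookup(key)
            if record is not None:
                return record       # the consumer never learns the source
        raise KeyError(f"{key} not found in any source system")


# Usage: two legacy sources, one logical access point.
layer = VirtualisationLayer(
    adapters=[
        SourceAdapter("legacy_emea", {"LEI123": {"legal_name": "Acme GmbH"}}),
        SourceAdapter("legacy_apac", {"LEI456": {"legal_name": "Acme KK"}}),
    ],
    stewards={"counterparty": "reference.data.steward"},
)
print(layer.get_reference_data("LEI456"))
```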
Fletcher, who has previously worked at Goldman Sachs and Citi, said that up until now the industry has tended to be reactive to data management challenges. “There may have been a degree of lip service paid to data management but it was not considered a true corporate asset,” he said.
This has all changed as a result of the financial crisis, however, and rather than dealing with data on an ad hoc basis, firms are now taking a more strategic approach. The driver is not directly ROI or lower costs, although these remain important factors, according to Fletcher. The more holistic approach centres instead on improving data quality for the good of the firm, he said. “Business processes and risk management concerns are driving the change in the data mindset,” he said.
Fletcher pointed to the events following the collapse of Lehman Brothers and the struggle to get important counterparty data from the “spaghetti” of data systems as an example of why change is needed. “We got sponsorship from senior management because they now see data as a corporate asset,” he said.
It is hoped that Deutsche’s new system will enable real-time transformation of data from the centralised hub into whatever format is needed downstream. The virtualisation process and an enterprise data model should make this possible, he contended.
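Neither the enterprise data model nor the transformation mechanism is specified in the article, but the principle described, holding one canonical record centrally and deriving consumer-specific formats on demand, can be illustrated roughly as follows. All field names and output formats here are assumptions for illustration only.

```python
# Hypothetical sketch: a canonical (enterprise-model) record held in the hub
# is transformed at read time into whatever shape a downstream consumer expects.
import csv
import io
import json

# Canonical counterparty record, as an enterprise data model might define it.
canonical = {
    "lei": "529900EXAMPLELEI0000",
    "legal_name": "Example Bank AG",
    "country": "DE",
}

def to_risk_feed(record):
    """A risk system wanting a flat CSV row (illustrative format)."""
    buf = io.StringIO()
    csv.writer(buf).writerow([record["lei"], record["legal_name"], record["country"]])
    return buf.getvalue().strip()

def to_settlement_feed(record):
    """A settlement system wanting JSON with its own field names (illustrative)."""
    return json.dumps({"counterpartyId": record["lei"], "name": record["legal_name"]})

# Each downstream consumer registers the transformation it needs; the hub
# applies it on request rather than copying the data into each silo.
transformers = {"risk": to_risk_feed, "settlement": to_settlement_feed}
for consumer, transform in transformers.items():
    print(consumer, "->", transform(canonical))
```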
Fletcher certainly has high hopes for the firm’s own project, which seems to be in a similar vein to the large-scale data warehousing projects of old, but tackled in a phased manner. The structure is that of an internal data cloud that sits separately from the downstream systems yet feeds into them. He noted that the firm would also contemplate using an external cloud in the future, but would be very cautious about the data stored in such a structure. Cloud computing has been the cause of some debate in the past with regard to reference data, and this implementation could potentially serve as something of a proving ground for its wider adoption.
He also reckons the climate for data management investment will persist for some time to come: “I think data quality will remain a driver for investment for the next two to three years.” Fletcher noted that his firm is unlikely to need a chief data officer (CDO) any time soon, as it is at the start of its data management journey, but might re-evaluate this further down the line.