Deutsche Bank expects to complete its wide-ranging DB Reference Data Programme – launched in 2002 and encompassing a single model for client, instrument and organizational data – by the first quarter of next year. In developing its plans for the project, Deutsche took care to acquire management buy-in, establishing a Corporate Reference Data Governance Forum to define a global strategy for reference data and to ensure that the strategy is implemented across the bank.
According to Neil Innes, of Deutsche’s global technology and operations group, the DB Reference Data Programme was aimed at enhancing data quality, optimizing investments in reference data projects and technology, eradicating duplication of sources, moving the organization to a single ‘golden source’ of core reference data, and ensuring that this golden copy was adopted by downstream applications.
Along the way, Innes and his team reached a set of valuable conclusions that can serve as a useful guide to enterprisewide reference data projects. Innes was speaking at Osney Media’s Financial Information Management conference a few weeks ago.
Deutsche reached the conclusion at a senior level in mid-2002 that it had issues with reference data, particularly with client and instrument data. Specifically, the bank’s approach to reference data heretofore had resulted in duplicate processes for handling the same data, duplicate investment in reference data processing and associated infrastructure, and the introduction of a variety of technology solutions specific to each business.
This had created an environment in which the same data was stored in many different locations but in incompatible or inconsistent formats. As such, the data could not be easily maintained or interrogated, leading to substantial reconciliation and conversion efforts internally. The bank concluded that this inconsistent approach to reference data could impact customer service levels, particularly for those Deutsche clients dealing with multiple business lines.
Recognizing the strategic importance of the issues at hand, Deutsche established the Corporate Reference Data Governance Forum in September 2002. The forum was charged with defining the new reference data strategy and ensuring it got implemented. The bank also acknowledged at the time that there were lessons to be learned from previous reference data programmes, and applied a number of key components to its strategy based on this prior experience.
Chief among these components was a robust reference data governance model. The forum developed a strong top-down mission statement and established review bodies for any new investment in applications, to ensure that every project was aligned with the use of the reference data programme's golden sources.
The strategy sought to ensure that, as well as improving data quality, each programme initiative improved the business process involved. Other components of the strategy included a decision to leverage established applications where possible, and to move responsibility for data from IT and compliance to the individual business units, so that responsibility for, say, credit data was placed with credit business units.
Innes said the project to establish a golden source of client data was the most interesting of the programme. The bank uses a mix of internal and external data to aid in 'know your client' (KYC) efforts. Deutsche's assessment was that its current approach risked substantial duplication of this data and of the infrastructure used to manage it. Further, it saw that rapidly growing, regionally driven regulatory and compliance pressures demanded rigorous data quality and robust processes for client data.
Its approach was to develop a globally consistent, extensible workflow process for new client adoption. This would entail the creation of a single mechanism for client adoption within the bank, focusing the responsibility on this ‘gatekeeper’ and ensuring no client adoption function existed in downstream systems. It would also entail establishing a federated review of client data, ensuring that the people with the greatest interest in ensuring high levels of data quality control the data.
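The gatekeeper model described above can be sketched in code. This is a minimal, hypothetical illustration only; all names (ClientGatekeeper, the reviewer checks, and so on) are assumptions for the sketch, not Deutsche Bank's actual implementation. The key ideas from the article are that client adoption happens through a single entry point, and that a federated set of reviewers, the parties with the greatest stake in data quality, must approve a record before it enters the golden source.

```python
# Hypothetical sketch of a single 'gatekeeper' client-adoption workflow.
# Names and structure are illustrative assumptions, not the bank's system.

class ClientGatekeeper:
    """Sole entry point for client adoption. Downstream systems never
    create client records themselves; they read from the golden source."""

    def __init__(self, reviewers):
        # Federated review: each interested party (compliance, credit,
        # legal, ...) controls the checks it cares most about.
        self.reviewers = reviewers
        self.golden_source = {}  # client_id -> approved record

    def adopt(self, client_id, record):
        # Every reviewer must approve before the client is adopted.
        rejections = [name for name, check in self.reviewers.items()
                      if not check(record)]
        if rejections:
            return {"adopted": False, "rejected_by": rejections}
        self.golden_source[client_id] = record
        return {"adopted": True, "rejected_by": []}


# Example federated reviewers (purely illustrative checks).
reviewers = {
    "compliance": lambda r: r.get("kyc_complete", False),
    "credit": lambda r: r.get("credit_rating") is not None,
}
gate = ClientGatekeeper(reviewers)
result = gate.adopt("C001", {"kyc_complete": True, "credit_rating": "A"})
```

In this sketch, a client that fails any reviewer's check is rejected with the names of the objecting parties, which mirrors the article's point that control sits with the people most interested in data quality rather than with downstream applications.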
Using this process, the bank created what became the ‘golden source’ for client data. Any regional variations to that golden source would be considered only where driven by regulatory requirements and expressly not where driven by historic practice or preference. Innes said his team traveled widely to install these processes and practices, often having to deal with local political situations, but mindful of the need not to damage local business advantage.
Again, buy-in was secured by delivering business service improvements alongside the programme. For client data, this involved demonstrating that the project would improve the bank's view of who its clients are, provide timely execution of compliance, credit, legal and regulatory controls for client adoption, and ensure compliance with KYC and Anti-Money Laundering (AML) regulations.
Within the instrument stream of the project, the golden source for instrument data was derived from data supplied by third-party vendors. Deutsche’s aim was to reduce direct data costs, infrastructure costs and manual data set-up and cleansing costs as well as eliminating duplication. The bank’s approach was to leverage an internal application that held 90% of its required instrument data. The programme’s main thrust was to build a clear governance and communication structure for using that platform, and then expanding coverage to fill in the gaps.
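The internal-platform-first approach for instrument data can be sketched as a simple resolver. This is a hypothetical illustration, assuming a key-value view of both the internal platform and the vendor feed; the function and data names are invented for the sketch. The logic reflects the article: prefer the internal application that already holds most of the required data, and fall back to third-party vendor data only to fill coverage gaps.

```python
# Hypothetical sketch of instrument lookup that prefers the internal
# golden platform and uses vendor data only for coverage gaps.

def make_instrument_resolver(internal_store, vendor_feed):
    def resolve(identifier):
        # The internal platform already holds the bulk (~90% in the
        # article) of required instrument data, so check it first.
        if identifier in internal_store:
            return internal_store[identifier]
        # Fill gaps from a third-party vendor, then cache the record
        # internally so there is a single maintained copy going forward.
        record = vendor_feed.get(identifier)
        if record is not None:
            internal_store[identifier] = record
        return record
    return resolve


# Illustrative data: one internally held instrument, one vendor-only.
internal = {"DE0005140008": {"name": "Deutsche Bank AG"}}
vendor = {"US0378331005": {"name": "Apple Inc"}}
resolve = make_instrument_resolver(internal, vendor)
```

Caching the vendor record into the internal store on first use is one way to "expand coverage to fill in the gaps" while avoiding the duplicate sourcing costs the programme set out to eliminate.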
For the organization aspect of the programme, the task was to enhance an existing central repository for organizational and structural data, and ensure that operations were aware of it bankwide. This related mainly to data for internal reporting purposes.
With some months left to run on the project, Innes believes the bank has learned a number of strategic lessons. Innes described the governance aspect of the project as a “continual balancing act.” He said project members spent perhaps 60% of their time communicating the message.
The bank also learned that the golden source approach "does not mean a single depository of data but that multiple depositories of data can make up the golden data required," Innes said. The bank also found that the three streams of the project, though they appeared to be individual initiatives, converged and increasingly came to be seen as a single 'reference programme.' Finally, Innes said, there was the realization that such projects are truly multi-year programmes.
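Innes' point that a golden source need not be a single physical store can be sketched as a facade over several repositories. This is a hypothetical illustration; the class and domain names are assumptions for the sketch. Consumers see one logical golden source, while the data itself lives in separate client, instrument and organizational repositories.

```python
# Hypothetical sketch: one logical 'golden source' composed of multiple
# physical repositories, one per reference data domain.

class GoldenSource:
    def __init__(self, repositories):
        # repositories: mapping of data domain -> repository
        # (plain dicts here; real stores would sit behind the same shape).
        self.repositories = repositories

    def get(self, domain, key):
        # Consumers address one logical golden source; routing to the
        # right physical repository is an internal concern.
        repo = self.repositories.get(domain)
        if repo is None:
            raise KeyError(f"unknown reference data domain: {domain}")
        return repo.get(key)


# Illustrative domains matching the programme's three streams.
golden = GoldenSource({
    "client": {"C001": {"name": "Acme Holdings"}},
    "instrument": {"DE0005140008": {"name": "Deutsche Bank AG"}},
    "organization": {"LDN-FIC": {"desk": "Fixed Income, London"}},
})
```

The design choice mirrors the lesson quoted above: downstream applications depend on a single interface and a single notion of "golden" data, even though the client, instrument and organizational streams remain separate stores underneath.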