About a-team Marketing Services
The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

Case Study: Deutsche Stresses Strong Governance Approach

Deutsche Bank expects to complete its wide-ranging DB Reference Data Programme – launched in 2002 and encompassing a single model for client, instrument and organizational data – by the first quarter of next year. In developing its plans for the project, Deutsche took care to acquire management buy-in, establishing a Corporate Reference Data Governance Forum to define a global strategy for reference data and to ensure that the strategy is implemented across the bank.

According to Neil Innes, of Deutsche’s global technology and operations group, the DB Reference Data Programme was aimed at enhancing data quality, optimizing investments in reference data projects and technology, eradicating duplication of sources, moving the organization to a single ‘golden source’ of core reference data, and ensuring that this golden copy was adopted by downstream applications.

Along the way, Innes and his team reached a set of valuable conclusions that can serve as a useful guide to enterprisewide reference data projects. Innes was speaking at Osney Media’s Financial Information Management conference a few weeks ago.

Deutsche reached the conclusion at a senior level in mid-2002 that it had issues with reference data, particularly with client and instrument data. Specifically, the bank’s approach to reference data heretofore had resulted in duplicate processes for handling the same data, duplicate investment in reference data processing and associated infrastructure, and the introduction of a variety of technology solutions specific to each business.

Incompatible Formats

This had created an environment in which the same data was stored in many different locations but in incompatible or inconsistent formats. As such, the data could not be easily maintained or interrogated, leading to substantial reconciliation and conversion efforts internally. The bank concluded that this inconsistent approach to reference data could impact customer service levels, particularly for those Deutsche clients dealing with multiple business lines.

Recognizing the strategic importance of the issues at hand, Deutsche established the Corporate Reference Data Governance Forum in September 2002. The forum was charged with defining the new reference data strategy and ensuring it got implemented. The bank also acknowledged at the time that there were lessons to be learned from previous reference data programmes, and applied a number of key components to its strategy based on this prior experience.

Chief among these components was a strong reference data governance model. The forum developed a strong top-down mission statement and established review bodies for any new investment in applications to ensure that any project was aligned with the use of the reference data programme’s golden sources.

The strategy sought to ensure that, as well as improving data quality, each programme initiative improved the business process involved. Other components included a decision to leverage established applications where possible, and to shift responsibility for data from IT and compliance to the individual business units, so that responsibility for, say, credit data was placed with the credit business.

Innes said the project to establish a golden source of client data was the most interesting of the programme. The bank uses a mix of internal and external data to aid in ‘know your client’ (KYC) efforts. Deutsche’s assessment was that its current approach risked substantial duplication of this data and the infrastructure used to manage it. Further, it saw that rapidly growing, regionally driven regulatory and compliance pressures were creating the need for rigorous data quality and robust processes for client data.

Consistent Process

Its approach was to develop a globally consistent, extensible workflow process for new client adoption. This would entail the creation of a single mechanism for client adoption within the bank, focusing the responsibility on this ‘gatekeeper’ and ensuring no client adoption function existed in downstream systems. It would also entail establishing a federated review of client data, ensuring that the people with the greatest interest in ensuring high levels of data quality control the data.
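In outline, the ‘gatekeeper’ model Innes describes can be sketched as a single adoption gateway with federated review hooks. This is an illustrative sketch only — the class, field and reviewer names below are invented for the example and are not Deutsche’s actual systems.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class ClientRecord:
    client_id: str
    legal_name: str
    jurisdiction: str


class ClientAdoptionGateway:
    """Single 'gatekeeper' for new client adoption: the only component
    permitted to create client records. Downstream systems read the
    golden source and never adopt clients themselves."""

    def __init__(self, reviewers):
        # Federated review: each business unit that owns part of the
        # client data must approve a record before adoption completes.
        self._reviewers = reviewers
        self._golden = {}  # client_id -> ClientRecord (the golden source)

    def adopt(self, record: ClientRecord) -> bool:
        if record.client_id in self._golden:
            return False  # duplicate adoption is rejected at the gate
        if not all(review(record) for review in self._reviewers):
            return False  # any owning unit can block a deficient record
        self._golden[record.client_id] = record
        return True

    def lookup(self, client_id: str) -> ClientRecord:
        # Read-only access for downstream applications.
        return self._golden[client_id]
```

A caller would construct the gateway with one review callable per owning business unit (KYC, credit, legal, and so on); because downstream systems can only call `lookup`, no second adoption path can appear elsewhere.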

Using this process, the bank created what became the ‘golden source’ for client data. Any regional variations to that golden source would be considered only where driven by regulatory requirements and expressly not where driven by historic practice or preference. Innes said his team traveled widely to install these processes and practices, often having to deal with local political situations, but mindful of the need not to damage local business advantage.

Again, buy-in was secured by delivering business service improvements alongside the programme. For client data, this involved demonstrating that the project would improve the bank’s view of who its clients are, provide timely execution of compliance, credit, legal and regulatory controls for client adoption, and ensure compliance with KYC and anti-money laundering (AML) regulations.

Within the instrument stream of the project, the golden source for instrument data was derived from data supplied by third-party vendors. Deutsche’s aim was to reduce direct data costs, infrastructure costs and manual data set-up and cleansing costs, as well as to eliminate duplication. The bank’s approach was to leverage an internal application that held 90% of its required instrument data. The programme’s main thrust was to build a clear governance and communication structure for using that platform, and then to expand coverage to fill in the gaps.
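The fill-the-gaps approach amounts to treating the internal platform as the primary store with a vendor feed as fallback. A minimal sketch of that pattern follows — the names are illustrative assumptions, not Deutsche’s actual platform.

```python
class InstrumentMaster:
    """Internal platform as the primary source of instrument data, with
    a vendor feed consulted only for gaps. Once fetched, a vendor record
    is mastered internally, so coverage of the golden copy grows."""

    def __init__(self, internal, vendor_lookup):
        self._internal = internal     # identifier -> instrument data
        self._vendor = vendor_lookup  # callable, used only for gaps

    def get(self, identifier):
        if identifier not in self._internal:
            # Gap: source once from the vendor, then master internally
            # so subsequent reads come from the golden copy.
            self._internal[identifier] = self._vendor(identifier)
        return self._internal[identifier]
```

Because each vendor record is written back into the internal store on first use, direct vendor lookups (and their per-request data costs) fall over time, which matches the cost-reduction aim described above.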

For the organization aspect of the programme, the task was to enhance an existing central repository for organizational and structural data, and ensure that operations were aware of it bankwide. This related mainly to data for internal reporting purposes.

With some months left to run on the project, Innes believes the bank has learned a number of strategic lessons. Innes described the governance aspect of the project as a “continual balancing act.” He said project members spent perhaps 60% of their time communicating the message.

Golden Source

The bank also learned that the golden source approach “does not mean a single depository of data but that multiple depositories of data can make up the golden data required,” Innes said. The three streams of the project, the bank found, although appearing to be individual initiatives, converged to be seen increasingly as a single ‘reference programme.’ Finally, Innes said, there was the realization that such projects are truly multi-year programmes.
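Innes’s point — that the golden source can be several physical stores rather than one — can be pictured as a thin routing layer over domain-specific stores, each authoritative for one domain. The sketch below is a hypothetical illustration; the store and domain names are invented.

```python
class CompositeGoldenSource:
    """A logical golden source assembled from several physical stores
    (client, instrument, organisation), each authoritative for exactly
    one data domain."""

    def __init__(self, stores):
        self._stores = stores  # domain name -> mapping acting as its store

    def get(self, domain, key):
        # Each datum is mastered in exactly one place; the composite
        # merely routes the request to the authoritative store.
        return self._stores[domain][key]
```

The guarantee that matters is not physical singularity but single mastering: any given datum lives in exactly one authoritative store, so the three project streams can remain separate systems while still presenting one golden source.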
