It has been four years since Citi began its counterparty data management project as part of its broader push into enterprise data management (EDM), an effort spearheaded at the time by former chief data officer (CDO) John Bottega. The initial focus was on establishing a unique identifier for entity data across the firm as a whole; to achieve that, Citi managed to “syndicate the vision” of entity data, explained Geert Berlanger, director in charge of data quality at Citi, to attendees at A-Team Group’s recent seminar on counterparty risk.
All of the firm’s business units are now aware of the importance of counterparty data quality and are beginning to feed back requests for new data items to be added, said Berlanger. “This is a completely different dynamic from when the project first started. The business is engaged in the data quality initiative and is actively involved in the process,” he added.
Citi is now able to take this downstream feedback and build new controls into the data management process to ensure a consistent level of data quality. This is in keeping with the firm’s overall focus on keeping end users engaged, an approach elaborated upon by former global head of customer data Julia Sutton during her many discussions at FIMA over the last four years.
Berlanger worked alongside Sutton before she moved on to Royal Bank of Canada, and was heavily involved in communicating with the firm’s internal end users. The team also spent a long time developing a certified golden copy customer database in order to provide an intuitive, easy-to-use search system for those end users. It worked with Microsoft FAST Search technology and Search Business Consulting (SBC) to integrate two customer information sources and achieve a golden copy record of this data earlier this year. The bank also implemented a reconciliation hub from software vendor Web Services Integration (WSI) in 2008: the Xceptor Product Suite Reconciliation Hub. The rollout was aimed at giving Citi more control over its data and freeing up more time for data analysis by automating data retrieval.
The latter part of the project has involved working with Avox and Markit to normalise and validate Citi’s customer data and documentation. The focus on achieving a clean entity database means that the firm is now able to deal with different data feeds from a range of different vendors and be reassured that the mapping process can support this complexity, said Berlanger.
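The mapping process Berlanger describes is, at its core, a cross-referencing exercise: each vendor feed carries its own key for a counterparty, and every key must resolve to the firm’s single internal identifier. A minimal sketch of that idea is below; all names, fields, and identifiers here are illustrative assumptions, not Citi’s or any vendor’s actual schema.

```python
# Hypothetical sketch of a golden-copy entity master that maps records
# from multiple vendor feeds onto one firm-wide internal identifier.
from dataclasses import dataclass, field


@dataclass
class EntityRecord:
    internal_id: str                                 # firm-wide unique identifier
    legal_name: str
    vendor_ids: dict = field(default_factory=dict)   # vendor name -> vendor's own key


class EntityMaster:
    """A minimal golden-copy store keyed by internal identifier."""

    def __init__(self):
        self.records = {}   # internal_id -> EntityRecord
        self.xref = {}      # (vendor, vendor_id) -> internal_id

    def ingest(self, vendor, vendor_id, legal_name, internal_id):
        """Map one vendor record onto the golden copy, recording the
        cross-reference so later data from that vendor reconciles
        to the same entity."""
        key = (vendor, vendor_id)
        if key in self.xref and self.xref[key] != internal_id:
            # A vendor key must never point at two different entities.
            raise ValueError(f"conflicting mapping for {key}")
        self.xref[key] = internal_id
        rec = self.records.setdefault(
            internal_id, EntityRecord(internal_id, legal_name))
        rec.vendor_ids[vendor] = vendor_id
        return rec


master = EntityMaster()
# Two different vendors' keys resolve to the same internal entity.
master.ingest("vendor_a", "A-123", "Acme Holdings plc", "ENT-0001")
master.ingest("vendor_b", "B-987", "Acme Holdings plc", "ENT-0001")
```

With a clean cross-reference table like this in place, adding a new vendor feed only means populating more `xref` entries, rather than re-matching every downstream system, which is what allows the complexity of multiple feeds to be absorbed.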
Kate Young, chief data architect at Avox, added that Citi is now a long way down the road to achieving a golden copy of entity data, with the focus currently on keeping that data current. “The data must now be routinely maintained and the team has to ensure that records are created for every issuer as and when they are required. Everything has finally been mapped internally,” she concluded.