The knowledge platform for the financial technology industry

A-Team Insight Blogs

Citi Blazes a Trail for Counterparty Data Management, Berlanger Provides Update Four Years on


It has been four years since Citi first began its counterparty data management project as part of its overall push in the enterprise data management (EDM) space, which at the time was spearheaded by former chief data officer (CDO) John Bottega. The initial focus was on achieving a unique identifier for entity data across the firm as a whole, and to that end Citi managed to “syndicate the vision” of entity data, explained Geert Berlanger, director in charge of data quality at Citi, to attendees at A-Team Group’s recent seminar on counterparty risk.

All of the firm’s business units are now aware of the importance of counterparty data quality and are beginning to feed back requests in terms of new data items to be added, said Berlanger. “This is a completely different dynamic from when the project first started. The business is engaged in the data quality initiative and is actively involved in the process,” he added.

Citi is now able to take this downstream feedback and build new controls into the data management process in order to ensure a consistent level of data quality. This is in keeping with the firm’s overall focus on keeping end users engaged in the process, as elaborated upon by former global head of customer data Julia Sutton during her many discussions at FIMA over the last four years.

Berlanger worked alongside Sutton before she moved on to Royal Bank of Canada, and was heavily involved in the communication process with the internal end users. The team also spent a long time developing a certified gold copy customer database, in order to provide an intuitive, easy-to-use search system for end users. They worked with Microsoft FAST Search technology and Search Business Consulting (SBC) to integrate two customer information sources and achieve a golden copy record of this data earlier this year. The bank also implemented a reconciliation hub from software vendor Web Services Integration (WSI) in 2008: the Xceptor Product Suite Reconciliation Hub. The rollout was aimed at affording Citi more control over its data and increasing the time available for data analysis by automating its retrieval.

The latter part of the project has involved working with Avox and Markit to normalise and validate Citi’s customer data and documentation. The focus on achieving a clean entity database means that the firm is now able to deal with different data feeds from a range of different vendors and be reassured that the mapping process can support this complexity, said Berlanger.
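To make the mapping idea concrete, the sketch below shows one simple way a clean entity database can absorb records from multiple vendor feeds: each incoming record is matched to an internal golden-copy identifier, and anything that fails to match is routed aside for data-quality review. This is purely illustrative; all names, identifiers, and the name-normalisation approach are hypothetical and do not describe Citi’s, Avox’s, or Markit’s actual systems.

```python
def normalise_name(name):
    """Crude normalisation so the same entity matches across feeds."""
    return " ".join(name.upper().replace(".", "").replace(",", "").split())

def map_to_golden_copy(feeds, golden_copy):
    """Map vendor records onto internal entity IDs by normalised name.

    feeds: list of (vendor, records) pairs; each record is a dict with a 'name'.
    golden_copy: dict of internal_id -> canonical entity name.
    Returns (mapped, unmatched); unmatched records go to a review queue.
    """
    index = {normalise_name(n): eid for eid, n in golden_copy.items()}
    mapped, unmatched = [], []
    for vendor, records in feeds:
        for rec in records:
            eid = index.get(normalise_name(rec["name"]))
            if eid is None:
                unmatched.append((vendor, rec))  # flag for data-quality review
            else:
                mapped.append({"internal_id": eid, "vendor": vendor, **rec})
    return mapped, unmatched

# Hypothetical golden copy and two vendor feeds with differing name conventions
golden = {"E001": "Acme Holdings Inc.", "E002": "Globex Corp."}
feeds = [
    ("vendor_a", [{"name": "ACME HOLDINGS INC"}]),
    ("vendor_b", [{"name": "Globex Corp"}, {"name": "Initech Ltd"}]),
]
mapped, unmatched = map_to_golden_copy(feeds, golden)
```

In practice the matching step would use far richer keys (registered identifiers, addresses, hierarchy data) rather than names alone, but the shape of the process — normalise, map to a single internal ID, quarantine exceptions — is the point of maintaining a clean central entity database.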

Kate Young, chief data architect at Avox, added that Citi is now a long way down the route to achieving a golden copy of entity data, as the focus is currently on keeping the data current. “The data must now be routinely maintained and the team has to ensure that records are created for every issuer as and when they are required. Everything has finally been mapped internally,” she concluded.

