About A-Team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Capgemini Majors on Flexibility for Reference Data Outsourcing

Capgemini’s recently announced reference data management (RDM) services, which include business process outsourcing (BPO), are differentiated from competitive offerings by their flexibility, the consulting, technology and outsourcing provider claims. Capgemini will not be saying to clients, “here’s a solution – take it or leave it”, according to Saleel Nair, its global leader of reference data management BPO services, and is confident that this flexible approach will be successful, despite the fact that outsourced reference data services have to date proved less popular than predicted.

This is not the first time we’ve seen a big consultancy/outsourcing/offshoring specialist wade into the managed reference data arena, but none of the big names that have tried to capitalise on the opportunity so far have found the going as easy as hoped, with Accenture and IBM pulling back significantly, and, most recently, Capco offloading its managed reference data services arm to Netik. Convincing a critical mass of clients to take the outsourcing plunge and then building scale in the face of the often very specific requirements of individual customers has proved challenging – although new entrants do keep coming and there is still a strong belief that in commoditised areas, outsourced data management and utility services are the way forward, particularly if times are going to continue to be tough economically.

To be clear, BPO is only one part of the Capgemini RDM offering, unveiled officially at the FIMA show in New York earlier this month. Also on offer are strategic consulting services (under which Capgemini says it can help companies set up data governance, strategy and models) and technology capabilities (Capgemini says it can partner with technology solutions providers and help develop customer-specific solutions, as well as offering expertise in the areas of data warehouses and metadata management, where it has some solutions of its own to bring into play). The provider is essentially taking a set of best practices built up in recent years through its work with financial markets firms on reference data and turning them into a dedicated service proposition. One way Capgemini reckons it has built up reference data expertise is through its relationship with Telekurs Financial, and indeed the provider cites data providers, including primary data providers, among its target customers for RDM, alongside end users such as asset managers. Offshoring capabilities are clearly attractive for activities such as data input by data providers.

According to Nair, Capgemini’s exposure to customers via its financial services business has enabled it to establish that there is “a strong demand for reference data solutions”, which “spans consulting, reference data modelling, data warehousing and business process outsourcing”. “During the FIMA event we talked to a lot of potential customers and it was clear there is considerable demand for capabilities in areas such as data governance, metadata management, data cleansing and data analytics.”

Few would argue with that, but with so many providers of technology and managed services for reference data, what does Capgemini believe it can bring to the table that is different? “Unlike our peers, we are not going to the marketplace with a platform solution for their reference data needs,” says Nair. “Rather, we will focus on best-of-breed and customer-specific solutions. We are not going to the market and saying, here is the platform, end-to-end – deploy it. That is too complex a process to propose to customers. We are taking a flexible approach. Customers can use our platform – or not – in which case we’ll build a tailored service using their applications.”

Capgemini’s BPO capabilities cover three areas. One is administration – data sourcing, data aggregation and data quality management. Another is data integrity services, which involves manual and automated data cleansing and metadata administration. “A lot of our customers have the requirement, because they have legacy systems and multiple databases, for data cleansing either on a one-time basis or an ongoing basis, in order to provide the data that their business users want,” Nair continues. “In many cases customers have already deployed solutions for data cleansing and they ask us to use those; they could be proprietary or third-party solutions. Our task is to build the tailored services into a cost-effective high quality offshore solution.”
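Automated cleansing of the kind Nair describes typically means normalising identifiers and reconciling records drawn from multiple legacy databases. The sketch below is purely illustrative: the field names, the use of ISINs as keys and the conflict-reporting logic are assumptions for the sake of example, not a description of Capgemini's actual process.

```python
# Illustrative only: a toy automated cleansing pass over two "legacy" sources.
# Normalises each record, builds a golden copy keyed on the identifier
# (assumed here to be an ISIN), and flags conflicting duplicates.

def normalise(record):
    """Trim whitespace, upper-case codes, and collapse runs of spaces in names."""
    return {
        "isin": record["isin"].strip().upper(),
        "currency": record["currency"].strip().upper(),
        "name": " ".join(record["name"].split()),
    }

def cleanse(*sources):
    """Merge records from multiple databases, reporting field-level conflicts."""
    golden, conflicts = {}, []
    for source in sources:
        for raw in source:
            rec = normalise(raw)
            key = rec["isin"]
            if key not in golden:
                golden[key] = rec
            elif golden[key] != rec:
                # Same identifier but differing fields: route to manual review.
                conflicts.append((key, golden[key], rec))
    return golden, conflicts

# Two hypothetical legacy sources with inconsistent formatting.
db_a = [{"isin": " us0378331005", "currency": "usd", "name": "Apple  Inc"}]
db_b = [{"isin": "US0378331005", "currency": "USD", "name": "Apple Inc"},
        {"isin": "US5949181045", "currency": "USD", "name": "Microsoft Corp"}]

golden, conflicts = cleanse(db_a, db_b)
print(len(golden), len(conflicts))  # → 2 0
```

In practice a one-time cleanse would run a pass like this across the full estate, while an ongoing service would apply the same rules to each incoming feed; the conflict list is where manual cleansing and data governance decisions come in.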

Also on offer is BPO with Business Insight, which involves deployment of upstream and downstream business analytics combined with domain expertise to deliver “transformation in RDM”, according to Nair. “Business Insight unlocks the hidden value from basic BPO workflow to enable better, more informed decisions that result in measurable business outcomes that can contribute to top line growth for our clients and help manage risks better,” he says. “For example, we know from experience what the likely downstream issues created by poor reference data are. We can find out which sources are causing the problem, and that can influence data procurement. Or it could be a data governance issue – the wrong fields are being picked up which means there are problems with the structure of the data that have operational or data governance implications.”

Nair emphasises the tailored nature of the approach. “We are looking at very specific segments and customers; this could even mean point solutions specific to customer needs or single/multiple asset classes,” he says. “The pain areas are poor data quality, managing scale, time to market pressure for new products, being flexible to regulatory changes and the pressure to improve operational efficiency. We might identify for a customer that in a particular area its current data structure won’t allow it to be efficient, and in that case we can look again at the data governance and the data model. This is quite different to saying, here’s a solution, take it or leave it.”

Certainly, avoiding the limitations of a "one size fits all" solution is a positive, and distinguishes the approach from, say, taking over one firm's reference data operations and then trying to fold other firms' activities into the same shop, using the same processes, people and systems, or setting up a single instance of an EDM solution in a data centre and running multiple clients' reference data operations on it. But there is still the question of whether providing individual, customised reference data services to each client can ever tap into the economies of scale needed to pass significant cost efficiencies on to clients.

Of course, Capgemini will be tapping into cheaper offshore resources, and its BPO services for reference data are just part of a much bigger BPO offering, meaning it can exploit "horizontal" BPO scale economies even if it takes time to build up true scale in reference data-specific BPO. The provider has what it calls a "Rightshore" model of delivery for BPO, says Nair. "We have RDM offshore centres of excellence in India and Poland, but we also have delivery centres in other parts of Asia (China, Australia, Philippines), in Latin America, and in other parts of the world. We have a strategy of looking at the right location for a given customer."

Nair is realistic about the current market environment and the fact that its “RDM BPO is a relatively new service, which will logically take some time to reach maturity”, but, he says, Capgemini has customers in the pipeline for the data integrity services, and is involved in conversations with potential customers, some of which are existing users of other services.
