About a-team Marketing Services
The knowledge platform for the financial technology industry


Analysis: Has Data Management Outsourcing Lost Its Steam?


The zeal for providing outsourced reference data management solutions that we witnessed this time last year appears to have lost steam, with several of the larger supplier firms admitting that take-up is slower than expected, and some vendors shifting the focus of their efforts.

Major players, such as Accenture, SunGard, IBM, and Capco – all with significant funding to support their efforts – threw their weight behind solutions for sourcing, cleansing and distributing a wide range of reference data, with promises of significant economies of scale, reduction in duplication of effort and the other benefits outsourcing can provide. But a year on, little news has emerged of new deals supporting the model.

Accenture has been able to demonstrate perhaps the best level of take-up of its solution, built around the Asset Control data management platform, with Citadel and Wachovia as managed reference data services clients. Bill Cline, managing director of Accenture’s Capital Markets practice in North America and Asia-Pacific, says Accenture has a “strong pipeline”, but he admits the business has not developed as rapidly as he had anticipated. He calls for more early adopters to come on board and help Accenture fulfil its goal for the service – and to gain tangible pricing benefits by opting to be early users.

Cline says: “What excites me is the opportunity to create a true multi-source service with cross-feed validation, one that’s broad and deep and priced to scale. It is only through the one-to-many service combining technology and people that we can reach our goal, which is not just doing things cheaper, or even cleansing to a higher quality – it’s to make it more cost efficient for clients to cleanse larger numbers of securities at higher quality levels.” The early adopters are needed both to support innovation, and to help Accenture cost-justify the service, he says.

Accenture’s commitment to the managed reference data business remains, Cline says. He adds, however, that no company will sustain a multi-source service with just a handful of clients forever. “We are committed to staying in the game, but we won’t operate with an infinitely open timeframe. We continue to have a great pipeline and right now things look good. Our multi-year deal with Wachovia tells you a lot about our commitment to making this service a real success for the industry and for our clients. But to sustain a service over the longer term and evolve it in the manner we hope to evolve it, we need more early adopters to work with us, to enhance the service as quickly as possible.”

Meanwhile, IBM appears to be re-positioning its services, saying its approach to the provision of outsourced reference data management has “evolved”. According to an IBM spokesperson it is now focused on “specific propositions that are going to be available to all firms (outsourced or not) and that will enable not only the trend for electronification and algorithmic trading, but will also cater for the demands of MiFID”. IBM promises a “very significant” announcement in this arena in early October, but will provide no further details at this stage.

This approach, however, is very different to IBM’s intentions when it entered the reference data outsourcing space with such fanfare in late 2003, buying out Dresdner Bank’s financial market information database and taking 24 of the German bank’s employees, as the basis for developing the solution (Reference Data Review, January 2004).

For some time, rumours have abounded about the deal failing to result in any significant business for IBM. Pleading client confidentiality, the IBM spokesperson offers no comment on whether the Dresdner reference data relationship itself will continue in light of IBM’s new positioning in the data outsourcing space.

But it appears that SunGard may have moved into IBM’s space, with Harold Finders, European head at SunGard, tentatively announcing an outsourcing relationship with Dresdner, during that vendor’s European client conference in Barcelona in June. More details on the Dresdner deal are expected in September, but at the time, Finders said the deal would include a risk management element, with SunGard providing reference data experts and data management.

Industry sources suggest the challenge IBM encountered in the Dresdner case was making the provision of outsourced data management pay: bringing on additional customers and capitalising on economies of scale within a short enough timeframe proved too difficult.

A plethora of recent research and analysis on the subject – not least Reference Data Review’s own survey, Reference Data Management – Who should handle it? – suggests that the industry’s appetite for outsourcing reference data management is growing. Should developments like IBM’s re-positioning lead us to question this assessment?

Our survey found that 95 percent of respondents were “open-minded to the outsourcing concept”, indicating a willingness to consider it for their data management processes. It also found that “the recognition and enthusiasm for such solutions is, however, tempered somewhat by the reluctance of institutions – conservative by nature – to be pioneering and take that leap, preferring for ‘someone else’ to set a precedent for outsourced solutions”.

It looks like there have not been enough firms willing to take that leap yet. While an anchor client gives a provider a proven capability with which to attract additional clients, the provider needs that additional client revenue to justify servicing the original client and make outsourcing provision a truly viable business. Yet the timescales involved in bringing new clients on board are described by one provider as “crippling”. Could it be that such factors will conspire to strangle the reference data outsourcing business in its infancy?

Unsurprisingly, other key providers in the space say not. According to Brian Lott, Capco partner and COO of Capco Reference Data Services, its pipeline on the outsourcing side shows significant growth over where it was two years ago, and it has revised its initial projection of signing three or four new clients a year upwards, to double that or more.

Capco will be rolling out the technology acquired from ING in the seven-year outsourcing deal announced earlier this year (Reference Data Review, March 2006) to its outsourced customer base from 2007 onwards. While not all deals will match that seven-year duration, they are likely to be multi-year – three to four years – rather than one-year deals, which, Lott says, are unlikely to deliver the value the client is looking for.
But Lott admits that while the barriers that were impeding outsourcing progress two years ago, such as fear of loss of control and unwillingness to undertake the migration and release staff, are receding, a key impediment remains, and that is “time itself”. “Because of where we are in this process, and the fact that outsourcing is still a relatively new thing, until it becomes more commonplace, it will take a bit longer for firms to get on board,” he says. He reckons the sales cycle is a nine to 15 month process, depending on the size of the client.

