About A-Team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Is no news in reference data outsourcing bad news?

Buy, build or outsource? The debate continues in the reference data world, as amply demonstrated in this issue of Reference Data Review.

Falling firmly into the “build” camp, DTCC is bringing the platform underpinning its corporate actions offering in-house and moving off the third-party Xcitek solution. In the “buy” category is Daiwa Securities, opting firmly for the “off the shelf” approach.

As for outsourcing, confirmation that IBM is repositioning its managed reference data proposition, and admissions from Accenture and Capco of long sales cycles and slow take-up, give pause for thought about the progress of the reference data outsourcing business. Despite the seemingly overwhelming logic of leveraging a one-to-many reference data management capability and thereby “unleashing scale economies” (as one provider puts it), it is no secret that the outsourcing suppliers in general are experiencing slower business growth than they had hoped for.

Where firms sit on the buy versus build debate is largely a matter of individual preference. Some will almost always go for in-house developed systems, others will almost always avoid re-inventing the wheel in favour of tapping into third party solutions – and a preference for one or the other tends to be a function of size, wealth and the existence or otherwise of a large, influential IT department.

The outsourcing issue is more complex. Certainly, tier two and three institutions are a good fit for a managed reference data service, because they are likely to want to offload non-core activity and minimise cost and complication where possible. But to build credible services, the providers need access to the expertise and technology of the tier one firms. And the bigger the firm, the bigger the cultural impact of an outsourcing decision, the tougher the task of migration and, most probably, the longer the process of agreeing the outsourcing deal in the first place.

Does it really matter if reference data outsourcing doesn’t take off? Well, it might. Duplicated effort between financial institutions is clearly a source of huge inefficiency. There’s not much competitive edge to be found in the very basic levels of data sourcing and processing. The data impact of future regulatory imposition might create workloads that even the biggest, richest firms can’t afford to handle without “unleashing scale economies”.

It might be naïve to suggest institutions should take the outsourcing plunge for the good of the industry as a whole, and of course the providers seeking to make a business out of this are hardly charity cases, but for them to stay in the game long term, they will need a proper revenue incentive. You can’t offer scale economies if you don’t have scale.

Observers compare the state of reference data outsourcing today with the state of business process outsourcing a few years ago, and predict that as it becomes more accepted, there could be explosive growth. It takes time for outsourcing businesses to mature, and for the users to appreciate that their providers have to make a decent turn on it. If that maturity doesn’t develop in the reference data outsourcing space, there could come a day when financial institutions regret not having given their support to the idea sooner.
