About a-team Marketing Services
The knowledge platform for the financial technology industry


Is no news in reference data outsourcing bad news?


Buy, build or outsource? The debate continues in the reference data world, as amply demonstrated in this issue of Reference Data Review.

Falling firmly into the “build” camp, DTCC is bringing the platform underpinning its corporate actions offering in-house and moving off the third-party Xcitek solution. In the “buy” category is Daiwa Securities, opting instead for the “off the shelf” approach.

As for outsourcing, confirmation that IBM is repositioning its managed reference data proposition, together with admissions from Accenture and Capco of long sales cycles and slow take-up, gives cause for thought about the progress of the reference data outsourcing business. Despite the seemingly overwhelming logic of leveraging a one-to-many reference data management capability and thereby “unleashing scale economies” (as one provider puts it), it is no secret that the outsourcing suppliers in general are experiencing slower business growth than they had hoped for.

Where firms sit on the buy versus build debate is largely a matter of individual preference. Some will almost always go for in-house developed systems, others will almost always avoid re-inventing the wheel in favour of tapping into third party solutions – and a preference for one or the other tends to be a function of size, wealth and the existence or otherwise of a large, influential IT department.

The outsourcing issue is more complex. Certainly, tier two and three institutions are a good fit for a managed reference data service, because they are likely to want to offload non-core activity and minimise cost and complication where possible. But to build credible services, the providers need access to the expertise and technology of the tier one firms. And the bigger the firm, the bigger the cultural impact of an outsourcing decision, the tougher the task of migration and, most probably, the longer the process of agreeing the outsourcing deal in the first place.

Does it really matter if reference data outsourcing doesn’t take off? Well, it might. Duplicated effort between financial institutions is clearly a source of huge inefficiency. There’s not much competitive edge to be found in the very basic levels of data sourcing and processing. And the data impact of future regulatory imposition might create workloads that even the biggest, richest firms can’t afford to handle without “unleashing scale economies”.

It might be naïve to suggest institutions should take the outsourcing plunge for the good of the industry as a whole, and of course the providers seeking to make a business out of this are hardly charity cases. But for them to stay in the game long term, they will need a proper revenue incentive. You can’t offer scale economies if you don’t have scale.

Observers compare the state of reference data outsourcing today with the state of business process outsourcing a few years ago, and predict that as it becomes more accepted, there could be explosive growth. It takes time for outsourcing businesses to mature, and for the users to appreciate that their providers have to make a decent turn on it. If that maturity doesn’t develop in the reference data outsourcing space, there could come a day when financial institutions regret not having given their support to the idea sooner.
