Buy, build or outsource? The debate continues in the reference data world, as amply demonstrated in this issue of Reference Data Review.
Falling firmly into the “build” camp, DTCC is bringing the platform underpinning its corporate actions offering in-house and moving off the third-party Xcitek solution. In the “buy” category is Daiwa Securities, opting instead for the “off the shelf” approach.
As for outsourcing, IBM’s confirmation that it is repositioning its managed reference data proposition, along with admissions from Accenture and Capco of long sales cycles and slow take-up, gives cause for thought about the progress of the reference data outsourcing business. Despite the seemingly overwhelming logic of leveraging a one-to-many reference data management capability and thereby “unleashing scale economies” (as one provider puts it), it is no secret that the outsourcing suppliers are in general experiencing slower business growth than they had hoped for.
Where firms sit on the buy versus build debate is largely a matter of individual preference. Some will almost always go for in-house developed systems; others will almost always avoid reinventing the wheel in favour of tapping into third-party solutions. A preference for one or the other tends to be a function of size, wealth and the existence or otherwise of a large, influential IT department.
The outsourcing issue is more complex. Certainly, tier two and three institutions are a good fit for a managed reference data service, because they are likely to want to offload non-core activity and minimise cost and complication where possible. But to build credible services, the providers need access to the expertise and technology of the tier one firms. And the bigger the firm, the bigger the cultural impact of an outsourcing decision, the tougher the task of migration and, most probably, the longer the process of agreeing the outsourcing deal in the first place.
Does it really matter if reference data outsourcing doesn’t take off? Well, it might. Duplicated effort between financial institutions is clearly a source of huge inefficiency. There’s not much competitive edge to be found in the very basic levels of data sourcing and processing. And the data impact of future regulation might create workloads that even the biggest, richest firms can’t afford to handle without “unleashing scale economies”.

It might be naïve to suggest institutions should take the outsourcing plunge for the good of the industry as a whole, and of course the providers seeking to make a business out of this are hardly charity cases. But for them to stay in the game long term, they will need a proper revenue incentive: you can’t offer scale economies if you don’t have scale.

Observers compare the state of reference data outsourcing today with that of business process outsourcing a few years ago, and predict that as it becomes more accepted, there could be explosive growth. It takes time for outsourcing businesses to mature, and for users to appreciate that their providers have to make a decent turn on it. If that maturity doesn’t develop in the reference data outsourcing space, there could come a day when financial institutions regret not having given their support to the idea sooner.