
Is no news in reference data outsourcing bad news?

Buy, build or outsource? The debate continues in the reference data world, as amply demonstrated in this issue of Reference Data Review.

Falling firmly into the “build” camp, DTCC is bringing the platform underpinning its corporate actions offering in-house and moving off the third-party Xcitek solution. In the “buy” category is Daiwa Securities, opting firmly for the “off the shelf” approach.

As for outsourcing, confirmation that IBM is repositioning its managed reference data proposition, and admissions from Accenture and Capco of long sales cycles and slow take-up, give cause for thought about the progress of the reference data outsourcing business. Despite the seemingly overwhelming logic of leveraging a one-to-many reference data management capability and thereby “unleashing scale economies” (as one provider puts it), it is no secret that the outsourcing suppliers in general are experiencing slower business growth than they had hoped for.

Where firms sit on the buy versus build debate is largely a matter of individual preference. Some will almost always go for in-house developed systems, others will almost always avoid re-inventing the wheel in favour of tapping into third-party solutions; a preference for one or the other tends to be a function of size, wealth and the existence or otherwise of a large, influential IT department.

The outsourcing issue is more complex. Certainly, tier two and three institutions are a good fit for a managed reference data service, because they are likely to want to offload non-core activity and minimise cost and complication where possible. But to build credible services, the providers need access to the expertise and technology of the tier one firms. And the bigger the firm, the bigger the cultural impact of an outsourcing decision, the tougher the task of migration and, most probably, the longer the process of agreeing the outsourcing deal in the first place.

Does it really matter if reference data outsourcing doesn’t take off? Well, it might. Duplicated effort between financial institutions is clearly a source of huge inefficiency. There’s not much competitive edge to be found in the very basic levels of data sourcing and processing. The data impact of future regulatory imposition might create workloads that even the biggest, richest firms can’t afford to handle without “unleashing scale economies”.

It might be naïve to suggest institutions should take the outsourcing plunge for the good of the industry as a whole, and of course the providers seeking to make a business out of this are hardly charity cases, but for them to stay in the game long term, they will need a proper revenue incentive. You can’t offer scale economies if you don’t have scale.

Observers compare the state of reference data outsourcing today with the state of business process outsourcing a few years ago, and predict that as it becomes more accepted, there could be explosive growth. It takes time for outsourcing businesses to mature, and for the users to appreciate that their providers have to make a decent turn on it. If that maturity doesn’t develop in the reference data outsourcing space, there could come a day when financial institutions regret not having given their support to the idea sooner.
