About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Is no news in reference data outsourcing bad news?


Buy, build or outsource? The debate continues in the reference data world, as amply demonstrated in this issue of Reference Data Review.

Falling firmly into the “build” camp, DTCC is bringing the platform underpinning its corporate actions offering in-house and moving off the third-party Xcitek solution. In the “buy” category is Daiwa Securities, which has opted for the “off the shelf” approach.

As for outsourcing, confirmation that IBM is repositioning its managed reference data proposition, together with admissions from Accenture and Capco of long sales cycles and slow take-up, gives cause for thought about the progress of the reference data outsourcing business. Despite the seemingly overwhelming logic of leveraging a one-to-many reference data management capability and thereby “unleashing scale economies” (as one provider puts it), it is no secret that the outsourcing suppliers in general are experiencing slower business growth than they had hoped for.

Where firms sit on the buy versus build debate is largely a matter of individual preference. Some will almost always go for in-house developed systems, others will almost always avoid re-inventing the wheel in favour of tapping into third party solutions – and a preference for one or the other tends to be a function of size, wealth and the existence or otherwise of a large, influential IT department.

The outsourcing issue is more complex. Certainly, tier two and three institutions are a good fit for a managed reference data service, because they are likely to want to offload non-core activity and minimise cost and complication where possible. But to build credible services, the providers need access to the expertise and technology of the tier one firms. And the bigger the firm, the bigger the cultural impact of an outsourcing decision, the tougher the task of migration and, most probably, the longer the process of agreeing the outsourcing deal in the first place.

Does it really matter if reference data outsourcing doesn’t take off? Well, it might. Duplicated effort between financial institutions is clearly a source of huge inefficiency. There’s not much competitive edge to be found in the very basic levels of data sourcing and processing. And the data impact of future regulatory imposition might create workloads that even the biggest, richest firms can’t afford to handle without “unleashing scale economies”.

It might be naïve to suggest institutions should take the outsourcing plunge for the good of the industry as a whole, and of course the providers seeking to make a business out of this are hardly charity cases. But for them to stay in the game long term, they will need a proper revenue incentive: you can’t offer scale economies if you don’t have scale.

Observers compare the state of reference data outsourcing today with that of business process outsourcing a few years ago, and predict that as it becomes more accepted, there could be explosive growth. It takes time for outsourcing businesses to mature, and for users to appreciate that their providers have to make a decent turn on it. If that maturity doesn’t develop in the reference data outsourcing space, there could come a day when financial institutions regret not having given the idea their support sooner.
