
Talking Reference Data with Reference Data: Maximising Utility

It comes as no surprise at all that the managed services panel at our Data Management Summit in London next week is the most oversubscribed. Everyone wants to be on it.

That reflects the mood in London that I’ve picked up on in recent months. The concept of outsourcing data management to a third party is now being perceived by the movers and shakers of our industry as something at least to be considered, if not fully embraced.

That’s a far cry from the lack of interest that dogged earlier attempts, from such pioneers as Greg Smith’s Cicada Cos., which gave up on its plans for a reference data utility a decade ago when it seemed there was little overlap in requirements among potential clients. Each client thus became a consulting project, squeezing out all potential for economies of scale and the resultant savings all round.

Our DMS panel – reflecting the state of play – is crammed with managed services proponents and practitioners: Tom Dalglish of UBS, who’s been leading one of the most ambitious utility initiatives with the help of iGate and Patni (and Markit EDM?); Diarmuid O’Donovan, now CDO at Legal & General, who was instrumental in the lift-out of UBS Asset Management’s Luxembourg-based entity data platform to form Tech Mahindra’s new utility offering; Citisoft’s Jonathan Clarke, who is leading the Tech Mahindra initiative; Martijn Groot of Euroclear, which has partnered with SmartStream to offer the Central Data Utility platform; and Steve Cheng of RIMES Technologies, which has been offering the buy side managed reference data services for quite some time.

And still there are more. As I’ve mentioned before, Wipro has been working on an LEI utility concept, Goldensource has its own hybrid model, and NYSE Technologies has quietly recruited four of the largest sell-side firms (and Asset Control?) in support of a more wide-ranging reference data utility.

We’ll be discussing the pros and cons of the utility approach at the conference next Thursday. In a nutshell, however, protagonists reckon the utility model represents the last great chance to shave the operational costs of data without losing, erm, utility (in the economic sense). Indeed, some have estimated industry savings from wholesale adoption of managed/utility-based reference data services at upwards of 300 million (of your chosen major currency). That, combined with the belated acceptance that technologies like cloud may not after all pose a massive security threat if managed properly, appears to have turned the heads of data professionals. We’ll find out whether that’s the case next week.

On the flipside, utilities like this can only succeed if everyone plays ball. In an earlier life, I had the pleasure of trying (and failing) to get a market data utility concept off the ground, based on the then-somewhat-flimsy dot.com technologies and delivery systems. High on the list of the many factors that made the proposition unfeasible at the time – the others include some of those listed above – was keeping content and functionality providers on board with the concept.

The whole thing falls down if partners break from the pack to secure their own deal with a client. This applies equally to information providers and those who provide technical infrastructure, all of which can come under tremendous pressure to do side deals that bring in immediate financial benefit. Once again, this will be one of the topics for discussion on our panel next week.

So, if you haven’t registered, I’d urge you to do so now. As well as learning a lot – I always do – and networking with your peers, you’ll also get the benefit of watching me dole out the first ever set of A-Team Data Management Awards.

Register here.

See you next week!
