It comes as no surprise at all that the managed services panel at our Data Management Summit in London next week is the most oversubscribed. Everyone wants to be on it.
That reflects the mood in London that I’ve picked up on in recent months. The concept of outsourcing data management to a third party is now being perceived by the movers and shakers of our industry as something at least to be considered, if not fully embraced.
That’s a far cry from the lack of interest that dogged earlier attempts, from such pioneers as Greg Smith’s Cicada Cos., which gave up on its plans for a reference data utility a decade ago when it seemed there was little overlap in requirements among potential clients. Each client thus became a consulting project, squeezing out all potential for economies of scale and the resultant savings all round.
Our DMS panel – reflecting the state of play – is crammed with managed services proponents and practitioners: Tom Dalglish of UBS, who’s been leading one of the most ambitious utility initiatives with the help of iGate and Patni (and Markit EDM?); Diarmuid O’Donovan, now CDO at Legal & General, who was instrumental in the lift-out of UBS Asset Management’s Luxembourg-based entity data platform to form Tech Mahindra’s new utility offering; Citisoft’s Jonathan Clarke, who is leading the Tech Mahindra initiative; Martijn Groot of Euroclear, which has partnered with SmartStream to offer the Central Data Utility platform; and Steve Cheng of RIMES Technologies, which has been offering the buy side managed reference data services for quite some time.
And still there are more. As I’ve mentioned before, Wipro has been working on an LEI utility concept, Goldensource has its own hybrid model, and NYSE Technologies has quietly recruited four of the largest sell-side firms (and Asset Control?) in support of a more wide-ranging reference data utility.
We’ll be discussing the pros and cons of the utility approach at the conference next Thursday. In a nutshell, however, protagonists reckon the utility model represents the last great chance to shave the operational costs of data without losing, erm, utility (in the economic sense). Indeed, some have estimated the industry savings from wholesale adoption of managed/utility-based reference data services at upwards of 300 million (of your chosen major currency). That, combined with the belated acceptance that technologies like cloud may not after all pose a massive security threat if managed properly, appears to have turned the heads of data professionals. We’ll find out whether that’s the case next week.
On the flipside, utilities like this can only succeed if everyone plays ball. In an earlier life, I had the pleasure of trying (and failing) to get a market data utility concept off the ground, based on the then-somewhat flimsy dot-com technologies and delivery systems. High on the list of the many factors that made the proposition unfeasible at the time – the others included some of those listed above – was keeping content and functionality providers on board with the concept.
The whole thing falls down if partners break from the pack to secure their own deal with a client. This applies equally to information providers and those who provide technical infrastructure, all of which can come under tremendous pressure to do side deals that bring in immediate financial benefit. Once again, this will be one of the topics for discussion on our panel next week.
So, if you haven’t registered, I’d urge you to do so now. As well as learning a lot – I always do – and networking with your peers, you’ll also get the benefit of watching me dole out the first ever set of A-Team Data Management Awards.
See you next week!