The knowledge platform for the financial technology industry

A-Team Insight Blogs

Soliton Signs Joho Capital; Expands Telekurs Fixed Income Interface


We’re certainly hearing about more and more outsourced utilities for managing reference data coming onto the market. Just a quick flick through this issue will show SunGard planning its new Fame-based rMDS, and Accenture set to launch its MRDS service based on Asset Control’s platform. Capco also recently outlined its MDS plans following the acquisition of Iverson (so many MDS-related acronyms to remember!). Earlier in the year, IBM bought a reference data management platform (actually based on underlying technology from Asset Control) from Dresdner, and in the process took over Dresdner’s data management on an outsourced basis. And rumour has it that Deloitte & Touche are looking to get in on the act.

So what does it mean? As with most industries, the consolidation and partnerships leading to these outsourcing services being launched are a sign of development. Suppliers are becoming clearer about real client needs, and acquisitions help to fill in the gaps and, in many cases, provide the scale to meet those needs. The spate of recent acquisitions and partnerships seems to signal the beginnings of maturity for the reference data industry.

Whether looking for outsourced or onsite solutions for data management, the mission-critical nature of this function, and the financial, regulatory and reputational repercussions of any failure, mean that financial institutions need to be sure they are engaging with a solid provider with sound financials. Scale and support are all-important. We’ve heard of a couple of instances of smaller service providers getting through the selection process only to be knocked out at the due diligence stage when they could not provide sufficiently reassuring financials. The challenge for smaller outfits is either to distinguish their offering as a true niche service that can complement other, larger services, or to align themselves with bigger players in the hope of a partnership deal or even being bought. It is only the large vendors with clout that can get away with providing outsourced services.

It makes sense that institutions should focus on their true business strengths and leave data management to vendors whose core business is exactly that. And there is a clear and logical argument for the cost savings, quality control, expertise and many other benefits that can be gained through using a third party offering a shared infrastructure. But the question is, will user firms really go for it?

Despite the logical arguments, users in this space have never been quick to sign up for full outsourcing deals. The risk of reliance on a third party for such a crucial service, the loss of control over the service, and other factors make it a tough decision for any management team. A vendor usually needs to show a critical mass of clients before most firms will consider handing over such a crucial function to it: something of a chicken-and-egg situation.

Another issue is how to take the shared infrastructure and tailor it to each individual client’s environment, workflow processes and downstream applications. This is probably why we are seeing consulting firms like Accenture becoming involved and companies like SunGard ramping up their consultancy services groups. There’s lots of money to be made in consulting on a service that a bank has already committed to.


Related content

WEBINAR

Recorded Webinar: Streamlining trading and investment processes with data standards and identifiers

Financial institutions are integrating not only greater volumes of data for use across their organisation but also more varieties of data. As well, that data is being applied to more use cases than ever before, especially regulatory compliance and ESG integration. Due to this increased complexity of institutions’ data needs, however, information often arrives into...

BLOG

LSEG Wins Most Innovative Data Quality Initiative Award in A-Team Group Innovation Awards 2025

LSEG has won the Most Innovative Data Quality Initiative Award in A-Team Group’s Innovation Awards 2025 for its Tick History – PCAP, which was expanded this year to offer more than 400 feeds, with new coverage spanning 14 markets in the Americas, eight in the Asia-Pacific region and 76 in EMEA. These awards, now in...

EVENT

RegTech Summit New York

Now in its 9th year, the RegTech Summit in New York will bring together the RegTech ecosystem to explore how the North American capital markets industry can leverage technology to drive innovation, cut costs and support regulatory change.

GUIDE

AI in Capital Markets: Practical Insight for a Transforming Industry – Free Handbook

AI is no longer on the horizon – it’s embedded in the infrastructure of modern capital markets. But separating real impact from inflated promises requires a grounded, practical understanding. The AI in Capital Markets Handbook 2025 provides exactly that. Designed for data-driven professionals across the trade life-cycle, compliance, infrastructure, and strategy, this handbook goes beyond...