Data Management Summit: The Challenges and Opportunities of Managed Services and Shared Utilities

Managed services and shared utilities designed for data management have the potential to cut banks’ costs, improve time to market and ease the burden of regulatory compliance. Yet despite increasing maturity, these solutions face barriers to adoption, including banks’ reluctance to change, concerns about relationships with data vendors and, particularly for utilities, a lack of trust in their ability to deliver on their promises.

The pros and cons of using managed services or utilities for data management were discussed during a lively panel session at A-Team Group’s recent Data Management Summit in London. A-Team editor, Sarah Underwood, moderated the session and was joined by Tom Dalglish, senior vice president and head of transformation services at SmartStream; Sally Hinds, enterprise data practice lead at Crossbridge; Nick Taplin, associate director at IGATE; Roy Williamson, global head of sales at Bloomberg PolarLake; and Neill Vanlint, global head of sales and services at GoldenSource.

The case for managed services

Talking first about why banks are considering managed services, Dalglish said: “Banks are looking at managed services for simple reasons of cost and a transition back to core services. They have already thrown all the ballast out of the balloon to lighten the load of operational IT spending. Couple this with the explosion of technology offers around managed services and there is tremendous opportunity. There is no need to imagine managed service capabilities any more, they are here.”

Other panel members agreed with the cost appeal of managed services, especially a reduction in ongoing run-the-bank costs. Hinds suggested gains could also be made by outsourcing low-level data management, freeing up subject matter experts, who are already few in number, to tackle knowledge-intensive tasks.

Vanlint added: “Efficient organisations are looking to mutualise data management processes and inefficient organisations are looking to hand over their problems to others. There are tier one and tier two banks in both these camps. Smaller firms want functional relief and what they do depends on their pain points and how mature they are in deploying data management. Many of our existing clients want us to manage their platforms and we are also seeing the emergence of hedge funds as big adopters of managed services.”

While some firms are realising the potential of managed services, most are playing a waiting game to see what services will emerge and whether they will be dominated by a single provider. At a fundamental level of adoption, Dalglish said: “Many banks are not ready to give up their legacy systems. This is a continuing problem.” Taking a view on what sort of solutions may be attractive going forward, he added: “Mutualisation or consortium type plays must offer diversity, not just reference data management, but also other services such as corporate actions processing, reconciliations and data feed management.”

Answering an audience question about why utility services that have been talked about for years could now become a reality, Dalglish explained: “What has changed is the advent of tough regulatory requirements and banks’ desire to get out of the data business. There is now maturity in the utility technology stack and an opportunity for banks to take advantage of this and cut out data aggregators.” Hinds added: “Data as a service is a few years away, but it will come and everyone will want it. Banks will want pay as you go data for their users, which will challenge data vendors that offer subscription services.”

Benefits of cost and time

Considering the benefits of managed services and utilities, Williamson said: “These kinds of services provide the ability to wrap up pain points and solve problems that are not going away, such as legacy systems. They should take a progressive approach and solve point problems, deliver results quickly and then move on. The benefits are in cost and time.”

Commenting on managed services, Vanlint added: “Cost and compliance are key drivers. You can rent better capability with no upfront investment and select the right service to help achieve regulatory compliance quicker.” Taplin noted the ability to reduce the risks of new projects by using managed services and said the main benefits of utilities are in the provision of pre-packaged solutions. He explained: “Utilities offer faster time to market and a fairly static cost base as data is managed once and delivered to many customers, but this can be at the expense of the utility service not fitting exactly with a firm’s infrastructure.”

The emergence of the utility model proposes change not only for banks, but also for data providers. Vanlint explained: “Exchanges and data repositories are questioning why they should distribute their data through data vendors when they could plumb straight into utilities. They are looking for new markets for what they call their exhaust and are likely to work with utilities and take out the cost of the middle man. If trading venues distribute their own data, this will put pressure on data vendors to explain where they add value.”

Challenges remain

Turning to the challenges that still lie ahead for managed services and utilities, Dalglish noted banks’ reluctance to acknowledge that their data is not quite what it should be and to let go of bad data. Williamson concurred with the cultural challenge and suggested managed services should accommodate both old and new data. Taplin added: “It is difficult for firms to make the business case for handing over data management to an unknown quantity. Only when utility providers are known quantities will this challenge be overcome.”

Panel members also noted challenges of downstream workarounds that may not work with new data, as well as the issue of integrating external services with internal operations. Vanlint commented: “The challenge is as much about workflow integration as it is about data.”

Discussing whether banks will be able to trust utility providers on an ongoing basis, Dalglish concluded: “To achieve trust we need to favour individuals and interactions over processes and tools, working software over mounds of documentation, collaboration with data vendors over contract negotiations and threats, and responsiveness to change over following plans. Not everything will be perfect, so we all need to communicate more effectively and address any issues up front.”
