
By Gareth Evans, Chief Product Officer, FINBOURNE.
An uncomfortable truth: technology spend in asset management has surged 8.9 per cent annually over the past five years across North America and Europe. But productivity? Flat. Cost as a share of assets under management (AUM)? No improvement. Operational expenses in other functions? Despite promises that technology would create efficiencies, those haven’t contracted either.
McKinsey’s recent research – covering firms representing 70 per cent of global AUM – confirms what many feared: there’s virtually no meaningful relationship between higher tech spend and improved productivity. The data is noisy, but the trendline is brutal. Asset managers pouring cash into technology aren’t consistently more productive than their peers on metrics like cost-to-AUM ratio or revenue per employee.
So, where’s all the money going?
The 80% problem
Most asset management firms allocate 60 to 80 per cent of their technology budget just to keep the lights on: maintaining legacy systems, running day-to-day operations, putting out fires, and so on.
That leaves maybe 20 to 40 per cent for transformation. And here’s where it gets worse – of that change-the-business budget, only 5 to 10 per cent goes towards firm-wide digital transformation. The rest gets scattered across isolated use cases that sound great on paper but never scale.
Firms essentially spend years building technical debt faster than they can pay it down. Even after modernisation projects, they can’t bring themselves to decommission old systems. So they end up running both.
This is the vicious cycle. You keep spending to maintain what you have, building more debt, paying what amounts to a complexity tax in time and money. This problem compounds in asset management because most firms are running fragmented systems for different asset classes, with siloed data environments and no comprehensive platform. Integrating anything becomes a nightmare.
What if we’ve been solving the wrong problem?
Conventional wisdom says consolidate everything. Move data into lakes, centralise it in warehouses, build complex pipelines between systems. But here’s what actually happens – by the time you’ve migrated and reconciled everything, your data is already stale. Your AI models are working with yesterday’s information. And you’re locked into yet another rigid infrastructure that’ll need replacing in five years.
There’s a simpler approach that’s been evolving for two decades but is finally coming into its own: stop moving data around entirely. Data virtualisation means connecting to data in real time across systems without moving or duplicating it. Instead of copying data across multiple platforms, you create live connections to where it already lives and make it accessible instantly.
For asset managers juggling multiple custody systems, brokers, and market or trade execution data vendors, this changes the game. No more reconciliation delays. No more “which version is the source of truth?” conversations. Just unified access to live data across your entire technology estate.
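To make the idea concrete, here is a minimal Python sketch of the connect-rather-than-copy pattern. The in-memory SQLite databases, table names, ISINs and figures are all illustrative stand-ins for live custodian and market data connections; in production a virtualisation layer or federated query engine would hold these connections instead.

```python
"""A minimal sketch of data virtualisation: query two systems where the
data lives and join the results in flight, rather than batch-copying both
into a warehouse. SQLite in-memory databases stand in for live systems;
every name and number here is illustrative only."""
import sqlite3

def connect_custodian():
    # Stand-in for a live connection to a custody system.
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE positions (isin TEXT, quantity REAL)")
    db.executemany("INSERT INTO positions VALUES (?, ?)",
                   [("GB00B03MLX29", 10_000), ("US0378331005", 2_500)])
    return db

def connect_market_data():
    # Stand-in for a live connection to a market data vendor.
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE prices (isin TEXT, price REAL)")
    db.executemany("INSERT INTO prices VALUES (?, ?)",
                   [("GB00B03MLX29", 24.70), ("US0378331005", 189.95)])
    return db

def live_valuations():
    """Join positions and prices at query time; nothing is persisted."""
    custodian, vendor = connect_custodian(), connect_market_data()
    prices = dict(vendor.execute("SELECT isin, price FROM prices"))
    for isin, qty in custodian.execute("SELECT isin, quantity FROM positions"):
        yield isin, qty * prices[isin]

for isin, value in live_valuations():
    print(isin, round(value, 2))
```

The point of the sketch is the shape, not the plumbing: each source stays the system of record, and the valuation exists only as a live view over both.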
Connection alone isn’t enough
Here’s where it gets interesting, and where most firms stop short. Virtualisation gives you access to data wherever it lives. That’s the foundation. But the real power comes when you layer on a modern investment management platform that maintains bi-temporal records (which track both when something happened and when it was recorded) as well as full audit trails.
Now you can query data as it existed at any point in time. Understand exactly how positions and valuations evolved. Trace every calculation back to its source. Add real-time portfolio analytics and reporting on top of that virtualised layer, and suddenly you’re not just connecting systems – you’re enabling operational intelligence that wasn’t possible before.
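As a rough illustration of what bi-temporal querying looks like, here is a small Python sketch, assuming a simple list-backed store rather than any particular platform’s API. The record shape, the sample correction and the position_as_at helper are hypothetical.

```python
"""A bi-temporal record carries two timelines: when something happened
(effective) and when the system learned about it (recorded). Replaying
history along the recorded axis answers "what did we believe, and when?"."""
from dataclasses import dataclass
from datetime import date, datetime

@dataclass(frozen=True)
class PositionRecord:
    isin: str
    quantity: float
    effective: date      # when the holding actually changed
    recorded: datetime   # when the system was told about it

history = [
    PositionRecord("GB00B03MLX29", 10_000,
                   date(2024, 3, 1), datetime(2024, 3, 1, 18, 0)),
    # A late-booked correction: effective on 1 March, recorded on 4 March.
    PositionRecord("GB00B03MLX29", 9_500,
                   date(2024, 3, 1), datetime(2024, 3, 4, 9, 30)),
]

def position_as_at(isin, effective, known_at):
    """The position as we believed it, given only records made by known_at."""
    candidates = [r for r in history
                  if r.isin == isin
                  and r.effective <= effective
                  and r.recorded <= known_at]
    return max(candidates, key=lambda r: (r.effective, r.recorded), default=None)

# On 2 March we believed 10,000; replaying today shows the corrected 9,500.
print(position_as_at("GB00B03MLX29", date(2024, 3, 2),
                     datetime(2024, 3, 2, 12, 0)).quantity)
print(position_as_at("GB00B03MLX29", date(2024, 3, 2),
                     datetime.now()).quantity)
```

The key design choice is that corrections append new records rather than overwrite old ones, which is what makes full audit trails and as-at replay possible.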
This is what AI and analytics actually need to work: real-time access to trusted data, with governance and lineage built in from the start, enriched with the context that turns raw numbers into actionable insight.
The path forward
McKinsey’s research suggests asset managers could capture efficiencies equivalent to 25 to 40 per cent of their total cost base through AI-enabled transformation. That’s enormous. But – and this is critical – only if they address the foundational gaps first.
Some firms are already seeing results. Streamlining investment accounting. Automating reconciliation (which McKinsey found can deliver up to 5 per cent efficiency gains). Accelerating fund launches. Improving data access across systems. But these gains don’t come from just deploying AI tools. They require domain-level reimagination, workflow rewiring, serious change management.
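To illustrate how simple the core of automated reconciliation becomes once data access is unified, here is a hedged Python sketch. The two position sources, the keying by ISIN and the tolerance parameter are assumptions for the example, not a description of any specific product.

```python
"""An automated reconciliation sketch under simple assumptions: positions
from two sources keyed by ISIN; a break is any quantity gap beyond a
tolerance. All figures below are illustrative."""

def reconcile(internal: dict, custodian: dict, tolerance: float = 0.0):
    """Return (isin, internal_qty, custodian_qty) for every break."""
    breaks = []
    for isin in internal.keys() | custodian.keys():
        ours = internal.get(isin, 0.0)
        theirs = custodian.get(isin, 0.0)
        if abs(ours - theirs) > tolerance:
            breaks.append((isin, ours, theirs))
    return breaks

internal_book = {"GB00B03MLX29": 9_500, "US0378331005": 2_500}
custodian_feed = {"GB00B03MLX29": 10_000, "US0378331005": 2_500}
for isin, ours, theirs in reconcile(internal_book, custodian_feed):
    print(f"BREAK {isin}: internal={ours} custodian={theirs}")
```

The matching logic is trivial; the hard part the preceding sections describe is getting both sides into one live, trusted view in the first place.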
The best data strategy is often the simplest one: connect rather than copy, govern, then operationalise. This may sound almost too straightforward given the complexity most firms are dealing with. But that’s precisely the point. We’ve overcomplicated data architecture to the point where up to 80 per cent of the budget goes to maintenance instead of innovation.
Asset managers are facing real margin pressure – pretax operating margins have declined by three to five percentage points over the past five years. Technology should be part of the solution, not a cost centre that grows faster than revenue.
The firms that act decisively now – building the right data foundations, taking a strategic approach to AI, actually decommissioning legacy systems instead of running them in parallel – will pull ahead. The rest will keep burning cash to run in place, watching the gulf between technology investment and productivity continue to widen.