About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Opinion: The Final Frontier for Portfolio Analytics?


By Michel Lempicki, Corporate Development Director, StatPro

If one looks back 20 years, asset managers had only the back and front offices to contend with. ‘Middle office operations’ were in the hands of back office staff: it was perceived as being relatively low value work when compared to the front office.

The creation of a real ‘middle office’ function came about in part because the front office was receiving data from the back office and transforming it to meet front office needs in ways the back office staff did not really understand. Over time, the new middle office began to interface with clients, generating reports for them as well as for portfolio managers, third party administrators, institutions, consultants and so on.

With all those changes, the new middle office started to purchase its own tools. Until ten years ago, these tools were typically best-of-breed applications and only had to fulfill a single key function – for example, attribution or compliance or risk management. These tools were mainly connected to the back office application, but occasionally to front office applications when additional data was required. However, they were still very distinct applications in terms of functionality.

The result of this structure was a kind of ‘application spaghetti’, whereby teams from the front, middle and back offices were all interconnected and the question then arose: which data do I take from which application? Where is the best source of risk data, and likewise for performance and attribution?

With such a vast quantity of data in circulation, this task was not easy. The portfolio manager calculated his own performance figures using his own application or spreadsheet, because he needed the information rapidly. The middle office had to wait for back office data to arrive, a delay the front office could not afford. As a result, staff across the three offices may not have had the same data at the same time, or data computed in the same way. All three offices were using different applications, with different data sitting everywhere.

The result? Data anarchy.

The concept of the data warehouse was conceived in order to centralise the data to create a ‘gold copy’ – and subsequently to connect this gold copy with all the analytical applications being used across the investment management firm. This happened in the early 2000s, when application vendors were connecting all these tools to data warehouses but the tools themselves were still very separate in terms of functionality.

The new paradigm in the middle office analytics industry is that soon all the major vendors will be able to merge these middle office functions, thereby offering risk management, attribution and performance measurement on a single platform. This trend is reinforced by the fact that in the middle offices of an increasing number of banks, the traditionally distinct teams of performance measurement, attribution and risk management are now merging under one senior manager in charge of all middle office analytics.

The other trend we are witnessing is that all those tools are being fed from back office applications which are themselves evolving. These applications, which traditionally performed only simple tasks such as NAV calculation, are now increasingly offering basic performance and attribution. Across the private wealth and asset management industries, we are witnessing change on three fronts: there is a consolidation of attribution, performance and risk management teams; some performance vendors are now embracing risk and vice versa; and back office providers are now trying to nibble at the middle office cake by providing basic performance measurement and attribution functionality.

So where does that leave the middle office vendor?

First, risk management requires an immense amount of data that is complex both to purchase and to compute. Risk management is being recognised as an integral part of the investment process, with an inherent need to understand how much risk a particular investment mandate or asset class is adding to a portfolio. As a result, the need for processes such as ex-ante risk calculation protects middle office vendors from back office providers.

Second, until recently, investment managers usually purchased several different analytical applications from multiple vendors – perhaps in order to serve different asset classes or geographies. The objective for vendors was to convince clients to replace the incumbent application with their own.

All that has changed. With the new breed of application-agnostic platforms, different teams are able to connect their own pre-computed data, and the asset manager can connect with all the third party applications.

For example, an asset manager may have one back office application in London, but funds in Ireland or Luxembourg that source data from another application. The asset manager will therefore have at least two or three different applications that are all able to compute performance. The technology now exists to take all of this pre-computed performance data and deliver it through a standard application to all middle and front office users. There is no need to modify the existing applications, which can remain in their different locations and be connected via a SaaS platform.
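The consolidation described above can be sketched in outline. This is a minimal illustration only, not any vendor's actual implementation; the system names, fund identifiers and record fields are all hypothetical:

```python
# Sketch: normalising pre-computed performance figures from several
# back office applications into a single consolidated view, so that
# front and middle office users see the same data from the same source.
from dataclasses import dataclass

@dataclass
class PerformanceRecord:
    fund_id: str        # hypothetical fund identifier
    period: str         # reporting period, e.g. "2013-Q1"
    source_system: str  # originating application, e.g. "london_bo"
    return_pct: float   # pre-computed period return, in percent

def consolidate(records):
    """Group pre-computed returns by (fund, period), keyed by the
    application that produced them. No recalculation is performed:
    the platform simply delivers each system's own figures."""
    view = {}
    for r in records:
        view.setdefault((r.fund_id, r.period), {})[r.source_system] = r.return_pct
    return view

records = [
    PerformanceRecord("FUND-A", "2013-Q1", "london_bo", 2.4),
    PerformanceRecord("FUND-A", "2013-Q1", "luxembourg_bo", 2.4),
    PerformanceRecord("FUND-B", "2013-Q1", "luxembourg_bo", 1.1),
]
consolidated = consolidate(records)
```

The point of the sketch is that each source application keeps computing performance as before; the platform layer only aggregates and redistributes the pre-computed figures.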

The middle office vendor will survive, but single engine applications will come under increasing pressure as structures within asset managers evolve and technology advances.
