Opinion: The Final Frontier for Portfolio Analytics?

By Michel Lempicki, Corporate Development Director, StatPro

If one looks back 20 years, asset managers had only the back and front offices to contend with. ‘Middle office operations’ were in the hands of back office staff and were perceived as relatively low-value work compared with that of the front office.

A dedicated ‘middle office’ function came about in part because the front office was receiving data from the back office and then transforming it for its own purposes in ways that back office staff did not really understand. Over time, the new middle office began to interface with clients, generating reports for them as well as for portfolio managers, third-party administrators, institutions, consultants and so on.

With all those changes, the new middle office started to purchase its own tools. Until ten years ago, these tools were typically best-of-breed applications and only had to fulfill a single key function – for example, attribution or compliance or risk management. These tools were mainly connected to the back office application, but occasionally to front office applications when additional data was required. However, they were still very distinct applications in terms of functionality.

The result of this structure was a kind of ‘application spaghetti’, whereby teams from the front, middle and back offices were all interconnected and the question then arose: which data do I take from which application? Where is the best source of risk data, and likewise for performance and attribution?

With such a vast quantity of data in circulation, this task was not easy. The portfolio manager was calculating his own performance data using his own application or spreadsheet, as he needed this information rapidly. The middle office had to wait for back office data to arrive, a delay the front office could not afford. As a result, staff across all three offices may not have had the same data at the same time, or data computed in the same way. All three offices were using different applications and had different data sitting everywhere.
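
To see why ‘computed in the same way’ matters, consider a minimal sketch (the figures are purely illustrative): the same portfolio, with the same mid-period contribution, produces three different return numbers depending on the convention applied.

# Illustrative only: one portfolio, one mid-period contribution,
# three common return conventions, three different answers.
start_value, end_value = 100.0, 115.5
cash_flow = 10.0                # contribution received mid-period
value_before_flow = 103.0       # portfolio value just before the flow

simple_return = (end_value - start_value - cash_flow) / start_value
modified_dietz = (end_value - start_value - cash_flow) / (start_value + 0.5 * cash_flow)
time_weighted = (value_before_flow / start_value) * (end_value / (value_before_flow + cash_flow)) - 1

print(f"Simple return:        {simple_return:.2%}")    # 5.50%
print(f"Modified Dietz:       {modified_dietz:.2%}")   # 5.24%
print(f"Time-weighted return: {time_weighted:.2%}")    # 5.28%

If the front office runs the simple method in a spreadsheet while the middle office runs Modified Dietz, the two will publish different figures for the same fund on the same day.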

The result? Data anarchy.

The data warehouse was conceived to centralise this data into a single ‘gold copy’, and then to connect that gold copy to all the analytical applications being used across the investment management firm. This happened in the early 2000s, when application vendors were connecting all these tools to data warehouses, but the tools themselves remained very separate in terms of functionality.
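
As a rough sketch of the idea (names and structures are hypothetical), a gold copy is simply a single authoritative snapshot of the data that every analytics function reads, rather than each team keeping its own copy.

from datetime import date

class GoldCopyStore:
    """Single authoritative snapshot of positions per valuation date."""
    def __init__(self):
        self._snapshots = {}   # date -> {security_id: quantity}

    def load(self, as_of: date, positions: dict):
        # Loaded once from the back office, then treated as read-only.
        self._snapshots[as_of] = dict(positions)

    def positions(self, as_of: date) -> dict:
        return self._snapshots[as_of]

store = GoldCopyStore()
store.load(date(2014, 6, 30), {"GB00B03MLX29": 10_000, "US0378331005": 2_500})

# Performance, attribution and risk all read the same snapshot,
# instead of each keeping a private copy of the data.
print(store.positions(date(2014, 6, 30)))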

The new paradigm in the middle office analytics industry is that soon all the major vendors will be able to merge these middle office functions, thereby offering risk management, attribution and performance measurement on a single platform. This trend is reinforced by the fact that in the middle offices of an increasing number of banks, the traditionally distinct teams of performance measurement, attribution and risk management are now merging under one senior manager in charge of all middle office analytics.

The other trend we are witnessing is that all these tools are being fed from back office applications which are themselves evolving. These applications, which traditionally performed only simple tasks such as NAV calculations, are now increasingly offering basic performance and attribution. Across the private wealth and asset management industries, we are witnessing change on three fronts: there is a consolidation of attribution, performance and risk management teams; some performance vendors are now embracing risk and vice versa; and back office providers are now trying to nibble at the middle office cake by providing basic performance measurement and attribution functionality.

So where does that leave the middle office vendor?

First, in terms of risk management, there is a need for an immense amount of data, which is both complex to source and costly to compute. Risk management is being recognised as an integral part of the investment process, with an inherent need to understand how much risk a particular investment mandate or asset class is adding to a portfolio. As a result, the need for processes such as ex-ante risk calculations protects middle office vendors from back office providers.
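
To give a sense of why ex-ante risk is data-hungry: even the most basic calculation needs a full covariance matrix of the underlying assets, something a NAV-oriented back office system rarely maintains. A minimal sketch, using hypothetical weights and covariances:

import numpy as np

# Ex-ante portfolio volatility from position weights and an annualised
# covariance matrix of asset returns: sigma_p = sqrt(w' * Cov * w).
weights = np.array([0.50, 0.30, 0.20])            # portfolio weights (sum to 1)
cov = np.array([[0.0400, 0.0060, 0.0020],
                [0.0060, 0.0225, 0.0030],
                [0.0020, 0.0030, 0.0100]])

portfolio_variance = weights @ cov @ weights
portfolio_volatility = np.sqrt(portfolio_variance)
print(f"Ex-ante portfolio volatility: {portfolio_volatility:.2%}")

A real mandate involves thousands of positions, factor models and derivative exposures, which is precisely the data and computation burden that keeps this work with specialist middle office vendors.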

Second, until recently, investment managers usually purchased several different analytical applications from multiple vendors – perhaps in order to serve different asset classes or geographies. The objective for vendors was to convince clients to replace the incumbent application with their own.

All that has changed. With the new breed of application-agnostic platforms, different teams are able to connect with their own pre-computed data and the asset manager can connect with all the third-party applications.

For example, an asset manager may have one back office application in London, but funds in Ireland or Luxembourg that source data from another application. The asset manager will therefore have at least two or three different applications that are all able to compute performance. The technology now exists to take all of this pre-computed performance data and deliver it through a standard application to all middle and front office users. There is no need to modify the existing applications, which can remain where they are and be connected via a SaaS platform.
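
As an illustrative sketch (the field names and source systems are hypothetical), ‘connecting’ those applications largely means mapping each source's pre-computed figures into one standard record that the platform can deliver everywhere.

from dataclasses import dataclass

@dataclass
class PerformanceRecord:
    fund_id: str
    as_of_date: str
    period_return: float   # decimal, e.g. 0.0125 = 1.25%
    source_system: str

def from_london_backoffice(row: dict) -> PerformanceRecord:
    # Hypothetical source A reports returns in percent.
    return PerformanceRecord(row["fund"], row["date"], row["ret_pct"] / 100.0, "london_bo")

def from_luxembourg_ta(row: dict) -> PerformanceRecord:
    # Hypothetical source B already reports decimal returns.
    return PerformanceRecord(row["fund_code"], row["valuation_date"], row["return"], "lux_ta")

records = [
    from_london_backoffice({"fund": "UK-EQ-1", "date": "2014-06-30", "ret_pct": 1.25}),
    from_luxembourg_ta({"fund_code": "LUX-BD-7", "valuation_date": "2014-06-30", "return": 0.0042}),
]
for record in records:
    print(record)

The source applications keep computing performance exactly as they do today; only the mapping layer and the standard delivery application are new.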

The middle office vendor will survive, but single-engine applications will come under increasing pressure as structures within asset managers evolve and technology advances.
