Moving from Legacy Systems to an Interoperable Future – Why Innovation is No Longer Big Bang

In this Q&A, Stephen Collie, head of sales engineering at FINBOURNE Technology, discusses the core data management challenges paralysing the buy-side today, and explores the critical role of a Modern Financial Data Stack and incremental innovation in the transition to an interoperable future.

Q: Stephen, given FINBOURNE’s work with organisations across the global investment community, can you paint us a picture of the current landscape as far as data challenges go?

It is clear to us that asset managers are at an inflection point. In a market with heightened volatility, inflation uncertainty and geopolitical disruption, harnessing trusted and timely investment data across positions, portfolios, risk and exposure is proving more critical than ever, even as firms walk the tightrope of cost optimisation and transparency.

A spaghetti of legacy systems and architectures, duplicated interfaces and an accumulation of separate systems has left asset managers struggling to meet the needs of a new data-hungry generation of investors. With the threat of new regulations bearing down and continued scrutiny over fees, the cost of operating at this new level of transparency – within the current status quo – is now too high to sustain.

Asset managers now have three options: scale up through M&A, diversify and innovate, or lower the cost of operating before it is too late. In our view, the first two have been in motion for some time. The third, we believe, is the one many buy-side firms have stalled on, viewing it as a luxury rather than a necessity.

Q: What market factors would you say are driving the necessity to innovate investment operations and data processes?

First, where cash was once king, it is now transparency. Asset managers can slash fees, but this will mean nothing if investor and regulatory demand for greater transparency cannot be met. Driving this is the rise of the activist investor and the democratisation of investing – namely direct indexing.

While performance is of course key, we are seeing a new generation of value-driven investors who want more granularity on their investments and are not afraid to take on their managers to get it. This requires firms to recognise and respond to the evolving investor profile quickly.

Second, supporting the fast-growing interest in sustainable investing and private markets assets is bringing many firms that rely on legacy tech to breaking point. With complex and unstructured data sets, private markets assets are proving tricky at best and impossible at worst. Firms must redefine the way their systems handle and understand these assets in order to capitalise on this surge of interest.

Third, delivering more value and more data now comes with very real consequences, whether that is drawdowns or regulatory fines. This adds more cost and operational stress, making it particularly important to have a data foundation that understands and covers both public and private markets.

We know storing more data isn’t the answer. In fact, the industry has focused on hoarding data for too long and has now come to the jarring realisation that the capabilities for making disparate data sets integrated, consumable and meaningful are missing. Much of this comes down to the impact of technical debt and a lack of appetite for innovation; in other words, inertia.

In short, technical debt has left firms with data silos and inefficiencies, such as duplicate purchasing of market and reference data. It has slowed their response, and their change management, in the face of regulation and market opportunities. And it has made it difficult for asset managers to meet investor due diligence and regulatory reporting requirements. More than anything, it has stopped firms from adopting and leveraging the emerging technologies needed to support future growth.

Q: For many years the industry has focused on consolidation to tackle technical debt. Do you see the challenge differently?

Consolidation is certainly part of the puzzle, but we see the final and most critical piece to be the translation and interpretation of data across systems, so that different functions and entities in the investment ecosystem (from the EMS and PMS, to the fund administrator and custodian) can speak the same language. Effectively a Rosetta Stone for asset management.

Once this fundamental piece is in place, firms can understand their data consistently, derive value and meaning from it and empower multiple functions – from decision-making to analytics. Importantly, the resilience it creates eliminates inefficiencies across the investment ecosystem, for example allowing firms to avoid the need to make daily adjustments to strike the same shadow NAV as their custodian.
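
To make the ‘Rosetta Stone’ idea a little more concrete, the sketch below shows one way a translation layer might map position records from two different systems into a single common shape and then flag breaks against the custodian’s figures. It is purely illustrative: the field names, adapters and tolerance are hypothetical assumptions, not FINBOURNE’s actual data model or API.

```python
# Illustrative translation layer: records from different systems are mapped
# into one common position model so they can be compared directly.
# All field names and adapters here are hypothetical.
from dataclasses import dataclass


@dataclass(frozen=True)
class Position:
    instrument_id: str   # common identifier, e.g. an ISIN
    quantity: float
    market_value: float
    currency: str


def from_pms(record: dict) -> Position:
    """Translate a (hypothetical) portfolio management system record."""
    return Position(
        instrument_id=record["isin"],
        quantity=float(record["qty"]),
        market_value=float(record["mv_local"]),
        currency=record["ccy"],
    )


def from_custodian(record: dict) -> Position:
    """Translate a (hypothetical) custodian file row."""
    return Position(
        instrument_id=record["SecurityISIN"],
        quantity=float(record["HoldingUnits"]),
        market_value=float(record["MarketValue"]),
        currency=record["Currency"],
    )


def reconcile(internal: list[Position], custodian: list[Position],
              tolerance: float = 0.01) -> list[str]:
    """Flag positions whose market values disagree beyond a tolerance --
    the kind of break that otherwise forces manual shadow-NAV adjustments."""
    cust_by_id = {p.instrument_id: p for p in custodian}
    breaks = []
    for pos in internal:
        other = cust_by_id.get(pos.instrument_id)
        if other is None or abs(pos.market_value - other.market_value) > tolerance:
            breaks.append(pos.instrument_id)
    return breaks
```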

This all starts with owning and controlling the data to minimise operational cost and risk. The market drivers presently in force make this an opportune time to address technology innovation, breaking free from the cycle of best-in-class systems, outsourced solutions and sticking plasters, and moving towards what we call an interoperable Modern Financial Data Stack. It’s a model we feel can solve the very real data challenges that firms are facing today, while also shaping the future of global investment management.

It is important to state, though, that innovation isn’t just about automation. It’s about enablement of your greatest asset – your people. SaaS technology can be an enabler too, not a replacement for talent. I recently sat on an industry panel where I suggested that firms seldom hire super-skilled data scientists with the intention that they spend their day cleansing and scrubbing data. Getting these highly trained and talented individuals back to what they love doing, and what adds the most value to the business, is the kind of innovation everyone should be pursuing.

Q: In the past, innovating has often been seen as ‘big bang’ change. Is this still the case? Where do firms start, and how can they minimise operational risk?

The approach we have taken counters big bang change, offering a strong and viable alternative to the multi-year transformation projects that have occupied the industry for the past few decades. Nevertheless, the confusion for firms is often where to start. What does innovation look like? The simplest way to see it is that you don’t innovate by focusing on innovation; it is an incremental objective based on your business goals. It is the outcome of all the small changes you make as an organisation – for example, addressing multiple IBORs, eliminating data silos and tackling large volumes of complex data sets – which, when combined, achieve agility, efficiency and growth.

Having experienced first-hand the pain that technical debt and inertia can cause, my colleagues and I set down a path to address the data dilemma across the investment chain, while lowering the risk of operational change. Challenging traditional constructs, we designed a distinct new approach, in the form of a SaaS-powered, cloud-native, interoperable data store.

We have met the low tolerance for risk by systematically bridging the gap between existing architecture and what we believe to be the Modern Financial Data Stack – supporting the very real need to translate between data formats and derive value and meaning from data. By providing an interoperable foundation, asset managers can easily plug-and-play their components together and connect to emerging technologies, such as AI and ML, to create meaningful analytics.

Q: How do firms evaluate the success of transformation and what pitfalls should they be aware of?

Successful transformation projects leverage cloud technologies to analyse and understand data that can be joined together seamlessly. It is about achieving a solid data foundation, underpinned by domain knowledge, that enables data to be interpreted and translated, all while putting in place the right control and entitlements framework to make data securely accessible across the organisation.

Unlike incumbent providers, who come at the problem from a workflow or functionality-first perspective (and find they have to address data issues later, within standalone data management offerings), we felt it was time to flip the model and concentrate on making the data accurate, timely and trusted. Building the data fabric from the bottom up allows organisational data to feed seamlessly and in real time into the investment capabilities built on top, such as portfolio management, order management and accounting.

Similarly, while rigid front-to-back systems mandate a data model, SaaS technology offers composability – meaning you no longer have to change your business model to fit the technology or translate your existing models into something the system understands.
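
As a rough illustration of that composability point, the sketch below shows an extensible record that keeps a small agreed core and lets a firm attach its own properties, rather than forcing the firm’s model into a fixed vendor schema. The property keys and structure are hypothetical, chosen only to make the idea tangible.

```python
# Illustrative sketch only: an extensible instrument record carries
# firm-specific properties alongside a small core schema, so the firm's own
# model does not have to be reshaped to fit the technology.
from dataclasses import dataclass, field
from typing import Any


@dataclass
class Instrument:
    identifier: str                                            # core field
    name: str                                                  # core field
    properties: dict[str, Any] = field(default_factory=dict)   # firm-defined extensions


# A private-markets asset keeps its own attributes without a schema change
# (property keys below are invented for illustration):
holding = Instrument(
    identifier="FUND-2041",
    name="Example Infrastructure Fund III",
    properties={
        "Firm/Sustainability/EUTaxonomyAligned": True,
        "Firm/PrivateMarkets/CommitmentCcy": "EUR",
        "Firm/PrivateMarkets/UnfundedCommitment": 12_500_000,
    },
)
```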

Taking an API-first approach tackles the lack of timeliness and trust, helping to reproduce data with complete accuracy and confidence every time. But it is important to understand that APIs are not a complete strategy in themselves and that connectivity is only a starting point. I see a lot of firms getting hung up on having an API strategy, which is a good start, but without the right control environment, entitlements protocol and domain knowledge, unlocking the potential of your data across your teams and functions will be difficult.
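
The point about pairing APIs with a control environment can be sketched in a few lines: below, a hypothetical data endpoint refuses to return anything unless the caller’s role carries the right entitlement. The roles, scopes and endpoint are invented for illustration and do not describe any particular platform.

```python
# Minimal sketch: API connectivity alone is not enough -- access to data is
# gated by an entitlements check before anything is returned.
# The policy store, roles and endpoint names are hypothetical.
from typing import Callable

ENTITLEMENTS = {
    "analyst":    {"read:positions"},
    "operations": {"read:positions", "write:adjustments"},
}


def require(scope: str):
    """Decorator that refuses to serve data unless the caller's role grants the scope."""
    def wrap(handler: Callable):
        def guarded(role: str, *args, **kwargs):
            if scope not in ENTITLEMENTS.get(role, set()):
                raise PermissionError(f"role '{role}' lacks '{scope}'")
            return handler(*args, **kwargs)
        return guarded
    return wrap


@require("read:positions")
def get_positions(portfolio_code: str) -> list[dict]:
    # In practice this would call a positions API; stubbed here for illustration.
    return [{"portfolio": portfolio_code, "instrument": "EXAMPLE-ISIN", "qty": 1000}]


# An analyst can read positions; an unknown role raises PermissionError.
print(get_positions("analyst", "GLOBAL-EQUITY"))
```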

Q: Finally, it feels like legacy systems and subsequent data inefficiencies have been the topic du jour for over a decade now, will we still be talking about this in 2032?

This is true! As I said at the start, the industry is at an inflection point and I believe firms are now recognising the steps needed to manage data chaos and reach the nirvana state needed to survive today and thrive tomorrow. It is certainly encouraging to hear from prospects we speak to that there is a desire to promptly address challenges like multiple data models, the difficulty of achieving high-quality data, and how to gain value from complex data sets.

Given the market drivers in play, firms are now putting operational issues on the business agenda, whether that is meeting investor- and regulatory-led demands for transparency with greater granularity across public and private assets, or demonstrating more value to investors through a digitalised client experience. More recently, stirred by financial and geopolitical shocks, firms are recognising the need for agility not just at the operational level, but at the business level too.

All of this is driving operational change further up the agenda and making it more visible.

Our view is that an interoperable SaaS investment data platform has the flexibility to achieve these priorities most efficiently and in the shortest time. It provides a trusted data fabric for investment operations, enabling firms to regain control of operational efficiency and move from operating at cost to operating at profit again. It also delivers optionality, so firms are free to choose which services to build, buy or outsource, without having to maintain the underpinning infrastructure.

However, starting the transition and moving towards an interoperable future requires firms to move on from a single-source or front-to-back stack in order to enable the transparency and flexibility required. Single-vendor dependency aside, one platform can never achieve the same level of innovation as that collectively available in the market, or complete the final piece of the puzzle – translating and gaining true value from investment data.

In the end, while lowering cost is imperative, innovation has a positive human impact on firms – delivering the means to empower employees and win the trust of clients. You can put a price on cost savings, but trust…well that is priceless.
