
Capitalising on Data Digitalisation in the Cloud


By Chris Brook, Head of Platform Engineering and Architecture, and Co-Founder of FINBOURNE Technology.

Global capital markets are entering a rapidly shifting phase, with disruption par for the course. Heightened volatility, inflation uncertainty, and geopolitical disruption are dominating our headlines. As a result, organisations across investment management, banking and market infrastructure are balancing on a tightrope of cost optimisation as they continue to deliver value to their clients.

With profit margins under pressure, a tightening fee squeeze and the threat of drawdowns, efficiency is once again firmly in the spotlight, and harnessing trusted and timely investment data across positions, portfolios, risk and exposure is proving more critical than ever. However, the cost of operating at this faster pace is hard to sustain. At this juncture, the industry must find a new way to bridge the gap between its current architecture and the future state it desires.

Consumer technology has made rapid advances, changing the way we live, but the financial services industry has been far slower to adopt technology and transform itself. While the mainframe, in its various evolutions, has served global capital markets for over half a century, it’s important to understand that it was only ever intended to answer specific questions, at a specific point in time.

Today’s challenges

Fast forward to today and we live in a world of big data, including complex and nuanced data sets from areas such as private markets and digital assets. Financial services organisations are struggling to trust or understand their data because the current infrastructure can no longer keep up. This means they continue to struggle with data silos, duplicated data and broken processes, making it difficult to respond promptly to market events, to the needs of data-hungry investors, and to increasing scrutiny from regulators.

Understanding performance, attribution and exposure promptly and confidently has also relied on far too many manual workarounds. What organisations are now looking for is the flexibility to innovate and adapt in the long term, and to communicate data the way their clients want in the short term.

This is also a pressing concern for ESG reporting. While attempts have been made to standardise it, for example the European ESG Template (EET) and the recommendations of the Task Force on Climate-related Financial Disclosures (TCFD), there is still a lack of overall consensus. This has left teams with the arduous and manual task of collecting data for regulatory reporting.

Opportunities

The biggest opportunity for capital markets is to address data at the organisational level, in order to eliminate the operational, reputational and human cost of inadequate data processes. What we have found in our market engagement is that front-to-back systems, data lakes, warehouses and even meshes only prove their value after complicated and time-consuming migrations. Even then, there is often a gap between the technology and the business need, because the project has focused on chasing an elusive ‘golden source’ of data rather than on how to translate and interpret across disparate data sets, or confidently reproduce data for client and regulatory reporting.

With the inherent data problems of trust and timeliness unresolved, the door is open for SaaS technologies to identify the barriers and break them down. One solution here is a cloud-native, interoperable data store, which opens up existing closed systems rather than adding more into the mix. It acts as a trusted data fabric that makes sense of the technology investment made to date and solves the final piece of the puzzle: understanding and deriving value from the data in your organisation.

Value from data

When we talk to capital markets organisations, the core challenges we hear about commonly concern data discrepancies, which stem from the multitude of formats and languages used across the financial services ecosystem. Here, organisations need to own and control their data if they are to unlock value from it.
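To make the format problem concrete, below is a minimal sketch of translating two hypothetical feeds into one canonical record. The feed shapes, field names and identifiers are illustrative assumptions, not any particular vendor’s schema; the point is that once both sides speak the same model, comparison becomes trivial.

```python
# A minimal sketch: the same position described by two hypothetical feeds,
# normalised into one canonical record. All field names are assumptions.
from dataclasses import dataclass

@dataclass
class Position:
    instrument_id: str   # ISIN used as the common identifier
    quantity: float
    market_value: float
    currency: str

def from_custodian_csv(row: dict) -> Position:
    """Custodian feed: a flat CSV row with its own column names."""
    return Position(
        instrument_id=row["ISIN"],
        quantity=float(row["UNITS"]),
        market_value=float(row["MKT_VAL"]),
        currency=row["CCY"],
    )

def from_oms_json(record: dict) -> Position:
    """Order-management feed: nested JSON with different conventions."""
    pos = record["position"]
    return Position(
        instrument_id=record["identifiers"]["isin"],
        quantity=pos["net_quantity"],
        market_value=pos["value"]["amount"],
        currency=pos["value"]["ccy"],
    )

custodian_row = {"ISIN": "GB00B03MLX29", "UNITS": "1500", "MKT_VAL": "41250.00", "CCY": "GBP"}
oms_record = {
    "identifiers": {"isin": "GB00B03MLX29"},
    "position": {"net_quantity": 1500.0, "value": {"amount": 41250.0, "ccy": "GBP"}},
}

# Once normalised, the two views of the position can be compared directly.
assert from_custodian_csv(custodian_row) == from_oms_json(oms_record)
```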

This necessitates a data foundation that can deliver safer, better and faster data, and will secure the insights needed to empower teams, boost performance and add value for clients. Achieving this, through safe operational change, is what will move organisations from operating at cost to driving value and operating at a profit once again.

However, the reality to date is that capital markets options have been largely limited to best-of-breed solutions, outsourcing, or big-bang, front-to-back, multi-year migration projects. Over the years, this cycle has led to severe infrastructure complexity and, in many cases, a single-vendor dependency that is now constraining competitive growth and innovation. These traditional models have also contributed to a misconception that operational change is more costly than it is value-generating.

Interestingly, what organisations don’t readily recognise is that the rigid data models mandated by single-platform vendors often compound the issues they face across a number of workflows, from performance, attribution and exposure calculation to accounting and reporting.

Reaching the desired future-state operating model is absolutely what capital markets should be thinking about, but organisations should do it their way, in the language and format they choose to use. This means being able to migrate between data models, so organisations have the optionality to understand what they own and how much it is worth, at any given point in time, the way they want to see it.
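A minimal sketch of what such a point-in-time view can look like, assuming positions are kept as dated events and an as-at query simply replays events up to the chosen date. The event shape and figures are illustrative assumptions, not a description of any specific platform.

```python
# A minimal sketch of an as-at holdings query over dated position events.
# Event shape and numbers are illustrative assumptions.
from datetime import date

events = [
    {"date": date(2023, 1, 10), "instrument": "GB00B03MLX29", "qty_change": 1000},
    {"date": date(2023, 2, 14), "instrument": "GB00B03MLX29", "qty_change": 500},
    {"date": date(2023, 3, 3),  "instrument": "GB00B03MLX29", "qty_change": -200},
]

def holdings_as_at(as_at: date) -> dict:
    """Net quantity per instrument, counting only events on or before as_at."""
    book: dict = {}
    for event in sorted(events, key=lambda e: e["date"]):
        if event["date"] <= as_at:
            book[event["instrument"]] = book.get(event["instrument"], 0) + event["qty_change"]
    return book

print(holdings_as_at(date(2023, 2, 28)))  # {'GB00B03MLX29': 1500}
print(holdings_as_at(date(2023, 3, 31)))  # {'GB00B03MLX29': 1300}
```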

This has significant implications for mission-critical workflows, from the historical analysis of ESG data, to creating proprietary insights, to eliminating inefficiencies, so organisations can interpret and speak the same language as the entities in their ecosystem. It also removes the need for manual daily adjustments to confidently strike the same shadow NAV as your custodian.
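To illustrate the shadow-NAV point, here is a minimal reconciliation sketch: an internally calculated NAV is checked against the custodian’s NAV at a stated tolerance, so breaks are flagged automatically rather than absorbed by manual daily adjustments. The tolerance and figures are illustrative assumptions.

```python
# A minimal shadow-NAV reconciliation sketch. Tolerance and NAV figures
# are illustrative assumptions, not recommended operational settings.
TOLERANCE_BPS = 1.0  # acceptable difference, in basis points of the custodian NAV

def nav_break(internal_nav: float, custodian_nav: float):
    """Return the difference in basis points if it breaches tolerance, else None."""
    diff_bps = abs(internal_nav - custodian_nav) / custodian_nav * 10_000
    return diff_bps if diff_bps > TOLERANCE_BPS else None

print(nav_break(100_005_000.0, 100_000_000.0))  # 0.5bp difference -> None (within tolerance)
print(nav_break(100_030_000.0, 100_000_000.0))  # 3.0bp difference -> 3.0 (flagged as a break)
```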

Innovation

The outcomes we hear about from market engagement, whether it’s greater granularity and transparency across public and private assets, regaining operational control to reduce cost and risk, or liberating employees so their talents and skills can be used to support business growth, all start with getting the data right.

However, tackling the fundamental problem of understanding, accessing and controlling data requires an end to big-bang, revolutionary thinking. Instead, we need to take an evolutionary approach that is empathetic to the fact that organisations are not operating from a blank slate.

A SaaS-native path to technology innovation enables organisations to safely focus on the core of their business, rather than the distraction of a multi-year migration. It means starting small with mission critical areas, realising the value quickly, and building the business case for further change. It’s operational change on your terms, and it will drive the process of improving efficiency and transparency in capital markets.

This is not about creating another monolithic data store, or falling back on traditional constructs such as front-to-back change and buy versus build. By opening up closed systems, the complexity within the existing operational estate can be addressed in line with the industry’s low appetite for risk. Effectively, a SaaS-native and interoperable approach will enable the industry to adapt to evolving markets by building a bridge to external innovation and emerging technologies, while safely controlling the decommissioning and upgrading of technology.

Practicalities

The echoing themes around multiple sources of truth, the difficulty of achieving high-quality data, and how to interpret and gain value from complex data sets resonate strongly with our mission to transition global financial services out of the past and make the industry future-fit. To do this in the quickest possible time, and without ripping everything out and starting again, there are some first steps organisations need to take to become change-ready.

First, leveraging clean, interoperable data should be a business priority. To address data-led change and make the transition from cost centre to value generator, operations leaders and CDOs need to start asking the uncomfortable questions about the costs they bear and the capabilities they lack, not at the operational level but at the business level, with senior leadership buy-in.

While these are technology decisions, there are humans behind them, and inaction is one of the biggest issues we face when it comes to operational change. We recently asked participants at an industry workshop what kind of risk appetite they had for operational change: only 12% were willing to take on high-risk change. As providers, we need to address this low appetite for risk if we are to successfully move the industry on from mainframe inefficiency and make it future-ready.

This is why a SaaS, interoperable data store that can de-risk the process of change and empower organisations with instantaneous access to trusted data will break through the traditional cycle and future-proof the industry, reducing operational, reputational and human cost. Critically, by improving access to, control of and understanding of data, the industry stands the best chance of freeing its talent from burnout. Ultimately, regaining motivation and moving the focus onto high-value tasks is what will generate the most value for clients and drive the business forward.

