Automation Failure in Capital Markets – Why We Need to Talk About Data Management

By James Maxfield, Head of Product at Duco.

‘I don’t understand what all these people do’ is a common refrain in capital markets as budgets are set for the next cycle and the C-team (CEO, CFO, COO) queries its cost base. This creates a cycle of justification from process owners – ops, risk, finance – who usually explain the cost away as regulatory compliance, volume increases, system limitations or new products, and often a combination of all of the above.

Typically, the C-team has tired of technology-led transformation programmes, scarred by failed replatforming or decommissioning initiatives that did not replicate the functionality required to support the business. Even where technology was implemented successfully, it often failed to deliver the outcomes needed to justify the investment and frequently led to increases (not decreases) in the number of people required to run mission-critical processes.

As a result, the game that plays out becomes a war of attrition, relying on one side making a mistake in its positioning or becoming so fatigued by the argument that it concedes. Budgets are set, then everyone moves on to figure out how to meet the obligations they have signed up to – the C-team playing whack-a-mole to find its cost-cutting prize somewhere else, while the process owners protect, or in many cases expand, their own turf.

The pandemic provided some distraction from this annual dance, with market volatility delivering a welcome boost to revenues and allowing leadership to consider growth agendas for the first time in almost a decade. It did not, however, remove the underlying issue that has persisted for some time: for many firms, the cost base was overweight for the size of the business.

Analysis by Eurogroup Consulting[1] shows that across its sample of 16 global corporate and investment banks (CIBs), cost bases grew by an average of 5.5% over the past two years. And with the economic outlook and recent earnings announcements making grim reading, leadership across the sector will need to sharpen its pencil and look for ways to mitigate the downside risk to its franchises.

But is sustainable cost reduction across the middle and back office a futile ambition, where good just looks like being less bad than everyone else? No – but it does need a different approach, one that recognises that chronically low levels of automation are not actually a technology issue but a data problem.

This issue manifests itself most visibly in the number of people deployed in front, middle and back office functions across the industry. Some of these people are performing value-adding activities – servicing customers, managing risk, building new products – but the majority are plugging gaps in the processing infrastructure. How do we know this? Look at the high-level numbers: 20,000 to 30,000 operations staff is the norm in a large global custodian, 8,000 to 10,000 in the average tier 1 or tier 2 CIB, and 3,000 to 5,000 is not uncommon across the entire middle and back office of a regional player. Not all of these people are adding value. Many are managing shared inboxes, stop-gapping processes and keeping the trade lifecycle moving by enriching and updating data throughout the process. Essentially, they are performing the role of the ‘human API’ in the process.

The human API is highly flexible, doesn’t require defined procedures, and is excellent at interpreting how different data models need to be merged to flow information through the infrastructure. It takes unstructured data out of emails, translates it and puts it where it needs to go. Human APIs are also highly skilled at extracting data from systems, applying subject matter expertise to it, and translating it into something sensible that is presented to senior management as management information (MI).
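
As a purely illustrative sketch of what one slice of this ‘human API’ work looks like when automated – the email format, field names and target schema below are hypothetical assumptions, not taken from any specific firm or system – the task often reduces to parsing a semi-structured message and mapping it into a downstream data model:

```python
import re
from dataclasses import dataclass

# Hypothetical downstream data model - the field names are illustrative only.
@dataclass
class SettlementInstruction:
    trade_id: str
    currency: str
    amount: float

def parse_email_body(body: str) -> SettlementInstruction:
    """Extract the fields a person would otherwise re-key from a free-text email.

    Assumes the counterparty email contains lines such as
    'Trade ID: ABC123', 'Currency: USD', 'Amount: 1,250,000.00'.
    """
    def find(label: str) -> str:
        match = re.search(rf"{label}\s*:\s*(.+)", body, flags=re.IGNORECASE)
        if not match:
            # This is the point where the human API would chase the sender by email.
            raise ValueError(f"Missing field: {label}")
        return match.group(1).strip()

    return SettlementInstruction(
        trade_id=find("Trade ID"),
        currency=find("Currency"),
        amount=float(find("Amount").replace(",", "")),
    )

if __name__ == "__main__":
    email = "Hi team,\nTrade ID: ABC123\nCurrency: USD\nAmount: 1,250,000.00\nThanks"
    print(parse_email_body(email))
```

The point of the sketch is not the parsing itself, but that none of it is possible until someone has agreed what the target data model is and who owns it – which is exactly where most automation attempts stall.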

Human APIs are hard to automate out of the process because they are extremely good at compensating for gaps in data or responding to bespoke requests, but over time this reliance becomes counter-productive – not just for the scalability and resilience of the organisation as a whole, but also for talent management and retention. The gloss of working for a well-known financial brand soon wears off when the daily grind is moving data from email into a spreadsheet and back again.

The Human API is a Data Management Problem

BCBS239 pushed a wave of good ideas around data management into the industry, but most firms reacted by treating it like any other compliance requirement – meet it quickly and move on. The principles brought focus to the concepts of data ownership (accountability for integrity), distribution (how data flows) and consumption (how data is used). Unfortunately, adoption of the principles varied wildly, with the CDO often lacking a mandate for enforcement, and a lot of value was left on the table.

Legacy behaviours, often driven by a lack of trust in internal data, perpetuated bad practice – inconsistent data models, duplicate sources of the same data, polluted data lakes – and limited automation opportunities. But the fundamental principles underlying the regulation made a lot of sense and still hold the key to unlocking much of the automation the industry so desperately needs.
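
To make the three principles concrete, a minimal sketch of what they imply in practice might look like the record below – the dataset, owner and consumer names are hypothetical assumptions used for illustration, not a prescribed catalogue format:

```python
from dataclasses import dataclass, field

# Illustrative metadata record capturing the three BCBS239-style concerns:
# who is accountable for the data, where it is mastered, and who consumes it.
@dataclass
class DatasetRecord:
    name: str                                            # e.g. "otc_trade_portfolio"
    owner: str                                           # accountability for integrity (ownership)
    golden_source: str                                   # where the data is mastered (distribution)
    consumers: list[str] = field(default_factory=list)   # who uses it (consumption)

catalog = [
    DatasetRecord(
        name="otc_trade_portfolio",
        owner="front-office-trading",
        golden_source="trade-capture-system",
        consumers=["operations", "risk", "finance"],
    ),
]

# A trivial check in this spirit: flag datasets that are consumed but have no
# accountable owner - the gap where duplicate copies and manual fixes tend to grow.
unowned = [d.name for d in catalog if d.consumers and not d.owner]
print(unowned or "every consumed dataset has an accountable owner")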

Here are a few thoughts on how to use data frameworks to drive your operating model transformation forward:

Focus on the data that drives the process: As outlined earlier, a significant proportion of the cost associated with human APIs is purely data driven. Redefining the operating model for, say, margin call management around the data that needs to drive the process will produce a very different outcome from simply buying a new margin call platform. It will crystallise what data is required (trade portfolios, CSA data), where it resides (front office systems) and who is responsible for its integrity (front office? middle office?) – a simple illustration of this data-first view follows after this list. Once this discussion is opened up, process owners move away from what they do now, such as manipulating data or loading trades, and think about what data they need to optimise their operating models, not what technology they need to automate their problem.

Think front to back: Very few processes are contained entirely within one function, meaning processes or customer journeys must be thought about from front to back. While this requires focus behind a common goal and mobilisation across different teams, a data-driven approach can very quickly identify challenges and opportunities that traditional automation agendas miss.

This holistic focus on data is a different way of thinking about solving what look like bad processes, and it can prove insightful when requirements are brought together. A common finding is that different user groups – risk, finance, operations – have the same data-driven requirements but satisfy them with their own internal solutions, not only adding cost and reducing business agility but also forcing integrity checks across duplicate sources of the same data.

Let technology enable the solution – it is not the solution in itself: Many organisations have failed to solve manually intensive processes by ignoring the problem and simply buying technology (‘we have the licences now, please find problems to solve’). By being clear about the data required to drive the target operating model, the ownership model that enables that data to be consumed and trusted, and the way it is distributed throughout the process, you can often greatly simplify the solution required. This data-centric approach also opens up opportunities for a broader set of solution providers, with third parties or managed service providers deployed on or around the required data sets.

Be practical, pragmatic and outcome-focused: Shifting mindsets towards a data-centric approach to problem solving needs pragmatism and success stories to gain momentum. Tying transformation to business outcomes, such as clean, verified data on T+1 for OTC portfolios, challenges the traditional, application-centric approaches to innovation that persist in many organisations. Keeping a narrow focus on specific problems linked to business outcomes is key to creating that momentum.
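
As promised above, here is a minimal sketch of what a data-first, outcome-focused definition might look like for the margin call example – the fields, owners and rules are assumptions made purely for illustration, not a prescribed industry model. The outcome ‘clean, verified data on T+1’ is expressed as an explicit check on the data that drives the process, rather than as a feature of any particular platform:

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical, minimal data contract for margin call management.
@dataclass
class MarginCallRecord:
    portfolio_id: str      # trade portfolio, mastered in front-office systems
    csa_threshold: float   # CSA terms, owned (say) by legal/onboarding
    exposure: float        # current exposure, owned by risk
    valuation_date: date   # the business date the data describes

def verified_on_t_plus_1(records: list[MarginCallRecord], business_date: date) -> list[str]:
    """Return the portfolios that fail a simple 'clean data on T+1' outcome test.

    The rules are deliberately basic: every record must describe the expected
    business date and carry the fields the process needs to run without manual repair.
    """
    failures = []
    for r in records:
        if r.valuation_date != business_date or not r.portfolio_id or r.csa_threshold < 0:
            failures.append(r.portfolio_id or "<missing id>")
    return failures

records = [
    MarginCallRecord("PORT-001", csa_threshold=250_000.0,
                     exposure=1_900_000.0, valuation_date=date(2022, 11, 1)),
]
print(verified_on_t_plus_1(records, business_date=date(2022, 11, 1)) or "data verified")
```

Whatever tooling eventually runs a check like this, the value comes from agreeing the contract itself: what data the process needs, where it is mastered and who is accountable when it fails the test.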

By adopting this data-centric approach to target operating model design, organisations will ultimately treat data, rather than applications, as the cornerstone of their transformation agendas. Applications and services will come and go, but data will be the one constant the organisation designs its operating models around. Those that can do this successfully will gain a significant competitive advantage over their peers, because they will unlock the capacity currently tied up in the large numbers of human APIs across the processing stack. Whether that capacity is redeployed for value enhancement or cost reduction, it will ultimately unlock opportunity for the whole C-team.

[1] European Corporate & Investment Banking Outlook 2022, Eurogroup Consulting.
