The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

The Composable Enterprise and Desktop Integration

By James Wooster, COO, Glue42.

The starting point for this piece was a statistic buried within a report on the UK Contact Centre market[1]. It stated, on page 138, that across all industries, including Capital Markets, the cost of users navigating within and between applications was estimated to be in excess of £4.3 billion ($5.7 billion) per annum.

That’s a huge cost – and one which was measured purely in terms of time spent performing manual operations within the user’s workflow. Interestingly, this figure did not include the opportunity cost nor the consequential damage of missing the market, losing alpha or errant trades. In short, £4.3 billion is probably on the low side in terms of potential cost savings and improved outcomes – and it only applies to the UK!

I then came across an article by Dishang Patel, a consultant at Leading Point Financial Markets, in which he discussed the importance of the Composable Enterprise and how it could lead to improved business operations in the front-office. It seems, then, that the idea of composing – and perhaps streamlining – applications to support user processes would be a critical part of the solution.

The problem, as many organisations already know, is that the concepts of the Composable Enterprise are deeply rooted in the data centre and apply to back-end services and server-side applications. What then for the end-users – the buy-side traders, sell-side traders, portfolio managers, research analysts and so on? Who is looking after their needs?

Thankfully, the financial services industry, including the majority of tier-1 banks, is beginning to adopt a new approach to solving this challenge. In today’s parlance, this is about the use of Desktop Integration Platforms (DIPs) – to deliver an enterprise-specific application store in which applications can be discovered and dynamically integrated, both at the UI and data level. In many senses, this is an evolution of the ‘interop’ movement that began in the early 2010s – in which the ability to broker data between web applications has been extended to any kind of application (web or otherwise) and where the UI elements are themselves first-class citizens and the preferred means of composability.
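To make the data-brokering idea concrete, the sketch below shows the kind of context sharing a DIP provides: one application broadcasts a shared context (here, the instrument currently in focus) and any other application subscribed to that context type is notified. This is a minimal, hypothetical illustration in the spirit of standards such as FDC3 – the `ContextBus` class and its method names are invented for this example, not a real platform API.

```typescript
// Hypothetical sketch of DIP-style data brokering between desktop apps.
type Context = { type: string; payload: Record<string, unknown> };
type Listener = (ctx: Context) => void;

class ContextBus {
  private listeners = new Map<string, Listener[]>();

  // An app subscribes to a context type (e.g. "instrument").
  subscribe(type: string, listener: Listener): void {
    const existing = this.listeners.get(type) ?? [];
    this.listeners.set(type, [...existing, listener]);
  }

  // An app broadcasts a context; every subscriber to that type is notified.
  broadcast(ctx: Context): void {
    for (const l of this.listeners.get(ctx.type) ?? []) l(ctx);
  }
}

// Usage: a watchlist broadcasts the selected ticker; a chart app reacts.
const bus = new ContextBus();
const received: string[] = [];
bus.subscribe("instrument", (ctx) => received.push(String(ctx.payload.ticker)));
bus.broadcast({ type: "instrument", payload: { ticker: "VOD.L" } });
console.log(received); // [ 'VOD.L' ]
```

The point is that neither application knows about the other – both depend only on the shared context contract, which is what lets UIs be composed and recombined without hard-wired integrations.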

A DIP is also a platform upon which new applications are built – with the necessary elements of discoverability, data exchange, UI orchestration and behaviour monitoring built in from the outset. This requires a new mindset for CIOs and IT leaders, in which some important lessons need to be learned:

1. Ownership: No single application development team (or vendor) owns the entire desktop.

This may sound obvious – but appreciation of this fact will fundamentally change the approach to usage of screen real-estate, data sharing and the modularity/extensibility of the application components themselves.

2. Monoliths: Large, singular apps are only a problem when they can’t be integrated easily with other apps.

A subject of many fierce debates. Sure, legacy apps may be built with older technology, but that doesn’t mean they are redundant or beyond their shelf-life. If these applications can be decomposed, as Dishang says, into “micro UIs” and then assembled to form new workflows, then the cost/benefit argument for replacing them looks very different – and far less compelling.

3. Agnosticism: Technology stacks will constantly change and therefore an integration framework needs to be programming language agnostic.

Choosing a DIP requires a sense of time and place. The languages popular today – top-most on CVs – will likely lose their ranking within a few short years. All languages must be treated equally, and all integration functions (UI or data) should work, as far as possible, across them all. On the plus side, this means enterprises have the flexibility to use whatever technology they like, comfortable in the knowledge that it can always be integrated.
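In practice, language agnosticism usually means the integration contract is a plain serialized message rather than a language-specific API: any stack that can parse the wire format can join the workflow. The sketch below illustrates this with an assumed JSON envelope – the `Envelope` shape and field names are hypothetical, chosen only to show the pattern.

```typescript
// Hypothetical wire contract: a plain JSON envelope any language can parse.
interface Envelope {
  app: string;                        // sender identity
  intent: string;                     // e.g. "ViewChart"
  context: Record<string, unknown>;   // shared data context
}

function encode(e: Envelope): string {
  return JSON.stringify(e);
}

function decode(wire: string): Envelope {
  const e = JSON.parse(wire) as Envelope;
  if (!e.app || !e.intent) throw new Error("malformed envelope");
  return e;
}

// A Java, Python or C++ app on the other side only needs a JSON parser.
const wire = encode({ app: "blotter", intent: "ViewChart", context: { ticker: "VOD.L" } });
const msg = decode(wire);
console.log(msg.intent); // ViewChart
```

Because the contract lives in the message, not in a shared library, an enterprise can swap out any participating application’s technology stack without breaking its neighbours.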

4. Automation: You can have too much of a good thing. For example, RPA is fine to use for repetitive and stable processes – but it is a recipe for disaster where desktop procedures are ungoverned or where applications constantly change.

In the same report mentioned above, this time on page 126, the authors state that almost half of the organisations surveyed had implemented an RPA solution only to remove it later. This is a direct consequence of integrating too far. Often, the end user should be allowed to determine their own paths, and the integration platform should blur the boundaries between applications so that they all share the same data context and understanding of the process. This is exactly the rationale for DIPs – they re-use existing pieces of UI to let the user explore data sources and take decisions unencumbered by artificial process restrictions.

5. Methodology: Applications and their UIs must not be built in isolation. They need to be aware of common (cross-application) services and hence avoid duplicating functionality and confusing the end-user.

This is a lesson that many enterprises learn a little too late – even if they have adopted a Desktop Integration Platform. A DIP must prescribe a design, implementation and in-life methodology to ensure consistent and intuitive workflows.

In summary, the Composable Enterprise is a worthy and compelling approach. That said, the true benefits of component re-use and interoperability can only be achieved when these principles are applied to front-end and back-end alike.

Organisations looking to improve on earlier integration programmes should start by focusing on the end-user and their needs – and try to recover some of the billions that are currently being wasted.

[1] ContactBabel – The UK Contact Centre Decision-Makers’ Guide (18th edition – 2020-21)
