
The Composable Enterprise and Desktop Integration

By James Wooster, COO, Glue42.

The starting point for this piece was a statistic buried within a report on the UK Contact Centre market[1]. It stated, on page 138, that across all industries, including Capital Markets, the cost of users navigating within and between applications was estimated to be in excess of £4.3 billion ($5.7 billion) per annum.

That’s a huge cost – and one which was measured purely in terms of time spent performing manual operations within the user’s workflow. Interestingly, this figure did not include the opportunity cost or the consequential damage of missing the market, losing alpha or errant trades. In short, £4.3B is probably on the low side in terms of potential cost savings and improved outcomes, and this, of course, only applies to the UK!

I then came across an article by Dishang Patel, a consultant at Leading Point Financial Markets, in which he discussed the importance of the Composable Enterprise and how it could lead to improved business operations in the front-office. It seems, then, that the idea of composing, and perhaps streamlining, applications to support user processes would be a critical part of the solution.

The problem, as many organisations already know, is that the concepts of the Composable Enterprise are deeply rooted in the data-centre and apply to back-end services and server-side applications. What, then, of the end-users: the buy-side traders, sell-side traders, portfolio managers, research analysts and so on? Who is looking after their needs?

Thankfully, the financial services industry, including the majority of tier-1 banks, is beginning to adopt a new approach to solving this challenge. In today’s parlance, this means using Desktop Integration Platforms (DIPs) to deliver an enterprise-specific application store in which applications can be discovered and dynamically integrated, both at the UI and the data level. In many senses, this is an evolution of the ‘interop’ movement that began in the early 2010s, in which the ability to broker data between web applications has been extended to any kind of application (web or otherwise) and where UI elements are themselves first-class citizens and the preferred means of composability.
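To make this concrete, the sketch below shows, in TypeScript, what brokering data between desktop applications can look like. It borrows the shape of the FDC3 API (the FINOS desktop-interop standard that platforms such as Glue42 implement); the blotter and chart roles, the function names and the ticker are illustrative assumptions rather than anything prescribed by a particular DIP.

```typescript
// A minimal sketch of UI-level interop in the FDC3 style. The `window.fdc3`
// surface is declared here only to keep the sketch self-contained.

interface InstrumentContext {
  type: "fdc3.instrument";
  id: { ticker: string };
}

interface DesktopAgent {
  broadcast(ctx: InstrumentContext): Promise<void>;
  addContextListener(
    contextType: string | null,
    handler: (ctx: InstrumentContext) => void
  ): Promise<{ unsubscribe(): void }>;
}

declare global {
  interface Window { fdc3: DesktopAgent; }
}

// Blotter app: when the user selects a row, share the instrument with
// every other app on the user's channel instead of forcing a re-key.
export async function onRowSelected(ticker: string): Promise<void> {
  await window.fdc3.broadcast({ type: "fdc3.instrument", id: { ticker } });
}

// Chart app: follow whichever instrument the user is working on.
export async function wireChartToChannel(
  loadChart: (ticker: string) => void
): Promise<void> {
  await window.fdc3.addContextListener("fdc3.instrument", (ctx) =>
    loadChart(ctx.id.ticker)
  );
}
```

The value is in the decoupling: the blotter knows nothing about the chart, yet the user’s selection flows between them without any re-keying.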

A DIP is also a platform upon which new applications are built, with the necessary elements of discoverability, data exchange, UI orchestration and behaviour monitoring built in from the outset. This requires a new mindset for CIOs and IT leaders, in which some important lessons need to be learned:

1. Ownership: No single application development team (or vendor) owns the entire desktop.

This may sound obvious – but appreciating this fact will fundamentally change the approach to the use of screen real-estate, data sharing and the modularity/extensibility of the application components themselves.

2. Monoliths: Large, singular apps are only a problem when they can’t be integrated easily with other apps.

A subject of many fierce debates. Sure, legacy apps may be built with older technology, but that doesn’t mean they are redundant or beyond their shelf-life. If these applications can be decomposed, as Dishang says, into “micro UIs” and then assembled to form new workflows, then the cost/benefit argument for replacing them looks very different, and far less compelling. The sketch below illustrates the idea.
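As a minimal sketch, assuming the same FDC3-style API as above: the monolith registers one of its existing screens as an intent handler, and a newer app composes that screen into a fresh workflow. The “ViewOrderTicket” intent name and the UI functions are hypothetical.

```typescript
// A minimal sketch of a "micro UI": one legacy screen exposed as an
// intent, reusable from any other app on the desktop.

interface InstrumentContext {
  type: "fdc3.instrument";
  id: { ticker: string };
}

interface DesktopAgent {
  addIntentListener(
    intent: string,
    handler: (ctx: InstrumentContext) => void
  ): Promise<{ unsubscribe(): void }>;
  raiseIntent(intent: string, ctx: InstrumentContext): Promise<unknown>;
}

declare global {
  interface Window { fdc3: DesktopAgent; }
}

// In the legacy app: expose the existing order-ticket screen, unchanged.
export async function exposeOrderTicket(
  showOrderTicket: (ticker: string) => void
): Promise<void> {
  await window.fdc3.addIntentListener("ViewOrderTicket", (ctx) =>
    showOrderTicket(ctx.id.ticker)
  );
}

// In a new workflow app: reuse that screen instead of rebuilding it.
export async function openTicketFor(ticker: string): Promise<void> {
  await window.fdc3.raiseIntent("ViewOrderTicket", {
    type: "fdc3.instrument",
    id: { ticker },
  });
}
```

Seen this way, the monolith becomes a library of screens rather than a single immovable surface.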

3. Agnosticism: Technology stacks will constantly change and therefore an integration framework needs to be programming language agnostic.

Choosing a DIP requires a sense of time and place. The languages popular today, top-most on CVs, will likely lose their ranking within a few short years. All languages must be treated equally, and all integration functions (UI or data) should work, as far as is possible, across them all. On the plus side, this means that enterprises will have the flexibility to use whatever technology they like, comfortable in the knowledge that it can always be integrated.
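In practice, language agnosticism tends to mean that the integration contract is a plain message rather than a binding to any one language. A minimal sketch, in TypeScript for consistency with the examples above; the “custom.trade” schema is an illustrative assumption, and the same JSON could equally be produced by a Java service or a .NET desktop app.

```typescript
// A minimal sketch of a language-agnostic contract: the integration point
// is a plain JSON message that any stack can produce and consume.

interface TradeContext {
  type: "custom.trade"; // namespaced context type
  instrument: { ticker: string };
  quantity: number;
  side: "BUY" | "SELL";
}

// Sending side: the same bytes any other stack would put on the wire.
export function toWire(trade: TradeContext): string {
  return JSON.stringify(trade);
}

// Receiving side: parse and validate, regardless of which stack sent it.
export function fromWire(raw: string): TradeContext {
  const msg = JSON.parse(raw) as TradeContext;
  if (msg.type !== "custom.trade") {
    throw new Error(`unexpected context type: ${msg.type}`);
  }
  return msg;
}
```

Because the contract lives at the message level, rewriting one application in a different language leaves every other participant untouched.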

4. Automation: You can have too much of a good thing. For example, RPA is fine to use for repetitive and stable processes – but it is a recipe for disaster where desktop procedures are ungoverned or where applications constantly change.

In the same report I mentioned above, this time on page 126, the authors state that almost half of the organisations surveyed had implemented an RPA solution only to remove it later. This is a direct consequence of integrating too far. Often, the end user should be allowed to determine their own path, and the integration platform should effectively blur the boundaries between applications such that they all share the same data context and understanding of the process. This is exactly the rationale for DIPs: they re-use existing pieces of UI to allow the user to explore data sources and take decisions unencumbered by artificial process restrictions.
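The contrast with scripted RPA can be sketched as follows: each app the user joins to a shared channel reacts to context changes, but the user, not a script, chooses what to open next. The channel id, the contact context and the CRM function are illustrative assumptions.

```typescript
// A minimal sketch of shared data context as the alternative to keystroke
// scripting: apps on the same channel stay in sync; navigation stays with
// the user.

interface ContactContext {
  type: "fdc3.contact";
  id: { email: string };
}

interface DesktopAgent {
  joinUserChannel(channelId: string): Promise<void>;
  addContextListener(
    contextType: string | null,
    handler: (ctx: ContactContext) => void
  ): Promise<{ unsubscribe(): void }>;
}

declare global {
  interface Window { fdc3: DesktopAgent; }
}

// CRM view: pre-fill whichever client the user selected elsewhere; no
// keystroke scripting and no fixed process path.
export async function followClientContext(
  showClientRecord: (email: string) => void
): Promise<void> {
  await window.fdc3.joinUserChannel("red");
  await window.fdc3.addContextListener("fdc3.contact", (ctx) =>
    showClientRecord(ctx.id.email)
  );
}
```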

5. Methodology: Applications and their UIs must not be built in isolation. They need to be aware of common (cross-application) services and hence avoid duplicating functionality and confusing the end-user.

This is a lesson that many enterprises learn a little too late – even if they have adopted a Desktop Integration Platform. A DIP must prescribe a design, implementation and in-life methodology to ensure consistent and intuitive workflows.

In summary, the Composable Enterprise is a worthy and compelling approach. That said, the true benefits of component re-use and interoperability can only be achieved when these principles are applied to front-end and back-end alike.

Organisations looking to improve on earlier integration programmes should start by focusing on the end-user and their needs, and try to recover some of the billions that are currently being wasted.

[1] ContactBabel – The UK Contact Centre Decision-Makers’ Guide (18th edition – 2020-21)
