About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

The Composable Enterprise and Desktop Integration


By James Wooster, COO, Glue42.

The starting point for this piece was a statistic buried within a report on the UK Contact Centre market[1]. It stated, on page 138, that across all industries, including Capital Markets, the cost of users manually navigating within and between applications was estimated at more than £4.3 billion ($5.7 billion) per annum.

That’s a huge cost – and one measured purely in terms of time spent performing manual operations within the user’s workflow. Notably, the figure does not include the opportunity cost or the consequential damage of missing the market, losing alpha or errant trades. In short, £4.3 billion is probably on the low side as an estimate of the potential cost savings and improved outcomes – and it applies to the UK alone!

I then came across an article by Dishang Patel, a consultant at Leading Point Financial Markets, in which he discussed the importance of the Composable Enterprise and how it could improve business operations in the front office. It seems, then, that composing – perhaps streamlining – applications to support user processes would be a critical part of the solution.

The problem, as many organisations already know, is that Composable Enterprise concepts are deeply rooted in the data centre and apply to back-end services and server-side applications. What, then, of the end-users – the buy-side and sell-side traders, portfolio managers, research analysts and so on? Who is looking after their needs?

Thankfully, the financial services industry, including the majority of tier-1 banks, is beginning to adopt a new approach to this challenge. In today’s parlance, this means using Desktop Integration Platforms (DIPs) to deliver an enterprise-specific application store in which applications can be discovered and dynamically integrated – at both the UI and the data level. In many senses, this is an evolution of the ‘interop’ movement that began in the early 2010s: the ability to broker data between web applications has been extended to any kind of application (web or otherwise), and UI elements are themselves first-class citizens and the preferred means of composability.
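The data-level side of that interop can be pictured as a shared context channel: when the user selects an instrument in one application, every other application on the desktop reacts without any app calling another directly. The sketch below is a deliberately minimal, hypothetical in-process illustration (written in Python for brevity; it is not Glue42’s actual API, and the names `ContextBus`, `subscribe` and `broadcast` are inventions for this example):

```python
from collections import defaultdict

class ContextBus:
    """Hypothetical stand-in for a DIP's shared data-context channel."""

    def __init__(self):
        self._subscribers = defaultdict(list)  # context type -> callbacks
        self._latest = {}                      # context type -> last payload

    def subscribe(self, context_type, callback):
        """Register an app's handler; replay the current context on joining."""
        self._subscribers[context_type].append(callback)
        if context_type in self._latest:
            callback(self._latest[context_type])

    def broadcast(self, context_type, payload):
        """Publish a new context to every subscribed application."""
        self._latest[context_type] = payload
        for callback in self._subscribers[context_type]:
            callback(payload)

# A watchlist app broadcasts the selected instrument; a chart panel and
# a news panel both update without knowing anything about each other.
bus = ContextBus()
chart_view, news_view = [], []
bus.subscribe("instrument", lambda ctx: chart_view.append(ctx["ticker"]))
bus.subscribe("instrument", lambda ctx: news_view.append(ctx["ticker"]))
bus.broadcast("instrument", {"ticker": "VOD.L", "source": "watchlist"})
```

The point of the pattern is the decoupling: the watchlist knows nothing about the chart or the news panel, which is what lets new applications join the desktop without changing the existing ones.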

A DIP is also a platform upon which new applications are built – one in which the necessary elements of discoverability, data exchange, UI orchestration and behaviour monitoring are built in from the outset. This requires a new mindset for CIOs and IT leaders, in which some important lessons need to be learned:

1. Ownership: No single application development team (or vendor) owns the entire desktop.

This may sound obvious – but appreciating this fact will fundamentally change the approach to the use of screen real estate, data sharing and the modularity/extensibility of the application components themselves.

2. Monoliths: Large, singular apps are only a problem when they can’t be integrated easily with other apps.

A subject of many fierce debates. Sure, legacy apps may be built with older technology, but that doesn’t mean they are redundant or beyond their shelf-life. If these applications can be decomposed, as Dishang says, into “micro UIs” and then reassembled to form new workflows, the cost/benefit argument for their replacement looks very different – and far less compelling.
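The decompose-and-reassemble idea can be sketched as a registry of UI fragments carved out of a legacy system and chained into a named workflow. Again this is a hypothetical illustration, not a real DIP API – `MicroUI`, `Workflow` and the order-management fragments are inventions for this example:

```python
class MicroUI:
    """A hypothetical UI fragment carved out of a legacy application."""

    def __init__(self, name, render):
        self.name = name
        self.render = render  # callable: shared context -> screen content

class Workflow:
    """Assembles independently owned micro UIs into one user journey."""

    def __init__(self, name):
        self.name = name
        self.steps = []

    def add(self, micro_ui):
        self.steps.append(micro_ui)
        return self  # allow fluent chaining of steps

    def run(self, context):
        # Each fragment renders against the same shared context.
        return [step.render(context) for step in self.steps]

# Two fragments of a monolithic order-management system, re-used as-is.
blotter = MicroUI("blotter", lambda ctx: f"orders for {ctx['client']}")
ticket  = MicroUI("ticket",  lambda ctx: f"new order for {ctx['client']}")

review = Workflow("client-review").add(blotter).add(ticket)
screens = review.run({"client": "ACME"})
```

Nothing inside the monolith was rewritten to produce this: the fragments are wrapped and sequenced, which is exactly why the replacement cost/benefit case changes.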

3. Agnosticism: Technology stacks will constantly change and therefore an integration framework needs to be programming language agnostic.

Choosing a DIP requires a sense of time and place. The languages popular today – top-most on CVs – will likely lose their ranking within a few short years. All languages must be treated equally, and all integration functions (UI or data) should work, as far as possible, across them all. On the plus side, this means enterprises have the flexibility to use whatever technology they like, comfortable in the knowledge that it can always be integrated.

4. Automation: You can have too much of a good thing. For example, RPA is fine for repetitive and stable processes – but it is a recipe for disaster where desktop procedures are ungoverned or where applications constantly change.

In the same report mentioned above, this time on page 126, the authors state that almost half of the organisations surveyed had implemented an RPA solution only to remove it later. This is a direct consequence of integrating too far. Often, end users should be allowed to determine their own paths, and the integration platform should blur the boundaries between applications so that they all share the same data context and understanding of the process. This is exactly the rationale for DIPs: they re-use existing pieces of UI to let users explore data sources and take decisions unencumbered by artificial process restrictions.

5. Methodology: Applications and their UIs must not be built in isolation. They need to be aware of common (cross-application) services and hence avoid duplicating functionality and confusing the end-user.

This is a lesson that many enterprises learn a little too late – even if they have adopted a Desktop Integration Platform. A DIP must prescribe a design, implementation and in-life methodology to ensure consistent and intuitive workflows.

In summary, the Composable Enterprise is a worthy and compelling approach. That said, the true benefits of component re-use and interoperability can only be achieved when these principles are applied to the front end and back end alike.

Organisations looking to improve on earlier integration programmes should start by focusing on end-users and their needs – and try to recover some of the billions currently being wasted.

[1] ContactBabel – The UK Contact Centre Decision-Makers’ Guide (18th edition – 2020-21)
