A-Team Insight Blogs

The Composable Enterprise and Desktop Integration


By James Wooster, COO, Glue42.

The starting point for this piece was a statistic buried within a report on the UK Contact Centre market[1]. It stated, on page 138, that across all industries, including Capital Markets, the cost of users navigating within and between applications was estimated to exceed £4.3 billion ($5.7 billion) per annum.

That’s a huge cost – and one measured purely in terms of time spent performing manual operations within the user’s workflow. Interestingly, this figure did not include the opportunity cost, nor the consequential damage of missing the market, losing alpha or errant trades. In short, £4.3B is probably on the low side in terms of potential cost savings and improved outcomes – and this, of course, applies only to the UK!

I then came across an article by Dishang Patel, a consultant at Leading Point Financial Markets, discussing the importance of the Composable Enterprise and how it could improve business operations in the front office. It seems, then, that composing – perhaps streamlining – applications to support user processes would be a critical part of the solution.

The problem, as many organisations already know, is that Composable Enterprise concepts are deeply rooted in the data centre and apply to back-end services and server-side applications. What, then, of the end users – the buy-side and sell-side traders, portfolio managers, research analysts and so on? Who is looking after their needs?

Thankfully, the financial services industry, including the majority of tier-1 banks, is beginning to adopt a new approach to this challenge. In today’s parlance, this means Desktop Integration Platforms (DIPs) – enterprise-specific application stores in which applications can be discovered and dynamically integrated, at both the UI and the data level. In many senses, this is an evolution of the ‘interop’ movement that began in the early 2010s, in which the ability to broker data between web applications has been extended to any kind of application (web or otherwise), and where UI elements are themselves first-class citizens and the preferred means of composability.
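The “brokering data between applications” idea can be sketched as a minimal in-process context bus, loosely in the spirit of desktop interop standards such as FDC3. The class, channel and context names here are illustrative assumptions, not any vendor’s actual API:

```typescript
// Illustrative sketch: a minimal context bus. Applications publish and
// subscribe to shared context objects (e.g. the instrument the user is
// currently viewing) without knowing about each other.
type Context = { type: string; [key: string]: unknown };
type Listener = (ctx: Context) => void;

class ContextChannel {
  private listeners = new Map<string, Listener[]>();

  // Apps subscribe to a context type, e.g. "instrument".
  subscribe(type: string, listener: Listener): void {
    const existing = this.listeners.get(type) ?? [];
    this.listeners.set(type, [...existing, listener]);
  }

  // Broadcasting a context notifies every interested app on the channel.
  broadcast(ctx: Context): void {
    for (const listener of this.listeners.get(ctx.type) ?? []) {
      listener(ctx);
    }
  }
}

// Example: a watchlist app broadcasts the selected instrument; a chart
// app and a news app both react, with no direct coupling between them.
const channel = new ContextChannel();
const received: string[] = [];
channel.subscribe("instrument", (ctx) => received.push(`chart:${ctx["ticker"]}`));
channel.subscribe("instrument", (ctx) => received.push(`news:${ctx["ticker"]}`));
channel.broadcast({ type: "instrument", ticker: "VOD.L" });
```

A real DIP performs this brokering across process boundaries and technology stacks, but the pattern – apps coupled only to a shared context, never to each other – is the same.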

A DIP is also a platform upon which new applications are built – with the necessary elements of discoverability, data exchange, UI orchestration and behaviour monitoring built in from the outset. This requires a new mindset for CIOs and IT leaders, in which some important lessons need to be learned:

1. Ownership: No single application development team (or vendor) owns the entire desktop.

This may sound obvious – but appreciation of this fact will fundamentally change the approach to usage of screen real-estate, data sharing and the modularity/extensibility of the application components themselves.

2. Monoliths: Large, singular apps are only a problem when they can’t be integrated easily with other apps.

A subject of many fierce debates. Sure, legacy apps may be built with older technology, but that doesn’t mean they are redundant or past their shelf-life. If these applications can be decomposed, as Dishang says, into “micro UIs” and then assembled to form new workflows, the cost/benefit argument for their replacement looks very different – and far less compelling.

3. Agnosticism: Technology stacks will constantly change and therefore an integration framework needs to be programming language agnostic.

Choosing a DIP requires a sense of time and place. The languages popular today, topmost on CVs, will likely lose their ranking within a few short years. All languages must be treated equally, and all integration functions (UI or data) should work, as far as possible, across them all. On the plus side, this means enterprises have the flexibility to use whatever technology they like, comfortable in the knowledge that it can always be integrated.
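One common way to achieve this kind of language agnosticism is to define the shared context as a plain JSON envelope rather than a language-specific object, so that any stack able to parse JSON can participate. A minimal sketch – the field names below are illustrative assumptions, not a published schema:

```typescript
// Illustrative sketch: a language-neutral message envelope. Because the
// wire format is plain JSON, a .NET desktop app, a Java service and a
// web front end can all produce and consume the same messages.
interface Envelope {
  source: string;                    // originating application id
  contextType: string;               // e.g. "instrument", "client", "order"
  payload: Record<string, unknown>;  // the shared context itself
}

// Serialise on one side...
const outbound: Envelope = {
  source: "watchlist-app",
  contextType: "instrument",
  payload: { ticker: "VOD.L", exchange: "LSE" },
};
const wire = JSON.stringify(outbound);

// ...and parse on the other, regardless of the consumer's language.
const inbound = JSON.parse(wire) as Envelope;
```

The integration contract lives in the message shape, not in any one runtime – which is what lets the enterprise swap technology stacks without breaking the integrations.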

4. Automation: You can have too much of a good thing. For example, RPA is fine to use for repetitive and stable processes – but it is a recipe for disaster where desktop procedures are ungoverned or where applications constantly change.

In the same report mentioned above, this time on page 126, almost half of the organisations surveyed had implemented an RPA solution only to remove it later. This is a direct consequence of integrating too far. Often, end users should be allowed to determine their own paths, and the integration platform should blur the boundaries between applications so that they all share the same data context and understanding of the process. This is exactly the rationale for DIPs – they re-use existing pieces of UI to let the user explore data sources and take decisions unencumbered by artificial process restrictions.

5. Methodology: Applications and their UIs must not be built in isolation. They need to be aware of common (cross-application) services and hence avoid duplicating functionality and confusing the end-user.

This is a lesson that many enterprises learn a little too late – even if they have adopted a Desktop Integration Platform. A DIP must prescribe a design, implementation and in-life methodology to ensure consistent and intuitive workflows.

In summary, the Composable Enterprise is a worthy and compelling approach. That said, the true benefits of component re-use and interoperability can only be achieved when these principles are applied to front end and back end alike.

Organisations looking to improve on earlier integration programmes should start by focusing on the end user and their needs – and try to recover some of the billions currently being wasted.

[1] ContactBabel – The UK Contact Centre Decision-Makers’ Guide (18th edition – 2020-21)
