About A-Team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Leveraging Interoperability – Laying the Foundations for Unique Best of Breed Solutions


With today’s traders surrounded by a multitude of desktop applications – order/execution management systems (O/EMSs), data analytics tools, news feeds, chat and communication platforms, trade blotters and, of course, the ubiquitous Excel spreadsheet – most of which need to be used simultaneously, the need for seamless communication between those applications has never been greater.

Hence the growing clamour for true interoperability on the trading desk, which promises not only to reduce the risk of errors by ensuring data consistency across frequently used applications, but also to speed up workflows and assist with real-time decision making. So how can technologies such as smart desktops, low/no code solutions and microservices help create interoperable solutions to satisfy this growing demand?

This was the subject under discussion at a panel session entitled ‘Leveraging interoperability – laying the foundations for unique best of breed solutions,’ held at the A-Team Group’s recent event, Buy AND Build: The Future of Capital Markets Technology London.

Moderated by industry veteran and Founder of consultancy firm Vision57, Steve Grob, the expert panel featured Will Winzor Saile, Partner, Execution Analytics & Architecture at Redburn; Rob Moffat, Senior Technical Architect at FINOS; Reena Raichura, Director, Head of Product Solutions at Interop.io; and Ben Jefferys, Global Head of Technical Pre-Sales at Genesis Global.

The discussion started with quick overviews of FINOS, the FinTech Open Source Software Foundation (part of the Linux Foundation), whose primary goal is to serve as the open source hub for the finance industry, and FDC3, the Financial Desktop Connectivity and Collaboration Consortium, which provides a standard for interoperability to integrate desktop applications and workflows.

The panel agreed that while FDC3 is an excellent standard, it is only the first step in a much longer journey towards full interoperability. One panellist pointed out that FDC3’s basic workflow integration and automation are in fact only small components of interoperability. Advanced workflow integration requires seamless interaction between diverse technologies, whether legacy monolithic systems, native applications, platforms like Citrix or even those running Visual Basic 6.0, for example.
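The context-sharing pattern that FDC3 standardises can be illustrated with a minimal in-memory sketch. Note the `Channel` class below is a hypothetical stand-in for a real desktop agent (which FDC3 exposes to applications as `window.fdc3`); only the shape of the context object, such as the `fdc3.instrument` type, follows the standard itself.

```typescript
// Minimal in-memory sketch of FDC3-style context sharing.
// A real deployment would use a desktop agent (window.fdc3);
// this Channel class is illustrative only.

interface Context {
  type: string; // e.g. "fdc3.instrument", per the FDC3 context spec
  id?: Record<string, string>;
}

type ContextHandler = (ctx: Context) => void;

class Channel {
  private handlers: ContextHandler[] = [];

  // An app subscribes to context broadcasts on this channel.
  addContextListener(handler: ContextHandler): void {
    this.handlers.push(handler);
  }

  // An app broadcasts context; every listener on the channel receives it.
  broadcast(ctx: Context): void {
    for (const h of this.handlers) h(ctx);
  }
}

// Usage: an OMS broadcasts an instrument; a charting app reacts to it.
const channel = new Channel();
const received: string[] = [];

channel.addContextListener((ctx) => {
  if (ctx.type === "fdc3.instrument") {
    received.push(ctx.id?.ticker ?? "unknown");
  }
});

channel.broadcast({ type: "fdc3.instrument", id: { ticker: "VOD.L" } });
console.log(received); // -> ["VOD.L"]
```

The point of the pattern is that neither app knows the other exists: the OMS broadcasts, the chart listens, and any number of further applications can join the same channel without code changes on either side.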

Another aspect of interoperability discussed was UI integration, which can be complex, involving the creation of micro apps on an interoperability platform, launching coordinated window layouts, having a centralised notification centre, providing global search capabilities, and much more. From a UI perspective, one panellist felt that the ultimate aim of interoperability should be to provide users with a unified desktop experience where application boundaries are indistinct, ensuring users are able to focus on their tasks rather than the tools.

The idea of ‘straight through workflow,’ drawing inspiration from straight-through processing (STP), was then discussed. Whereas STP originally united the industry to address its fragmented back end, the need for interoperability arises from today’s fragmented desktop environment. One panellist suggested that this requires a paradigm shift in design thinking, where designing and building workflows should be prioritised over standalone applications, with each application viewed as a component within a broader ecosystem, contributing to the end user’s overall workflow.
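This ‘workflow over applications’ idea maps naturally onto FDC3’s intents: a workflow step raises a named action together with context, and a resolver routes it to whichever application has registered to handle that intent. The sketch below is hypothetical (the `IntentResolver` class and handler strings are illustrative, not part of the standard); it only mirrors the raise/resolve pattern FDC3 defines.

```typescript
// Hypothetical sketch of the FDC3 "raise intent" pattern: each
// workflow step names an action (intent) plus context, and a
// resolver routes it to the app registered for that intent.

interface Context {
  type: string;
  id?: Record<string, string>;
}

type IntentHandler = (ctx: Context) => string;

class IntentResolver {
  private registry = new Map<string, IntentHandler>();

  // An app declares it can handle a given intent.
  register(intent: string, handler: IntentHandler): void {
    this.registry.set(intent, handler);
  }

  // A workflow step raises an intent; the resolver dispatches it.
  raiseIntent(intent: string, ctx: Context): string {
    const handler = this.registry.get(intent);
    if (!handler) throw new Error(`No app registered for intent ${intent}`);
    return handler(ctx);
  }
}

// Usage: two "apps" register; the workflow chains across them.
const resolver = new IntentResolver();
resolver.register("ViewChart", (ctx) => `charting ${ctx.id?.ticker}`);
resolver.register("CreateOrder", (ctx) => `order ticket for ${ctx.id?.ticker}`);

const ctx: Context = { type: "fdc3.instrument", id: { ticker: "VOD.L" } };
console.log(resolver.raiseIntent("ViewChart", ctx));   // -> "charting VOD.L"
console.log(resolver.raiseIntent("CreateOrder", ctx)); // -> "order ticket for VOD.L"
```

Because each step addresses an intent rather than a specific vendor application, any component in the workflow can be swapped for another that registers the same intent, which is precisely the best-of-breed flexibility the panel described.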

Another panellist pointed out that incumbent technology vendors often compete for dominance on the trader’s desktop, as they look to monopolise the trader’s workspace. In the end, most users find some sort of compromise, with one platform for charts and data, another for the core O/EMS and other functions split between them, for example. However, lack of communication between such platforms leads to inefficiencies, meaning that traders have to bridge the gaps and create makeshift workarounds using tools like Excel sheets and macros. A far better solution would be to work with interoperable platforms that not only streamline the workflow but also add significant value by providing traders with comprehensive real-time actionable insights.

In terms of best practices around interoperability, again a significant shift in design thinking was suggested. Rather than having the mindset of being solely a ‘buy shop’ or a ‘build shop’, a more effective approach for banks aiming to innovate rapidly is a blend of buying, building, and integrating. Likewise, rather than operating in silos and focusing only on their own projects, development teams should, before building a solution, collaborate to understand how users interact with their applications, how they transition between apps, and the overall user journey. Emphasising design thinking, fostering open collaboration among development teams, and establishing robust governance are all essential.

Another recommendation from the panel was to realise that when partnering with a tech provider, it’s not just about ensuring the technology fits the immediate needs. The provider should also demonstrate solid industry expertise to ensure a positive experience. Commitment, partnership, and collaboration are paramount.

One panellist highlighted that an interesting aspect of moving towards interoperability is the plethora of choices available. Previously, decisions centred on which full-stack vendor to rely on; now, it’s about selecting components of that stack and determining how to integrate them. The panellist commented that as an end user, they would prefer ten vendors who each do one thing very well to one vendor that tries to do everything.

This shift does however mean that firms bear more responsibility and risk for the entire solution. In the past, it was simpler to either outsource everything to a single vendor or build it all in-house. As firms now decouple and consider different interoperable systems for various client types or asset classes, for example, they shoulder more responsibility for ensuring everything functions seamlessly. Despite these risks, there are strong upsides, not least of which is the flexibility to modify components in the future with less trouble. Also, firms can swiftly test new vendors for specific components on the traders’ desktops without entering into long-term commitments. This agility enables rapid integration and evaluation, before scaling up if tests prove successful.

A final thought was shared regarding collaboration. “We’re witnessing a rise in inter-company interoperability,” said one panellist. “Firms spanning both the buy side and sell side are increasingly sharing applications, enriching each other’s desktop experiences, which is really interesting.”
