
Leveraging Technology to Modernise Trading Platforms


At the A-Team Group’s recent TradingTech Briefing New York, the opening panel session was on the subject of “Modernising trading platforms”, where the discussion, moderated by Andrew Delaney, A-Team Group’s President & Chief Content Officer, centred on how to leverage interoperability, technology and outsourcing to differentiate and create a best-in-class trading platform.

Panelists Nikhil Singhvi, Head of Trade & Collateral Management, Regulatory & Controls Technology, at Credit Suisse, Kevin Sweeney, Head of Product Management Strategy, Planning & Operations, FCAT at Fidelity Investments, Mike Powell, CEO of Rapid Addition, and Paula Clarke, Presales Engineer at KX, engaged in a wide-ranging and insightful conversation, covering topics such as integrating modern, cloud-based systems with legacy technologies, the build-versus-buy conundrum, managing and controlling the cost of modernisation, and much more.

The optimal trading platform?

The discussion started with the question, ‘If money was no object, what would the optimal trading platform look like?’

Speakers suggested that there are always constraints such as time, resources, and money. So the focus should be on building flexibility and reducing technology debt accrued over years of business evolution. Cloud technology can play a pivotal role in tackling outdated systems and enhancing efficiency, but its adoption also comes with significant costs and time investment, particularly when migrating old systems that require substantial computational power and storage.

Further, it’s essential to recognise that not everyone shares the same level of understanding or perception of technology. While some might tout the innovative, flexible, and secure aspects of cloud technology, others prioritise strategies that significantly mitigate the aforementioned technology debt. One panelist stressed the need to acknowledge and plan for unexpected circumstances that can alter a firm’s approach to technology, such as the recent COVID pandemic. But all agreed that the ongoing aim should be to manage and reduce tech debt to preserve flexibility, rather than aiming for a perfect but elusive solution.

Integrating with legacy architecture

Panelists were then asked how firms should approach the integration of cloud-based open architectures with complex IT infrastructures and on-premise legacy systems.

Several key steps were recommended by the panel. First, clear objectives need to be established, with quantifiable goals. Institutions must understand the operational impact on the organisation, including changes to the management of existing on-site infrastructure. Second, stakeholder buy-in should be secured. To enable smooth integration during the transition, data standards should be defined for existing solutions and an API-first approach taken. While two separate systems are being run in parallel, tools for monitoring and visibility are essential. Firms should also consider modernising existing systems using strategies such as containerisation or the adoption of service-based architectures.
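The combination of defined data standards and an API-first approach can be illustrated with a minimal sketch. The legacy record layout, field names, and adapter class below are all hypothetical, invented purely to show the pattern: new services talk to a stable interface and an agreed schema, never to the legacy system directly, so the back end can be replaced later without touching its consumers.

```python
from dataclasses import dataclass

# Hypothetical legacy record: flat dicts with terse field names, as might
# come from an on-premise order management system.
LEGACY_ORDER = {"sym": "VOD.L", "qty": "1000", "px": "102.5", "side": "B"}

@dataclass(frozen=True)
class Order:
    """Agreed data standard, shared by legacy-facing and cloud-side services."""
    symbol: str
    quantity: int
    price: float
    side: str  # "BUY" or "SELL"

class LegacyOrderAdapter:
    """API-first facade over the legacy system: callers depend only on this
    interface and the Order schema, not on the legacy field names."""
    _SIDES = {"B": "BUY", "S": "SELL"}

    def fetch_order(self, raw: dict) -> Order:
        # Translate the legacy representation into the standard schema.
        return Order(
            symbol=raw["sym"],
            quantity=int(raw["qty"]),
            price=float(raw["px"]),
            side=self._SIDES[raw["side"]],
        )

adapter = LegacyOrderAdapter()
order = adapter.fetch_order(LEGACY_ORDER)
print(order.symbol, order.quantity, order.side)  # VOD.L 1000 BUY
```

The point of the facade is that when the legacy system is eventually migrated or retired, only the adapter changes; everything built against the `Order` schema is untouched.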

The panel recognised that running dual infrastructures can be challenging, often leading to initial inefficiency and staffing issues due to diverse skill requirements. These complexities need to be communicated, with the understanding that such short-term difficulties pave the way for long-term gains. One panelist suggested that a ‘crawl-walk-run’ approach should be taken when adopting cloud technology, gradually progressing and accelerating the journey once the first steps are taken.

What are the biggest challenges?

An audience poll asked, ‘What are your biggest challenges when modernising your trading platform and integrating new technologies with legacy systems?’

47% of respondents cited securing budget and buy-in for legacy modernisation as their biggest challenge, while 44% pointed to migrating systems efficiently without disrupting the business.

Reflecting on these results, one panelist commented that after the 2008 crisis, firms were primarily focused on meeting regulatory requirements, leaving little room for innovation. Nowadays, with costs streamlined and unprofitable ventures exited, firms are once again seeking competitive advantages in the market. Thus, they need to create compelling business cases for investments in differentiating technologies. The key question is whether emerging technologies such as cloud, low-code solutions, or open platforms can quickly deliver operational enhancements and unique business advantages.

One speaker suggested that maintaining operations while dismantling legacy systems and creating new infrastructure has proven more disruptive and complex than initially anticipated at many firms.

Over the past five to six years, significant investments have been made in cloud migration, but merely ‘lifting and shifting’ systems without developing new applications can lead to increased technology costs rather than immediate savings. Today’s challenges often lie in justifying the business case for cloud migration and finding ways to offset these costs, particularly as capital costs rise.

Build, buy or outsource?

The discussion then turned to the build versus buy versus outsource debate. Panelists generally agreed that the correct philosophy is to outsource generic utilities while retaining and developing unique intellectual property in-house. Although simple in theory, this approach can present challenges in practice due to individual vendor differences. An important pitfall to avoid is the temptation to enhance non-differentiating elements internally, with one speaker pointing out the ‘ego danger’ of firms believing they can build things better internally than a vendor can. Instead, internal resources should be allocated towards innovation, capitalising on recent technological developments in the API-driven, open, low-code solutions sphere.

The trade-off between in-house development and using a vendor often depends on cost-effectiveness. It’s essential therefore to understand the incremental value and to remain vigilant about cost structures, particularly regarding the total cost of ownership, including maintenance, support and third-party technology licence upgrades.

A true partnership approach between vendors and firms is crucial. Firms need to ensure they understand how the vendor operates, whether in terms of outsourced development, server hosting or cloud services. One panelist sounded a warning note that while partnerships and outsourcing deals generally work well, there have been some rare incidents underscoring the need for continued vigilance on the part of firms.

Managing costs

Panelists were then asked how financial institutions can manage the cost of technologies as they age, and how they can apply lessons learned from previous experiences to new areas of automation.

One speaker responded that building systems in modular components can prove highly beneficial, as it allows for the same elements to be reused in different areas, thereby adding value. Also, adopting industry-standard practices in data and messaging facilitates easier integration of both internal and external systems, with APIs serving as a critical communication link between different software components.
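The value of a single modular component reused across different areas can be sketched with the industry-standard messaging example above. FIX, the dominant messaging standard in trading, encodes messages as tag=value fields separated by the SOH character (`\x01`); in the illustrative sketch below, one small parser module serves both an order-entry check and a symbol lookup. The helper function names and sample message are hypothetical, but the FIX tags used (35 = MsgType, where D is New Order Single; 55 = Symbol; 38 = OrderQty) are standard.

```python
# One reusable messaging module: a minimal FIX tag=value parser.
SOH = "\x01"  # FIX field delimiter

def parse_fix(message: str) -> dict:
    """Parse a FIX tag=value message into a {tag: value} dict."""
    return dict(
        field.split("=", 1)
        for field in message.strip(SOH).split(SOH)
        if field
    )

# The same module is reused by two different areas of the platform:
def order_symbol(msg: str) -> str:
    """Used by a hypothetical market-data component."""
    return parse_fix(msg)["55"]  # tag 55 = Symbol

def is_new_order(msg: str) -> bool:
    """Used by a hypothetical order-entry component."""
    return parse_fix(msg).get("35") == "D"  # tag 35 = MsgType, D = New Order

msg = "8=FIX.4.4\x0135=D\x0155=VOD.L\x0138=1000\x01"
print(order_symbol(msg), is_new_order(msg))  # VOD.L True
```

Because both components depend on the same small, standards-based module, a fix or performance improvement to the parser benefits every area of the platform at once.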

Transitioning to the cloud requires careful selection of technology products tailored to specific purposes, such as data storage or system acceleration. Adherence to industry standards builds skills and expertise, which in turn contributes to a company’s success. The resultant system is not only cost-effective but also easier to maintain.

Containerisation, or the encapsulation of code to enable consistent execution across varying computing environments, is also advantageous. One panelist gave the example of an infrequently used disaster recovery site, which could be containerised and run as a “cold” or inactive site, thereby saving resources while allowing for technological improvements over time.

Securing funding for modernisation

The speakers then shared their insights on how to ‘sell’ the business value of tech modernisation to business leaders, and how to convince senior management to invest at a realistic level.

It was agreed that securing funding for tech projects is a complex task, largely driven by business decisions, not solely technological merits. As such, it’s crucial to highlight the differentiating factors, regulatory response strategies, and real business benefits of any proposed modernisation initiative. Technologists should engage stakeholders gradually, debunking the misconception that business-focused individuals can’t comprehend technical aspects. In sectors like capital markets, these stakeholders are often equipped to handle complex tasks and, with the right guidance, can grasp the key technical concepts.

Amid rapidly emerging technologies, it’s important to communicate benefits clearly, and to be able to visually demonstrate tech solutions. A proof of concept, even if not perfect, can effectively showcase a technology’s potential benefits. Panelists recommended that when evaluating vendors, the starting point should be understanding the problem at hand and what the vendor offers – whether it’s a technological solution, domain expertise, or a service based on past experience – before delving into aspects like pricing and complexity.

Conclusion

The session concluded with some closing thoughts from each of the panelists. The advent of cloud technology has accelerated vendor evaluations, facilitating faster onboarding and testing of new vendor solutions. Cloud technology has levelled the playing field in vendor selection, enabling smaller start-ups to compete with traditional vendors.

Speakers agreed that this is an exciting era for technology in financial markets, shifting from a product-centric model to an agile, interoperable, and modular paradigm. Correspondingly, vendor relationships have transformed from simple client-vendor interactions to collaborative partnerships, leading to greater client influence over vendors’ future plans and product roadmaps. These partnerships foster healthier and more constructive discussions, and thanks to the flexibility of modern technology, the industry is seeing a more strategic and cooperative approach to technology investment.

