Increasing Trade Performance: A Quest for Lower Latency, or Simply Improved Monitoring?

Performance. We all grow up being assessed on it. But in today’s trading environment, how can organisations balance increased regulatory requirements with maximising the efficiency of their core trading functions? Fewer operational resources mean firms are challenged to create more sophisticated business logic that differentiates them from the competition in the never-ending quest for lower latency. As a result, many firms are trying to understand the relative performance of their electronic trade lifecycle against their business requirements. This has created a huge variety of potential avenues for exploration, largely split into two camps: trading strategy performance and systems performance.

Developing a bespoke system to implement each individual trader’s specific strategy, knowledge and skills is both expensive and inefficient because there are too many variables. As a result, the value a trader brings to their desk is, in many cases, in their head, not in a system. This means that organisations are potentially losing out on several opportunities. Firstly, within a particular desk there will be a variety of skills among traders. Not being able to allocate trades to traders based on the structure of the trade and each trader’s individual skills means that an organisation could be losing performance simply through the random allocation of trades for execution. Secondly, a trader’s performance is likely to vary as market conditions change. By using standardised performance tools to encode the trader’s knowledge in an algo monitoring tool that tracks market conditions, organisations could withdraw from inefficient markets far more effectively than if they try to achieve this manually, a situation that becomes more complicated as the number of venues involved increases.
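
As a rough illustration of the first point, the sketch below shows how trades might be scored against trader skill profiles rather than allocated at random. The trader skill scores, trade attributes and the greedy matching rule are purely hypothetical assumptions, not a description of any firm’s model.

```python
# Hypothetical sketch: allocating incoming trades to traders by skill fit
# rather than at random. Skill scores and trade attributes are illustrative.
from dataclasses import dataclass, field

@dataclass
class Trader:
    name: str
    skills: dict = field(default_factory=dict)  # skill score per trade attribute

@dataclass
class Trade:
    trade_id: str
    attributes: dict  # e.g. {"asset_class": "fx_options", "size": "block"}

def skill_score(trader: Trader, trade: Trade) -> float:
    """Sum the trader's skill scores for the attributes this trade carries."""
    return sum(trader.skills.get(v, 0.0) for v in trade.attributes.values())

def allocate(trades: list, traders: list) -> dict:
    """Greedy allocation: each trade goes to the best-scoring trader."""
    return {t.trade_id: max(traders, key=lambda tr: skill_score(tr, t)).name
            for t in trades}

if __name__ == "__main__":
    traders = [Trader("alice", {"fx_options": 0.9, "block": 0.4}),
               Trader("bob", {"fx_options": 0.3, "block": 0.8})]
    trades = [Trade("T1", {"asset_class": "fx_options", "size": "block"})]
    print(allocate(trades, traders))  # {'T1': 'alice'}
```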

We are beginning to see a few organisations develop algo switches: the ability to switch a range of algorithmic trading models in and out of the market based on predictive analysis of whether current and near-future market conditions are suitable for each algo. To achieve this, organisations need to develop confidence in the capability of these switches by back-testing the algo monitoring model against historical data, just as they do for algo model development. However, where algo trading models could be back-tested using end-of-day, intra-day or conflated tick data, these lower frequencies are inadequate for back-testing the monitoring models. Typically, data at the required granularity is either unavailable in sufficient quantities or not integrated in a single environment.
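
A minimal sketch of such a switch, under stated assumptions, follows. The per-algo condition limits, the rolling volatility proxy and the tick-replay loader are illustrative only; a production switch would use predictive rather than purely reactive measures.

```python
# Hypothetical sketch of an "algo switch": each candidate algo declares the
# market conditions it tolerates, and the switch enables/disables it as
# rolling measures of volatility and volume move. All thresholds are assumed.
import statistics
from collections import deque

class AlgoSwitch:
    def __init__(self, algos, window=500):
        # algos: {name: {"max_vol": float, "min_volume": float}}
        self.algos = algos
        self.prices = deque(maxlen=window)
        self.volumes = deque(maxlen=window)

    def on_tick(self, price, volume):
        self.prices.append(price)
        self.volumes.append(volume)
        if len(self.prices) < 2:
            return set(self.algos)                  # too little data: leave all on
        vol = statistics.pstdev(self.prices)        # crude volatility proxy
        avg_volume = statistics.mean(self.volumes)
        return {name for name, limits in self.algos.items()
                if vol <= limits["max_vol"] and avg_volume >= limits["min_volume"]}

# Back-testing against full historical tick data (not conflated bars):
# switch = AlgoSwitch({"momo": {"max_vol": 0.5, "min_volume": 100}})
# for price, volume in replay_ticks("EURUSD-ticks.csv"):   # hypothetical loader
#     active = switch.on_tick(price, volume)
```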

The quest for faster execution has continued uninterrupted throughout the development of electronic trading strategies. The emergence of high frequency trading (HFT) techniques makes it more important than ever to monitor performance in near real time – not just of the strategy itself but also of the market conditions and underlying systems. Being able to identify within seconds an HFT technique that is failing to execute effectively could save minutes, and therefore thousands of potentially losing trades. By the same token, it is crucial to spot an increase in market volatility and volume when you don’t have the underlying system resources to support being active in such a market.
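
The sketch below illustrates the kind of rolling-window health check this implies: one alert for a strategy whose fill rate collapses, another for market activity beyond the system’s assumed capacity. The fill-rate and message-rate thresholds are invented for illustration.

```python
# Hypothetical sketch: near real-time health checks for an HFT strategy.
# Threshold values are illustrative assumptions only.
import time
from collections import deque

class StrategyMonitor:
    def __init__(self, min_fill_rate=0.2, max_msg_rate=50_000, window_sec=5):
        self.min_fill_rate = min_fill_rate   # acceptable fills/orders over the window
        self.max_msg_rate = max_msg_rate     # market-data messages/sec we can absorb
        self.window_sec = window_sec
        self.orders, self.fills, self.messages = deque(), deque(), deque()

    def _trim(self, q, now):
        while q and now - q[0] > self.window_sec:
            q.popleft()

    def record_order(self):   self.orders.append(time.monotonic())
    def record_fill(self):    self.fills.append(time.monotonic())
    def record_message(self): self.messages.append(time.monotonic())

    def alerts(self):
        now = time.monotonic()
        for q in (self.orders, self.fills, self.messages):
            self._trim(q, now)
        out = []
        if self.orders and len(self.fills) / len(self.orders) < self.min_fill_rate:
            out.append("strategy failing to execute: fill rate below threshold")
        if len(self.messages) / self.window_sec > self.max_msg_rate:
            out.append("market activity exceeds system capacity: consider withdrawing")
        return out
```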

Underpinning all this is the performance of the executing systems across the trade lifecycle, which is perhaps both the better understood and, at the same time, the most untapped source of benefits. Trade latency analysis has found that some organisations are actually too fast in executing trades; as a result, they are missing market opportunities and are no longer aligned with their business strategy. There will always be clients who want to be the fastest on the block, but the trend is currently moving towards efficient trade dynamics. This means understanding the trade lifecycle across internal systems and, where possible, the systems external to the organisation’s trade lifecycle. Many organisations that have implemented Transaction Cost Analysis (TCA) to gain price efficiency across venues in real time now look to enhance these analytics with the opportunity cost of missed, early or late trades.
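
To make the last point concrete, a simple way to extend basic TCA is to price a missed trade by how far the market moved away from the arrival price, alongside the classic implementation shortfall for executed trades. The field names and the arrival-price benchmark in the sketch below are assumptions for illustration.

```python
# Hypothetical sketch: extending simple TCA with the opportunity cost of
# missed executions. Field names and benchmark choice are illustrative.
def tca_with_opportunity_cost(orders):
    """orders: dicts with keys side ('buy'/'sell'), qty, arrival_px,
       exec_px (None if the trade was missed), eod_px."""
    report = []
    for o in orders:
        sign = 1 if o["side"] == "buy" else -1
        if o["exec_px"] is None:
            # Missed trade: cost is the adverse move away from the arrival price
            cost = sign * (o["eod_px"] - o["arrival_px"]) * o["qty"]
            kind = "opportunity cost (missed)"
        else:
            # Executed trade: classic implementation shortfall vs arrival price
            cost = sign * (o["exec_px"] - o["arrival_px"]) * o["qty"]
            kind = "implementation shortfall"
        report.append({"kind": kind, "cost": cost})
    return report

print(tca_with_opportunity_cost([
    {"side": "buy", "qty": 1000, "arrival_px": 10.00, "exec_px": 10.02, "eod_px": 10.10},
    {"side": "buy", "qty": 1000, "arrival_px": 10.00, "exec_px": None,  "eod_px": 10.10},
]))  # shortfall of 20 on the filled order, opportunity cost of 100 on the miss
```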

As a result, a new content set is emerging in the process of executing a trade. We are seeing the inclusion of data collected and analysed from internal systems, such as optimal latency, TCA metrics and system capacity, to name a few. And that is in addition to the consumption of reference data, market data, risk margins and limits, client policies, collateral requirements, counterparty measures and other traditional factors usually included in trade decisions and execution.
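
One way to picture this broader content set is as a single pre-trade decision context that carries the newer internal metrics alongside the traditional inputs. The structure and field names below are purely illustrative assumptions.

```python
# Hypothetical sketch: a pre-trade decision context combining traditional
# inputs with newer, internally sourced metrics. All names are assumed.
from dataclasses import dataclass

@dataclass
class TradeDecisionContext:
    # traditional inputs
    instrument_ref: dict        # reference data
    market_snapshot: dict       # market data
    risk_limits: dict           # margins and limits
    client_policy: dict
    collateral: dict
    counterparty_limits: dict
    # newer, internally sourced inputs
    optimal_latency_us: float   # target latency band for this flow
    tca_metrics: dict           # e.g. recent slippage per venue
    system_capacity_pct: float  # spare headroom on the execution engine

def can_route(ctx: TradeDecisionContext) -> bool:
    """Only route if the execution engine has headroom and risk limits allow."""
    return ctx.system_capacity_pct > 20.0 and ctx.risk_limits.get("ok", False)
```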

The net result of this is obvious. Organisations can achieve improved trade performance by understanding more about all the components involved in the trade lifecycle, from a trader’s knowledge to the available capacity on the server running the execution engine. This, in turn, is likely to result in increased customer loyalty and even increased flow, as well as a more targeted and efficient approach to improving those areas of the trade lifecycle that need attention most – not just a generic, continued drive for lower latency. The trick is in ensuring your organisation captures as much relevant and granular data as possible whilst not relying on bespoke software development for the analysis and implementation of new capabilities. The result: efficiency improves and cost declines.
