Increasing Trade Performance: A Quest for Lower Latency, or Simply Improved Monitoring?

Performance. We are all assessed on it as we grow up. But in today's trading environment, how can organisations balance increased regulatory requirements with maximising the efficiency of their core trading functions? With fewer operational resources, firms are challenged to create more sophisticated business logic that differentiates them from the competition in the never-ending quest for lower latency. As a result, many firms are trying to understand the relative performance of their electronic trade lifecycle against their business requirements. This has opened up a huge variety of potential avenues for exploration, largely split into two camps: trading strategy performance and systems performance.

Developing a bespoke system to implement each individual trader's specific strategy, knowledge and skills is both expensive and inefficient because there are too many variables. As a result, the value a trader brings to the desk is, in many cases, in their head rather than in a system. This means organisations are potentially missing out on several opportunities. Firstly, within a particular desk there will be a variety of skills. Without the ability to allocate trades to traders based on the structure of the trade and each trader's individual strengths, an organisation could be losing performance simply through random allocation of the trades to be executed. Secondly, a trader's performance is likely to vary as market conditions change. If organisations used standardised performance tools to encode that knowledge in an algo monitoring tool that tracks market conditions, they could withdraw from inefficient markets far more effectively than by trying to do so manually, a task that only becomes more complicated as the number of venues grows.
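
As an illustration only, the skill-based allocation described above could be as simple as scoring each trader against the characteristics of an order and the prevailing market regime, rather than assigning orders at random. The sketch below is in Python, and every name, score and regime label in it is a hypothetical assumption rather than a description of any particular firm's system.

```python
# Minimal sketch of skill-based order allocation. It assumes each trader's
# historical performance per trade type and market regime has already been
# distilled into a simple score; all values here are hypothetical.
from dataclasses import dataclass

@dataclass
class Trader:
    name: str
    skill_scores: dict  # (trade_type, market_regime) -> score in [0, 1]

def allocate(trade_type: str, market_regime: str, desk: list) -> Trader:
    """Route the order to the trader with the best recorded score for this
    trade type under current market conditions, instead of at random."""
    return max(desk, key=lambda t: t.skill_scores.get((trade_type, market_regime), 0.0))

desk = [
    Trader("Trader A", {("equity_block", "high_vol"): 0.82, ("equity_block", "low_vol"): 0.61}),
    Trader("Trader B", {("equity_block", "high_vol"): 0.70, ("equity_block", "low_vol"): 0.77}),
]
print(allocate("equity_block", "high_vol", desk).name)  # -> Trader A
```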

We are beginning to see a few organisations develop algo switches: the ability to switch a range of algorithmic trading models in and out of the market based on predictive analysis of whether current and near-future market conditions are suitable for each algo. To achieve this, organisations need to build confidence in these switches by back-testing the algo monitoring model against historical data, just as they do when developing the algo models themselves. However, while algo trading models can be back-tested using end-of-day, intra-day or conflated tick data, these lower frequencies are inadequate for back-testing the monitoring models, and the higher-frequency data they require is typically unavailable in sufficient quantity, or not integrated in a single environment.
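
To make the idea of an algo switch concrete, the sketch below shows one plausible, deliberately simplified shape for such a component: each model declares the market conditions it is suited to, and a monitor toggles it in or out as rolling volatility and volume move across those bounds. The thresholds, the choice of volatility proxy and the window length are all assumptions made for illustration; a real switch would be back-tested against full tick history as described above.

```python
# Simplified algo switch: enable a model only while short-horizon volatility
# and traded volume stay within the envelope it was designed for.
import statistics
from collections import deque

class AlgoSwitch:
    def __init__(self, max_volatility: float, min_avg_volume: float, window: int = 100):
        self.max_volatility = max_volatility
        self.min_avg_volume = min_avg_volume
        self.prices = deque(maxlen=window)    # rolling tick prices
        self.volumes = deque(maxlen=window)   # rolling tick volumes
        self.active = False

    def on_tick(self, price: float, volume: float) -> bool:
        self.prices.append(price)
        self.volumes.append(volume)
        if len(self.prices) < 2:
            return self.active
        volatility = statistics.pstdev(self.prices)   # crude volatility proxy
        avg_volume = statistics.mean(self.volumes)
        self.active = volatility <= self.max_volatility and avg_volume >= self.min_avg_volume
        return self.active
```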

The quest for faster execution has continued uninterrupted throughout the development of electronic trading strategies. The emergence of high frequency trading (HFT) techniques makes it more important than ever to monitor performance in near real-time – not just the strategy itself, but also the market conditions and the underlying systems. Being able to identify within seconds an HFT technique that is failing to execute effectively could save minutes, and therefore thousands of potentially losing trades. By the same token, it is crucial to spot an increase in market volatility and volume when you don't have the underlying system resources to support being active in such a market.
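
As a hedged sketch of what identifying a failing technique within seconds might look like in practice, one simple approach is to track the strategy's fill rate over a short sliding window and raise an alert as soon as it drops below a tolerance, so the strategy can be paused before losing trades accumulate. The five-second window and 60% threshold below are illustrative assumptions only.

```python
# Near real-time execution monitor: flag a strategy whose fill rate over the
# last few seconds has dropped below an acceptable level.
import time
from collections import deque

class FillRateMonitor:
    def __init__(self, window_seconds: float = 5.0, min_fill_rate: float = 0.6):
        self.window_seconds = window_seconds
        self.min_fill_rate = min_fill_rate
        self.events = deque()  # (timestamp, filled?)

    def record(self, filled: bool, now=None) -> bool:
        """Record an order outcome; return True if the strategy should be flagged."""
        now = time.time() if now is None else now
        self.events.append((now, filled))
        while self.events and now - self.events[0][0] > self.window_seconds:
            self.events.popleft()
        fill_rate = sum(1 for _, f in self.events if f) / len(self.events)
        return fill_rate < self.min_fill_rate
```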

Underpinning all of this is the performance of the executing systems across the trade lifecycle, which is perhaps both the best understood and, at the same time, the most untapped source of benefits. Trade latency analysis has found that some organisations are actually too fast in executing trades; as a result they are missing market opportunities and are no longer aligned with their business strategy. There will always be clients who want to be the fastest on the block, but the trend is currently moving towards efficient trade dynamics. This means understanding the trade lifecycle across internal systems and, where possible, across systems external to the organisation. Many organisations that have implemented Transaction Cost Analytics (TCA) to gain price efficiency across venues in real time now look to enhance these analytics with the opportunity cost of missed, early or late trades.
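
The opportunity cost of missed, early or late trades can be illustrated with a simple implementation-shortfall-style calculation: executed shares are charged their slippage against the decision price, and the unfilled remainder is charged the subsequent move to a later reference price. The figures and field names below are hypothetical and are not drawn from any vendor's TCA model.

```python
# Illustrative opportunity-cost calculation for a partially filled order.
def opportunity_cost(decision_price: float, executed_qty: int, avg_exec_price: float,
                     intended_qty: int, reference_price: float, side: str = "buy") -> float:
    sign = 1 if side == "buy" else -1
    slippage = sign * (avg_exec_price - decision_price) * executed_qty                   # cost of shares done
    missed = sign * (reference_price - decision_price) * (intended_qty - executed_qty)   # cost of shares missed
    return slippage + missed

# Intended to buy 10,000 at a decision price of 100.00; filled 6,000 at an
# average of 100.05; the stock later traded at 100.40.
print(opportunity_cost(100.00, 6_000, 100.05, 10_000, 100.40))  # ~1900: ~300 slippage plus ~1600 missed
```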

As a result, a new content set is emerging in the process of executing a trade. We are seeing the inclusion of data collected and analysed from internal systems, such as optimal latency, TCA metrics and systems capacity, to name a few. And that is in addition to the consumption of reference data, market data, risk margins and limits, client policies, collateral requirements, counterparty measures and other traditional factors usually included in trade decisions and execution.
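
One way to picture this broader content set is as a single pre-trade context in which systems telemetry sits alongside the traditional checks. The sketch below is only one possible arrangement; the field names, the 85% capacity ceiling and the acceptance rule are assumptions made for illustration.

```python
# Hypothetical pre-trade context combining traditional factors with
# internal systems telemetry such as measured latency and engine capacity.
from dataclasses import dataclass

@dataclass
class TradeContext:
    # traditional factors
    within_risk_limits: bool
    margin_available: bool
    client_policy_ok: bool
    # internal systems telemetry
    measured_latency_us: float
    target_latency_us: float
    engine_utilisation_pct: float

def can_route(ctx: TradeContext) -> bool:
    """Release the order only when both business and systems conditions hold."""
    return (ctx.within_risk_limits and ctx.margin_available and ctx.client_policy_ok
            and ctx.measured_latency_us <= ctx.target_latency_us
            and ctx.engine_utilisation_pct < 85.0)
```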

The net result of this is obvious. Organisations can achieve improved trade performance by understanding more about every component involved in the trade lifecycle, from a trader's knowledge to the spare capacity on the server running the execution engine. This, in turn, is likely to result in increased customer loyalty and even increased flow, as well as a more targeted and efficient approach to improving those areas of the trade lifecycle that need attention most, rather than a generic, continued drive for lower latency. The trick is to ensure your organisation is capturing as much relevant and granular data as possible whilst not relying on bespoke software development for the analysis and implementation of new capabilities. The result: efficiency improves and cost declines.
