Increasing Trade Performance: A Quest for Lower Latency, or Simply Improved Monitoring?


Performance. We all grow up being assessed on it. But in today's trading environment, how can organisations balance increased regulatory requirements with maximising the efficiency of their core trading functions? Fewer operational resources mean firms are challenged to create more sophisticated business logic that differentiates them from the competition in the never-ending quest for lower latency. As a result, many firms are trying to understand the relative performance of their electronic trade lifecycle against their business requirements. This has opened up a huge variety of potential avenues for exploration, largely split into two camps: trading strategy performance and systems performance.

Developing a bespoke system to implement each individual trader's specific strategy, knowledge and skills is both expensive and inefficient because there are too many variables. As a result, the value a trader brings to their desk is, in many cases, in their head, not in a system. This means organisations are potentially missing out on several opportunities. Firstly, any desk of traders will hold a variety of skills. If trades cannot be allocated to traders based on the structure of the trade and each trader's individual skills, an organisation could be losing performance simply through the random allocation of trades for execution. Secondly, a trader's performance is likely to vary as market conditions change. Organisations should use standardised performance tools to encode traders' knowledge in an algo monitoring tool that tracks market conditions. They would then be able to withdraw from inefficient markets far more effectively than by attempting to do so manually, a task that only gets more complicated as the number of venues grows.
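To make the first of those opportunities concrete, here is a minimal sketch of skill-based trade allocation in place of random allocation. The Trader and Trade structures, attribute names and skill scores are all hypothetical illustrations, not a reference to any particular vendor system.

```python
from dataclasses import dataclass

@dataclass
class Trader:
    name: str
    skills: dict[str, float]  # hypothetical score per trade attribute

@dataclass
class Trade:
    trade_id: str
    attributes: list[str]     # e.g. ["equity", "large_block", "illiquid"]

def skill_score(trader: Trader, trade: Trade) -> float:
    """Sum the trader's skill scores over the attributes of this trade."""
    return sum(trader.skills.get(attr, 0.0) for attr in trade.attributes)

def allocate(trade: Trade, desk: list[Trader]) -> Trader:
    """Route the trade to the best-matched trader rather than at random."""
    return max(desk, key=lambda t: skill_score(t, trade))

# Usage: a large, illiquid equity block goes to the trader scored highest for it
desk = [
    Trader("A", {"equity": 0.9, "large_block": 0.4}),
    Trader("B", {"equity": 0.6, "large_block": 0.9, "illiquid": 0.8}),
]
print(allocate(Trade("T1", ["equity", "large_block", "illiquid"]), desk).name)  # B
```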

We are beginning to see a few organisations develop algo switches: the ability to switch a range of algorithmic trading models in and out of the market based on predictive analysis of whether current and near-future market conditions suit each algo. To achieve this, organisations need to build confidence in these switches by back-testing the algo monitoring model against historical data, just as with algo model development. However, where an algo trading model could be back-tested using end-of-day, intra-day or conflated tick data, these lower frequencies are inadequate for back-testing the monitoring models, which need full-granularity tick data. Typically, data at that level is either unavailable in sufficient quantities or not integrated into a single environment.
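A minimal sketch of such a switch is below, assuming illustrative condition metrics (trend_strength, volatility) and toy suitability rules. In practice, suitable() would be a predictive model whose thresholds were themselves back-tested against full-depth tick history, replayed through the same on_market_update loop.

```python
from abc import ABC, abstractmethod

class Algo(ABC):
    name: str = "base"

    @abstractmethod
    def suitable(self, conditions: dict) -> bool:
        """Predict whether current and near-future conditions suit this model."""

class MeanReversion(Algo):
    name = "mean_reversion"
    def suitable(self, conditions: dict) -> bool:
        # Illustrative rule only: prefers range-bound, moderately volatile markets
        return conditions["trend_strength"] < 0.2 and conditions["volatility"] < 0.03

class Momentum(Algo):
    name = "momentum"
    def suitable(self, conditions: dict) -> bool:
        return conditions["trend_strength"] > 0.5

class AlgoSwitch:
    """Switches models in and out of the market as conditions change."""
    def __init__(self, algos: list[Algo]):
        self.algos = algos
        self.active: set[str] = set()

    def on_market_update(self, conditions: dict) -> None:
        for algo in self.algos:
            if algo.suitable(conditions):
                self.active.add(algo.name)      # switch into the market
            else:
                self.active.discard(algo.name)  # withdraw

# Live use and back-testing share this loop; only the source of ticks differs
switch = AlgoSwitch([MeanReversion(), Momentum()])
switch.on_market_update({"trend_strength": 0.7, "volatility": 0.05})
print(switch.active)  # {'momentum'}
```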

The quest for faster execution has continued uninterrupted throughout the development of electronic trading strategies. The emergence of High Frequency Trading (HFT) techniques makes it more important than ever to monitor performance in near real time, not just of the strategy itself but also of market conditions and the underlying systems. Being able to identify, within seconds, an HFT technique that is failing to execute effectively could save minutes, and therefore thousands of potentially losing trades. By the same token, it is crucial to spot an increase in market volatility and volume when you lack the underlying system resources to remain active in such a market.
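One way such near-real-time monitoring might look, as a sketch: a sliding-window fill-rate check that can flag a failing technique within seconds. The window length, threshold and event format here are assumptions for illustration; a volatility and volume monitor would follow the same pattern over market data, alerting when conditions exceed what the underlying systems can support.

```python
import time
from collections import deque

class ExecutionMonitor:
    """Flags, within seconds, a strategy that has stopped executing
    effectively by tracking fill outcomes over a short sliding window."""

    def __init__(self, window_s: float = 5.0, min_fill_rate: float = 0.6):
        self.window_s = window_s
        self.min_fill_rate = min_fill_rate
        self.events: deque = deque()  # (timestamp, filled?) pairs

    def record(self, filled: bool, now: float | None = None) -> None:
        now = time.monotonic() if now is None else now
        self.events.append((now, filled))
        # Age out events that have fallen off the back of the window
        while self.events and self.events[0][0] < now - self.window_s:
            self.events.popleft()

    def degraded(self) -> bool:
        """True when the recent fill rate has dropped below the threshold."""
        if not self.events:
            return False
        fills = sum(1 for _, ok in self.events if ok)
        return fills / len(self.events) < self.min_fill_rate
```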

Underpinning all of this is the performance of the executing systems across the trade lifecycle, which is perhaps both the best understood and, at the same time, the most untapped source of benefits. Trade latency analysis has found that some organisations are actually too fast in executing trades: they are missing market opportunities and are no longer aligned with their business strategy. There will always be clients who want to be the fastest on the block, but the trend is currently moving towards efficient trade dynamics. This means understanding the trade lifecycle of internal systems and, where possible, of systems external to an organisation's trade lifecycle. Many organisations that have implemented Transaction Cost Analytics (TCA) to gain price efficiency across venues in real time are now looking to enhance these analytics with the opportunity cost of missed, early or late trades.
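A hedged sketch of how TCA might be extended this way, under simple assumptions: the decision price is the benchmark, a missed trade is marked against the price at the end of the measurement horizon, and early or late fills show up as slippage against the decision price. The function and its parameters are illustrative, not a standard TCA formula.

```python
def opportunity_cost(side: str, qty: int, decision_price: float,
                     execution_price: float | None, final_price: float) -> float:
    """Illustrative per-trade opportunity cost, in currency units.

    Executed trades incur slippage between decision and fill (early or
    late execution shows up here); missed trades (execution_price is None)
    incur the price move the desk failed to capture over the horizon.
    """
    sign = 1 if side == "buy" else -1
    benchmark = final_price if execution_price is None else execution_price
    return sign * (benchmark - decision_price) * qty

# A buy decided at 100.00 but never executed, with the stock ending the
# horizon at 100.50, carries a 0.50-per-share opportunity cost:
print(opportunity_cost("buy", 1000, 100.00, None, 100.50))  # 500.0
```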

As a result, a new content set is emerging in the process of executing a trade. We are seeing the inclusion of data collected and analysed from internal systems, such as optimal latency, TCA metrics and system capacity, to name a few. And that is in addition to the consumption of reference data, market data, risk margins and limits, client policies, collateral requirements, counterparty measures and the other traditional factors usually included in trade decisions and execution.
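As a sketch only, that combined content set could be modelled as a single decision context carrying the traditional inputs alongside the newer internal-systems metrics. The field names, types and defaults below are assumptions for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class TradeDecisionContext:
    """Hypothetical snapshot of everything consulted before routing an order."""
    # Traditional inputs
    reference_data: dict = field(default_factory=dict)
    market_data: dict = field(default_factory=dict)
    risk_limits: dict = field(default_factory=dict)
    client_policy: dict = field(default_factory=dict)
    collateral: dict = field(default_factory=dict)
    counterparty_measures: dict = field(default_factory=dict)
    # Emerging internal-systems content set
    optimal_latency_us: float = 0.0   # a target band, not simply "as low as possible"
    tca_metrics: dict = field(default_factory=dict)
    system_capacity: float = 1.0      # fraction of execution capacity currently free
```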

The net result is obvious. Organisations can achieve improved trade performance by understanding more about every component involved in the trade lifecycle, from a trader's knowledge to the available capacity on the server running the execution engine. This, in turn, can result in increased customer loyalty and even flow, as well as a more targeted and efficient approach to improving those areas of the trade lifecycle that need attention most, rather than a generic, continued drive for lower latency. The trick is to ensure your organisation captures as much relevant and granular data as possible whilst not relying on bespoke software development for the analysis and implementation of new capabilities. The result: efficiency improves and cost declines.
