
Opinion: The Evolution of Intelligent Trading


By Mike Powell, Managing Director, Enterprise, Thomson Reuters

I was recently invited to take part in a panel at the Intelligent Trading Summit in London – a fascinating session exploring the current and future state of trading, industry trends, and the supporting technology and architecture.

Alongside Richard Bell, Fixed Income eTrading Latency, Performance & Big Data at BNP Paribas, Alessandro Petroni, Senior Principal Architect for Financial Services at TIBCO Software, and Stuart Grant, EMEA Business Development Manager for Financial Services at SAP, we opened the day's event with a wide-ranging discussion to set the scene for the rest of the day.

Thomson Reuters sits at the heart of the capital markets industry, giving us a unique perspective on the world of trading. Below are some of the personal views I shared with the audience in response to the panel questions:

What do we mean by Intelligent Trading?

This is a broad question that will mean different things to different people depending on where they sit in the food chain. However, having been involved in many of our enterprise initiatives over recent years, I can see several areas where trading has evolved and is still evolving:

Improved liquidity discovery across both lit and unlit venues

Market fragmentation has been a major trend over recent years, initially within the exchange-traded equity markets following the introduction of Reg NMS in the US and MiFID in Europe around 2006/2007. The rise of dark pools in tandem with these lit venues has broadened the potential liquidity available to equity trading firms in major markets, and we have seen fragmentation spread to other asset classes such as FX and, more recently, interest rate swaps with the introduction of SEFs (Swap Execution Facilities).

This benefits the market by providing increased competition, deeper liquidity and narrower spreads, but it makes capturing a complete picture of liquidity harder and drives up the cost of participation through the technology spend needed to connect to and trade on multiple venues. The industry will continue to require better and more cost-effective tools to access the broadest view of liquidity.
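
To make the idea of a consolidated liquidity picture concrete, here is a minimal sketch in Python of aggregating top-of-book quotes from several venues into a single best bid and offer. The venue names and prices are invented, and a real aggregator would also have to contend with feed normalisation, timestamps and depth of book:

```python
from dataclasses import dataclass

@dataclass
class Quote:
    """Top-of-book quote from a single venue (illustrative values only)."""
    venue: str
    bid: float
    ask: float
    bid_size: int
    ask_size: int

def consolidated_bbo(quotes: list[Quote]) -> tuple[Quote, Quote]:
    """Return the venues showing the best bid and the best offer."""
    best_bid = max(quotes, key=lambda q: q.bid)
    best_ask = min(quotes, key=lambda q: q.ask)
    return best_bid, best_ask

# Hypothetical snapshot across three venues; a quoting dark venue
# would feed into the same picture in the same way.
snapshot = [
    Quote("VENUE_A", bid=100.02, ask=100.05, bid_size=500, ask_size=300),
    Quote("VENUE_B", bid=100.03, ask=100.06, bid_size=200, ask_size=400),
    Quote("VENUE_C", bid=100.01, ask=100.04, bid_size=800, ask_size=100),
]

bid, ask = consolidated_bbo(snapshot)
print(f"Best bid {bid.bid} on {bid.venue}, best offer {ask.ask} on {ask.venue}")
```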

Improved execution quality and avoidance of price slippage

This has always been an issue for the industry, and improved liquidity goes some way towards addressing the challenge. However, even in highly liquid markets, the speed and volume of trading mean price erosion on large orders can still significantly impact profitability. The industry will continue to look for solutions that provide access to the broadest set of liquidity, evolve algorithms to further minimise market impact, and invest in high-performance infrastructure to reduce latency and improve execution efficiency.
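
As a rough illustration of slippage, the snippet below (all numbers invented) computes the volume-weighted average fill price of a large buy order that consumes several price levels, and expresses the drift from the arrival price in basis points:

```python
# Illustrative slippage calculation: a large buy order "walks the book",
# so the average fill price drifts above the price seen at arrival.
arrival_price = 100.00  # hypothetical mid price when the order was placed

# (price, quantity) fills as the order consumes successive price levels
fills = [(100.00, 2_000), (100.02, 3_000), (100.05, 5_000)]

total_qty = sum(qty for _, qty in fills)
avg_fill = sum(price * qty for price, qty in fills) / total_qty

slippage_bps = (avg_fill - arrival_price) / arrival_price * 10_000
print(f"Average fill: {avg_fill:.4f}, slippage: {slippage_bps:.1f} bps")
# -> Average fill: 100.0310, slippage: 3.1 bps
```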

Deeper analysis of broader sets of data

The story potentially gets more interesting as the industry starts to solve the challenges of what is often dubbed 'big data': the opportunity to access, integrate, mine and find meaning in the broadest set of content, be it structured or unstructured, public or proprietary. Thomson Reuters has driven some interesting initiatives in the machine-readable news space, providing sentiment analysis and relevance scoring across multiple real-time news sources and Internet content.

Automated analysis of other forms of high-volume, value-added data offers opportunities to those smart enough to identify patterns, signals and correlations from which they can extract value, be it social media, patent filings, political risk indices, ownership data, or a myriad of other potential sources.

Managing vast amounts of disparate data, and enabling customers to aggregate, normalise, mine, analyse and turn that data into valuable information, will be among the intelligent trading skills required in future. Our challenge is not only to deliver the broadest and deepest universe of information, but also to provide the intellectual property and tools that enable firms to integrate their own proprietary data.
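
The sentiment analysis services mentioned above are built on proprietary models; purely to show the shape of the problem, here is a toy word-list scorer run over invented headlines, not the methodology of any commercial service:

```python
# Toy headline sentiment scorer - a simple word-list approach.
POSITIVE = {"beats", "upgrade", "growth", "record", "strong"}
NEGATIVE = {"misses", "downgrade", "loss", "probe", "weak"}

def sentiment(headline: str) -> float:
    """Score in [-1, 1]: net share of sentiment-bearing words that are positive."""
    words = headline.lower().split()
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    return 0.0 if pos + neg == 0 else (pos - neg) / (pos + neg)

for h in ["Acme beats forecasts on record growth",
          "Regulator opens probe into Acme accounting"]:
    print(f"{sentiment(h):+.2f}  {h}")
```

A production service would layer entity recognition, relevance scoring and trained language models on top, but the output contract is similar: a machine-consumable score attached to each story in real time.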

Cross-trading platforms

Many buy-side firms and hedge funds have diverse investment strategies and the evolution of cross-asset trading platforms has been on the agenda for a few years.

However, three factors are driving renewed focus in this area: firstly, the desire for greater agility, given market volatility since 2008 and the global search for investment returns in challenging markets; secondly, as markets and asset classes increasingly automate, the aspiration of a single platform becomes more realistic; and thirdly, firms are looking to reduce operational costs, because the price tag associated with running multiple platforms is too high.

Along with the need to aggregate liquidity and improve processing efficiency of trades through the middle and back office, the desire to break down asset-class silos is compounded by the requirement for a real-time, holistic view of risk.

Cost management

Underpinning all of the above is the unrelenting pressure to reduce operational costs. Increased regulatory and reporting obligations are creating significant new costs, so something has to give in other areas. As customers strive to improve their competitiveness, respond to market changes and enhance their client offering, while aggressively managing expenditure to maintain profitability, they are forced into fresh thinking to solve business challenges. Should firms consider vendor solutions rather than internal builds, review the opportunities offered by managed services and cloud technology, or treat vendors as business partners rather than suppliers of product? This appears to be happening in many areas.

Do you agree that it’s no longer simply enough to be fastest, or does that still apply in certain markets or situations?

The latency arms race is not as intense as it once was – not necessarily because speed is no longer important, but perhaps because low-latency technology such as FPGA has become more mainstream and vendor solutions extremely competitive. In parallel, regulatory changes have focused the sell-side more on agency broking for client order flow than on proprietary trading on their own books. This has produced something of a split in the latency market between 'good enough' for the execution desks on the sell-side and the genuine HFT-focused proprietary trading firms. For the sell-side, cost is one of several considerations and latency is not the singular focus. So while no longer an 'arms race', performance is still highly relevant, but driven by business model choices and a greater understanding of return on investment.

What is the market appetite for intelligent trading architectures today? How do we expect this to change over the next 12-18 months?

Appetite will increase as firms look to adapt to a post-credit-crisis world and regain ground lost through frozen IT investment over the last few years. New economic realities will, however, force a rethink of legacy approaches. One example of an opportunity we are discussing with customers is 'testing as a service' – lifting back-testing, simulation and analysis services into a virtual or cloud environment.

Given the volume of content and the cost of capturing, distributing and storing data, there is considerable interest in whether it is viable to create a shared 'on-demand' environment where customers can test applications and algos rather than each building their own. As regulatory pressure increases in certain markets to test algos and trading engines before they move into production, we see increasing appetite for managed solutions providing this type of capability.
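
At its simplest, a back-test replays historical data through a strategy and measures the hypothetical outcome; a shared on-demand environment would host loops like this at scale against a common data store. Below is a deliberately naive sketch using an invented price series and a simple moving-average rule:

```python
# Minimal back-test loop: replay historical prices through a naive
# moving-average rule and track hypothetical P&L.
prices = [100, 101, 103, 102, 104, 107, 106, 108, 105, 109]  # invented series

def backtest(prices: list[float], window: int = 3) -> float:
    pnl, position, entry = 0.0, 0, 0.0
    for i in range(window, len(prices)):
        ma = sum(prices[i - window:i]) / window
        if position == 0 and prices[i] > ma:    # go long above the average
            position, entry = 1, prices[i]
        elif position == 1 and prices[i] < ma:  # exit below the average
            pnl += prices[i] - entry
            position = 0
    if position:                                # mark open position to last price
        pnl += prices[-1] - entry
    return pnl

print(f"Hypothetical P&L: {backtest(prices):+.2f}")
```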

What tools are available to help in designing, building and implementing an intelligent trading architecture?

As markets automate, performance transparency is key – trading infrastructure becomes increasingly critical as the volume and velocity of trades put greater stress on systems, and performance degradation or failure can have significant impact. Over the last few years we have seen the evolution of performance dashboards in support of trading desks – covering everything from fill rates and latency to system utilisation and network capacity.

Within our Elektron Managed Service sites we use market-leading tools such as ITRS and Gekko Inxite to provide infrastructure performance health-checking and enable proactive monitoring of our hosted customer solutions. Proactive diagnostics drive genuine service improvement, identifying potential issues before they impact customers rather than waiting for them to call the help desk when they experience a problem.
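
The tools named above are commercial products with their own interfaces; as a generic illustration of the kind of check such dashboards run, this sketch computes latency percentiles over a window of synthetic measurements and flags a breach of an invented threshold before anyone needs to call:

```python
import random
import statistics

# Hypothetical per-message latencies in microseconds for one window.
random.seed(42)
latencies_us = [random.gauss(250, 40) for _ in range(10_000)]

# quantiles(n=100) returns the 99 percentile cut points.
q = statistics.quantiles(latencies_us, n=100)
p50, p99 = q[49], q[98]

P99_THRESHOLD_US = 400  # invented service-level threshold

print(f"p50={p50:.0f}us p99={p99:.0f}us")
if p99 > P99_THRESHOLD_US:
    print("ALERT: p99 latency breach - investigate before customers notice")
```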

What performance challenges do we face? Is there a trade-off between intelligence and speed/performance?

Yes, and it largely comes down to business model and what a trading firm is trying to achieve. The key trade-off is between velocity and volume: the higher the volume of data and the richer the messages, the more challenging it is to squeeze latency performance.

A genuine HFT firm will most likely leverage proprietary solutions for feed handlers and market gateways – not necessarily because they have better technologists than vendors, but because they can achieve higher performance by focusing on the limited data and functionality required to support their business model. They are trying to solve for a specific use case. Where a more sophisticated trading strategy requires broader content, richer message payloads and deeper data sets, the potential impact on latency is greater.
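
One way to see the velocity/volume trade-off is to time the decoding of a lean, fixed-layout message against a rich, nested payload. The sketch below is entirely synthetic, but the gap it demonstrates is the same pressure feed-handler designers face:

```python
import json
import struct
import timeit

# Lean message: fixed binary layout of the kind an HFT feed handler favours.
lean = struct.pack("<Qdd", 1700000000, 100.02, 100.05)  # timestamp, bid, ask

# Rich message: JSON payload with nested, descriptive fields.
rich = json.dumps({
    "timestamp": 1700000000, "symbol": "ACME", "bid": 100.02, "ask": 100.05,
    "depth": [{"px": 100.02 - i * 0.01, "qty": 100 * i} for i in range(10)],
    "flags": {"auction": False, "halted": False},
}).encode()

t_lean = timeit.timeit(lambda: struct.unpack("<Qdd", lean), number=100_000)
t_rich = timeit.timeit(lambda: json.loads(rich), number=100_000)
print(f"lean decode: {t_lean:.3f}s, rich decode: {t_rich:.3f}s "
      f"({t_rich / t_lean:.0f}x slower)")
```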

What will be the next breakthrough in the Intelligent Trading space?

It has to be in the domain of data management – real-time processing and analysis of large volumes of disparate content sets, both structured and unstructured. Whilst financial services firms have vast experience in managing large volumes of fast-updating data, I believe they are still relatively immature in the big-data arena and could learn a great deal from other industries and academia, in fields ranging from consumer behavioural analysis to the processing of astronomical data points.

Specifically, I think there are significant opportunities to be had from the democratisation of unstructured content such as news, research, social media and text analytics. What we call machine-readable news has been around for a while (we entered this market in 2006 and have a well-regarded suite of low-latency news and sentiment analysis services for application consumption), but it has been a relatively niche market. I believe it will increasingly become the norm as the industry adopts high-volume data analysis tools and techniques from other industries, and the provision of sentiment analysis in Eikon is an important part of this process.

