The rapid evolution of artificial intelligence (AI) and machine learning (ML) technologies has the potential to revolutionise the way financial markets operate, not least by providing new opportunities for enhancing trading analytics and research. So how can firms effectively integrate AI into their trading workflows? What are the challenges and opportunities associated with leveraging data for alpha generation, and how has the emergence of generative AI reshaped the landscape?
These questions were amongst those covered in a lively and thought-provoking panel session at the A-Team Group’s recent London event, Buy AND Build: The Future of Capital Markets Technology. The discussion, entitled ‘How to successfully integrate and deploy AI in trading analytics and research’ was moderated by Nicola Hedley, Co-Founder of Dark Spark Consulting, and featured Matthew Hertz, Head of ML Technology at Man Group, Adam Ragol-Levy, Global Multi-Asset Product Manager at RBC Capital Markets, Joe Everitt, Managing Director, Electronic Trading at Stifel Financial, and Dr. Elliot Banks, Chief Product Officer at BMLL.

Trends in Trading Analytics and AI Deployment
The session commenced with one panellist pointing out that trading analytics has evolved into a billion-dollar industry and is expected to grow significantly by 2027. The integration of analytics is now essential across all stages of the trading lifecycle, from pre-trade analytics to execution strategies and transaction cost analysis (TCA). Traditionally, post-trade analytics dominated the scene, but the focus has shifted towards real-time data analysis to capture alpha and manage complex market microstructures effectively.
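To make the TCA idea above concrete, here is a minimal sketch of one widely used post-trade metric, implementation shortfall, which compares the prices actually achieved against the "arrival" price observed when the order was placed. The function name and inputs are illustrative, not drawn from any panellist's system.

```python
# Illustrative sketch: a basic transaction cost analysis (TCA) metric.
# Implementation shortfall measures slippage against the arrival price.

def implementation_shortfall_bps(arrival_price: float,
                                 fills: list[tuple[float, int]],
                                 side: str = "buy") -> float:
    """Return implementation shortfall in basis points.

    fills is a list of (price, quantity) executions.
    Positive values indicate a cost (worse than the arrival price).
    """
    total_qty = sum(qty for _, qty in fills)
    avg_price = sum(price * qty for price, qty in fills) / total_qty
    sign = 1 if side == "buy" else -1
    return sign * (avg_price - arrival_price) / arrival_price * 10_000

# Example: a buy order arriving at 100.00, filled in two slices.
cost = implementation_shortfall_bps(100.00, [(100.02, 500), (100.06, 500)])
```

Running the same calculation intraday rather than after the close is, in essence, the shift the panel described: the metric is unchanged, but it now feeds back into live strategy adjustments.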
Deep learning and reinforcement learning are becoming integral to AI-powered execution algos, informing decisions about trade placement, sizing, and execution. By monitoring market conditions and adapting in real-time, such AI-powered algos can offer competitive advantages, particularly in more challenging environments such as volatile markets. One panellist highlighted how integrating AI into traditional execution systems can also help to assess broker performance and detect trading anomalies.
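As a toy illustration of the adaptive behaviour described above (not a production algorithm, and far simpler than a trained reinforcement-learning policy), the sketch below shrinks the next child-order size as short-term realised volatility rises; a learned policy would make a similar decision from far richer state.

```python
import statistics

# Toy sketch: adapt child-order size to recent volatility, standing in
# for the kind of decision an AI-powered execution algo would make.

def child_order_size(recent_prices: list[float],
                     base_size: int = 1_000,
                     vol_floor: float = 0.0005) -> int:
    """Shrink the next child order as short-term volatility rises."""
    returns = [b / a - 1 for a, b in zip(recent_prices, recent_prices[1:])]
    vol = max(statistics.pstdev(returns), vol_floor)
    # Scale size inversely with volatility, capped at the base size.
    return min(base_size, int(base_size * vol_floor / vol))

calm   = child_order_size([100.00, 100.01, 100.00, 100.01, 100.00])
choppy = child_order_size([100.00, 100.50, 99.60, 100.80, 99.90])
# In the choppy market the rule trades in smaller slices.
```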
Another panellist, sharing the practical perspective of someone who actively trades in electronic markets, noted a shift in the role of execution teams, which now collaborate more closely with research and data teams to achieve better client outcomes and operational efficiencies. TCA, which once served as a post-trade evaluation tool, has transformed into an intraday process, enabling the team to adapt strategies and improve outcomes dynamically.
Ownership of data was identified as a key competitive advantage. By owning and analysing proprietary data, firms can develop unique insights and implement meaningful metrics to gauge performance, such as timing, sizing, and order types. A partnership with a data analytics provider enabled this speaker’s firm to enhance its trading efficiency. The process of learning and deploying analytics effectively has made data-driven trading more engaging and valuable.
Data Commoditisation and its Challenges
A debate on the commoditisation of data surfaced during the session, with some arguing that despite the increasing volume and availability of data, a significant amount of time is still spent “cleaning up” and normalising data before deriving any actionable insights. Particularly for machine learning models, having access to clean, consistent, and historical data is critical for developing reliable models. Without this, quants often find themselves spending the bulk of their time in data preparation rather than value generation.
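The "cleaning up" step the panellists referred to typically covers mundane but essential work such as deduplicating prints, discarding bad prices, and time-ordering records. A minimal, illustrative sketch (the field names are assumptions for the example):

```python
# Illustrative sketch of pre-model data clean-up: deduplicate ticks,
# drop bad prints, and sort by timestamp before feature engineering.

def clean_ticks(ticks: list[dict]) -> list[dict]:
    """Normalise raw tick records: drop exact duplicates and
    non-positive prices, then sort by timestamp."""
    seen = set()
    cleaned = []
    for t in sorted(ticks, key=lambda t: t["ts"]):
        key = (t["ts"], t["price"], t["size"])
        if t["price"] <= 0 or key in seen:
            continue
        seen.add(key)
        cleaned.append(t)
    return cleaned

raw = [
    {"ts": 2, "price": 100.1, "size": 50},
    {"ts": 1, "price": 100.0, "size": 10},
    {"ts": 1, "price": 100.0, "size": 10},   # duplicate print
    {"ts": 3, "price": -1.0,  "size": 20},   # bad price
]
clean = clean_ticks(raw)  # two valid ticks, time-ordered
```

In practice this logic runs over billions of records with far more edge cases, which is precisely why the panellists suggested smaller firms may prefer to buy pre-cleansed data rather than build this pipeline themselves.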
One panellist argued that the utility of data lies not in its volume but in its relevance to a specific use case. With a plethora of data types (e.g., level 1, level 2, level 3 data), the challenge is in focusing on data that directly supports the intended use case. For smaller firms without the resources to handle massive data operations, partnering with specialised analytics providers can open up new opportunities, offering pre-cleansed and pre-structured data, thus allowing for better and more efficient analytics capabilities.
AI, Machine Learning, and Generative AI
Part of the discussion clarified the distinctions between AI, machine learning, and generative AI. It was pointed out that the public perception of AI drastically changed in 2022, largely due to the rise of generative AI models like ChatGPT. While AI traditionally referred to techniques such as statistical modelling, regression, and time-series analysis, the advent of generative AI brought new capabilities, enabling models to create content such as text, images, and even code.
Despite the transformative potential of generative AI, panellists emphasised that traditional machine learning models still hold significant value, especially in systematic trading strategies. Techniques such as deep learning and reinforcement learning are fundamental to both the “older” AI models and newer generative AI systems. The critical difference lies in the applications, as the use cases for generative AI in trading are often more complex and less clearly defined compared to those for traditional AI models.
A point was made about the historical roots of many AI techniques, with some dating back to the 1970s. The current boom in AI is attributed to advancements in computational power rather than entirely new techniques.
Alternative Data and the Role of AI in Trading Analytics
One speaker explained that alternative data, encompassing non-traditional sources like satellite imagery, social media sentiment, and other unconventional data points, can provide additional market insights and uncover trading opportunities by evaluating factors that traditional market data might miss. For instance, monitoring foot traffic to retail stores via satellite imagery can provide clues about a company’s sales performance, thus impacting its stock price.
The integration of AI with these alternative data sets has the potential to enable firms to extract new insights and alpha. Generative AI in particular provides new opportunities to more easily analyse previously underutilised data sources, such as foreign-language news articles or voice recordings. While the traditional approach to building sentiment models might have taken weeks, generative AI allows for rapid experimentation and hypothesis testing, significantly reducing the time to insights.
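The generative-AI sentiment workflow described above can be sketched in a few lines. Note that `query_model` below is a placeholder standing in for whichever LLM endpoint a firm actually uses; it is stubbed here so the example is self-contained, and the prompt wording is purely illustrative.

```python
# Illustrative sketch of an LLM-based headline sentiment workflow.
# `query_model` is a stub for a real generative-AI API call.

def build_prompt(headline: str) -> str:
    return (
        "Classify the sentiment of this financial headline as "
        "POSITIVE, NEGATIVE, or NEUTRAL. Reply with one word.\n"
        f"Headline: {headline}"
    )

def query_model(prompt: str) -> str:
    # Stub: a real implementation would call an LLM endpoint here.
    return "NEGATIVE"

def headline_sentiment(headline: str) -> str:
    reply = query_model(build_prompt(headline)).strip().upper()
    # Guard against free-form model output ("hallucinated" labels).
    return reply if reply in {"POSITIVE", "NEGATIVE", "NEUTRAL"} else "NEUTRAL"

label = headline_sentiment("Retailer cuts full-year guidance")
```

The point the panellists made is about iteration speed: swapping the prompt, the label set, or the data source is a matter of minutes, where a traditional supervised sentiment model would require labelled training data and retraining.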
Risks and Critical Success Factors for AI Integration
An audience poll revealed that “hallucination,” or AI producing false information, was perceived as one of the top risks in AI adoption. One panellist argued that this concern is not fundamentally new; model errors have always existed, but the risks can be mitigated with proper education and controlled use cases. Data exfiltration—where sensitive information might be unintentionally shared with third-party AI providers—was highlighted as a more pressing concern. Agreements with AI providers to limit data sharing and usage were mentioned as ways to address this risk.
Data quality remains crucial to AI success, as models are only as good as the data they are trained on. A consistent, well-curated dataset ensures that models produce reliable outputs. The iterative cycle of cleaning, analysing, training, and testing the data should be an ingrained part of developing any AI or ML model.
Buy AND Build: Strategic Investments in AI
In keeping with the overall theme of the day, panellists debated the pros and cons of building in-house AI capabilities versus buying external solutions. For many firms, adopting a mixed strategy—buying a foundational tool and building on top of it—offers a practical route to AI integration. Smaller trading desks might find that purchasing existing frameworks and customising them to fit their needs is more efficient than developing proprietary AI models from scratch. As the technology evolves, firms need to assess whether AI fits into their core operations or serves as a supplementary tool for enhancing specific business processes.
One of the key takeaways from the panel was the emphasis on AI as a tool for enhancing human decision-making rather than replacing it. While AI and machine learning can uncover patterns and insights that humans might miss, the role of human judgment and review remains crucial, particularly in client-facing contexts. AI’s value lies in augmenting human expertise, allowing teams to make more informed decisions quickly and effectively.
The panel discussion offered a comprehensive overview of AI’s evolving role in trading analytics and research. From the surge in real-time analytics to the potential of generative AI, it is evident that AI and ML technologies are transforming trading strategies and operations. However, successful integration hinges on clean data, clearly defined use cases, and a strategic balance between building and buying AI capabilities. As the technology continues to develop, the ability to deploy AI effectively will become a critical differentiator in the trading space.