About A-Team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Intelligent Trading Summit: Leveraging Operational Analytics


Operational analytics are beginning to appear in trading workflows, with the aim of mimicking the success of latency measurement and its ability to optimise trading connections for business advantage. But are operational analytics a step too far, or can they, too, deliver benefits? These questions and more were raised during an expert panel session at last week’s A-Team Group Intelligent Trading Summit.

Moderating the panel, Peter Farley, director at A-Team Group, asked how operational analytics are developing and where their potential lies. Moving on from established latency measurement, panel members described the need for predictive rather than reactive analytics, network visibility and data visualisation to support trade flow monitoring, and the possibility of moving trading systems and monitoring solutions into the cloud to reduce total cost of ownership.

They also touched on extending latency beyond monitoring and compliance, and exploiting it for operational gains. Gil Tene, chief technology officer and co-founder of Azul Systems, explained: “Some firms are playing with the possibilities of latency, but most firms are still trying to get a handle on it. There is still room for improvement in latency measurement for applications such as risk management.”

While latency and other performance measures may improve operational performance, Kevin Covington, CEO of ITRS Group, warned that, without significant improvement, operational data is not yet fit to be fed into business systems that make decisions based on the validity of that data.

Henry Young, founder and CEO of TS-Associates, agreed with Covington, saying: “Data from data monitoring solutions is used before and after events for infrastructure optimisation and to facilitate the build of an ideal electronic trading environment. Some people use real-time latency data for functions such as smart order routing, but these are decisions about sending orders to market, not business decisions.”

Turning to best practice in operational analytics, Charles Barry, chief technology officer at Jolata, noted the need for accurate timing both across the trading network and at the process point to achieve real insight into what is happening. Young suggested the game lies not just in monitoring problems in the workflow, but also in monitoring the application layers that underlie it.

Answering a question from the audience about the disruptive nature of decentralisation, Covington said: “More outsourcing and more software-as-a-service solutions are being plugged into the trading environment, which means we are losing visibility of some areas, although the need is to manage those areas even though they are not in the organisation.”

Finally, considering the increasing amount of data in trading operations and its inherent risk, the panellists agreed that analytics require regular reality checks and must always be tested in a firm’s own trading environment.

