Operational analytics are beginning to appear in trading workflows, with the aim of mimicking the success of latency measurement in optimising trading connections for business advantage. But are operational analytics a step too far, or can they, too, deliver benefits? These questions and more were raised during an expert panel session at last week’s A-Team Group Intelligent Trading Summit.
Moderating the panel, Peter Farley, director at A-Team Group, asked how operational analytics are developing and where their potential lies. Moving on from established latency measurement, panel members described the need for predictive rather than reactive analytics, network visibility and data visualisation to support trade flow monitoring, and the possibility of moving trading systems and monitoring solutions into the cloud to reduce total cost of ownership.
They also touched on extending latency measurement beyond monitoring and compliance, and exploiting it for operational gain. Gil Tene, chief technology officer and co-founder of Azul Systems, explained: “Some firms are playing with the possibilities of latency, but most firms are still trying to get a handle on it. There is still room for improvement in latency measurement for applications such as risk management.”
While latency and other performance measures may improve operational performance, Kevin Covington, CEO of ITRS Group, warned that operational data is not yet robust enough to be fed into business systems that make decisions based on the validity of that data.
Henry Young, founder and CEO of TS-Associates, agreed with Covington, saying: “Data from monitoring solutions is used before and after events for infrastructure optimisation and to facilitate the build of an ideal electronic trading environment. Some people use real-time latency data for functions such as smart order routing, but these are decisions about sending orders to market, not business decisions.”
Turning to best practice in operational analytics, Charles Barry, chief technology officer at Jolata, noted the need for accurate timing both in the trading network and at the process point to achieve real insight into what is happening. Young suggested the game is not just in monitoring problems in workflow, but also in monitoring the application layers that underlie the workflow.
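To make the timing point concrete, the approach Barry describes generally amounts to timestamping the same message at two capture points and reducing the per-message deltas to percentile statistics. The sketch below is illustrative only, not any panellist’s product: the message IDs, nanosecond timestamps, and function names are assumptions, and it presumes the clocks at both capture points are already synchronised (e.g. via a precision time protocol).

```python
import math

# Minimal sketch: deriving per-message hop latency and percentile
# statistics from timestamped observations at two capture points.
# Assumes both capture-point clocks are synchronised; all IDs and
# nanosecond timestamps below are illustrative.

def hop_latencies(ingress, egress):
    """Match messages seen at both capture points and return
    per-message latency in nanoseconds (ingress order preserved)."""
    return [egress[msg_id] - ts for msg_id, ts in ingress.items()
            if msg_id in egress]

def percentile(samples, p):
    """Nearest-rank percentile of a list of latency samples."""
    ranked = sorted(samples)
    idx = max(0, math.ceil(p / 100.0 * len(ranked)) - 1)
    return ranked[idx]

# Illustrative timestamps (ns) for five orders at two capture points.
ingress = {"ord1": 100, "ord2": 250, "ord3": 400, "ord4": 560, "ord5": 700}
egress  = {"ord1": 180, "ord2": 340, "ord3": 470, "ord4": 700, "ord5": 790}

lat = hop_latencies(ingress, egress)
print("p50:", percentile(lat, 50), "ns  p99:", percentile(lat, 99), "ns")
```

The percentile view matters here because a mean hides exactly the tail events (the 140 ns outlier above) that monitoring at the process point is meant to surface.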
Answering a question from the audience about the disruptive nature of decentralisation, Covington said: “More outsourcing and more software-as-a-service solutions are being plugged into the trading environment, which means we are losing visibility of some areas, although the need is to manage those areas even though they are not in the organisation.”
Finally, considering the increasing amount of data in trading operations and its inherent risk, the panellists agreed that analytics require regular reality checks and must always be tested in a firm’s own trading environment.