Intelligent Trading Summit: Leveraging Operational Analytics

Operational analytics are beginning to appear in trading workflows, aiming to replicate the success of latency measurement in optimising trading connections for business advantage. But are operational analytics a step too far, or can they, too, deliver benefits? These questions and more were raised during an expert panel session at last week’s A-Team Group Intelligent Trading Summit.

Moderating the panel, Peter Farley, director at A-Team Group, asked how operational analytics are developing and where their potential lies. Moving on from established latency measurement, panel members described the need for predictive rather than reactive analytics, network visibility and data visualisation to support trade flow monitoring, and the possibility of moving trading systems and monitoring solutions into the cloud to reduce total cost of ownership.
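
By way of illustration only, since the panel did not describe a specific method, a “predictive rather than reactive” check might extrapolate a short trend in an operational metric, such as order gateway latency, and warn before a threshold is actually breached. The metric, threshold and trend rule in this sketch are all assumptions.

```python
# Hypothetical sketch: warn before a latency threshold is breached by
# extrapolating a short linear trend over recent samples (illustrative only).
from collections import deque


def predict_breach(samples_us, threshold_us, horizon=10):
    """Return True if a least-squares trend over the samples is projected to
    cross threshold_us within `horizon` further samples."""
    n = len(samples_us)
    if n < 2:
        return False
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(samples_us) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, samples_us))
    den = sum((x - mean_x) ** 2 for x in xs)
    slope = num / den
    projected = samples_us[-1] + slope * horizon
    return projected > threshold_us


window = deque(maxlen=50)                          # rolling window of recent samples
for latency_us in (120, 125, 131, 140, 152, 168):  # illustrative gateway latencies (microseconds)
    window.append(latency_us)
    if predict_breach(list(window), threshold_us=250):
        print("warning: latency trending towards the 250us threshold")
```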

They also touched on extending latency measurement beyond monitoring and compliance, and exploiting it for operational gain. Gil Tene, chief technology officer and co-founder of Azul Systems, explained: “Some firms are playing with the possibilities of latency, but most firms are still trying to get a handle on it. There is still room for improvement in latency measurement for applications such as risk management.”
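
To show what “getting a handle on” latency measurement can involve in practice, the sketch below reports tail percentiles rather than an average, since a single slow outlier is invisible in a mean. It is a generic illustration, not a description of any panellist’s tooling.

```python
# Generic illustration: report latency percentiles rather than an average,
# because tail behaviour is what matters for risk and trading applications.
def latency_percentiles(samples_us, percentiles=(50, 99, 99.9)):
    """Return {percentile: value} using the nearest-rank method."""
    ordered = sorted(samples_us)
    report = {}
    for p in percentiles:
        rank = max(1, int(round(p / 100 * len(ordered))))  # nearest-rank index
        report[p] = ordered[rank - 1]
    return report


samples = [95, 102, 98, 110, 97, 2500, 101, 99, 105, 100]  # one slow outlier
print(latency_percentiles(samples))  # the 99th percentile exposes what the mean hides
```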

While latency and other performance measures may improve operational performance, Kevin Covington, CEO of ITRS Group, warned that operational data, without significant improvement, is not yet fit to feed business systems that make decisions based on the validity of that data.

Henry Young, founder and CEO of TS-Associates, agreed with Covington, saying: “Data from data monitoring solutions is used before and after events for infrastructure optimisation and to facilitate the build of an ideal electronic trading environment. Some people use real-time latency data for functions such as smart order routing, but these are decisions about sending orders to market, not business decisions.”
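
To make the smart order routing example concrete, the hypothetical sketch below biases venue selection towards the venue with the lowest recently observed round-trip latency. The venue names, figures and selection rule are assumptions for illustration, not a description of any firm’s router.

```python
# Hypothetical sketch: use recent round-trip latency observations as an
# operational input to venue selection (a routing decision, not a business
# decision about the order itself). Venue names and figures are illustrative.
from statistics import median


def choose_venue(latency_history_us):
    """Return the venue with the lowest median observed round-trip latency."""
    scored = {venue: median(samples)
              for venue, samples in latency_history_us.items() if samples}
    return min(scored, key=scored.get)


history = {
    "VENUE_A": [310, 295, 330, 305],
    "VENUE_B": [270, 640, 260, 2900],   # occasionally fast, but with a heavy tail
    "VENUE_C": [400, 410, 395, 405],
}
print(choose_venue(history))  # VENUE_A: its median beats VENUE_B's tail-inflated median
```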

Turning to best practice in operational analytics, Charles Barry, chief technology officer at Jolata, noted the need for accurate timing both in the trading network and at the process point to gain real insight into what is happening. Young suggested the game is not just about monitoring problems in the workflow, but also about monitoring the application layers that underlie it.
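
A simple way to see why timing is needed both in the network and at the process point is to decompose end-to-end latency into network and application segments, as in the hypothetical sketch below. The field names and the assumption of synchronised clocks (for example via PTP) are illustrative.

```python
# Illustrative decomposition of end-to-end latency using timestamps captured
# at network taps and inside the application process. Assumes the capture
# points share a synchronised clock (e.g. PTP); field names are hypothetical.
from dataclasses import dataclass


@dataclass
class OrderTimestamps:
    wire_in_ns: int   # order packet seen at the ingress capture point
    app_in_ns: int    # order decoded by the application
    app_out_ns: int   # response handed back to the network stack
    wire_out_ns: int  # response seen at the egress capture point


def breakdown(ts: OrderTimestamps) -> dict:
    """Split end-to-end latency (nanoseconds) into network and application segments."""
    return {
        "inbound_network": ts.app_in_ns - ts.wire_in_ns,
        "application": ts.app_out_ns - ts.app_in_ns,
        "outbound_network": ts.wire_out_ns - ts.app_out_ns,
        "end_to_end": ts.wire_out_ns - ts.wire_in_ns,
    }


ts = OrderTimestamps(wire_in_ns=0, app_in_ns=12_000, app_out_ns=85_000, wire_out_ns=97_000)
print(breakdown(ts))  # shows whether the network or the application layer dominates
```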

Answering a question from the audience about the disruptive nature of decentralisation, Covington said: “More outsourcing and more software-as-a-service solutions are being plugged into the trading environment, which means we are losing visibility of some areas, although the need is to manage those areas even though they are not in the organisation.”

Finally, considering the increasing amount of data in trading operations and its inherent risk, the panellists agreed that analytics require regular reality checks and must always be tested in a firm’s own trading environment.
