Trading technology and infrastructure specialist Pico has launched the Corvil Electronic Trading Data Warehouse, which aims to provide visibility into transaction execution quality and to correlate client trading behaviour with execution path and counterparty performance.
The solution, offered as a standalone software product, streams nanosecond-timestamped data from Corvil network instrumentation to deliver real-time visibility at a granular level, from individual trades to aggregate transaction outcomes, giving clients a lens into technology stack performance and its impact on trading. The launch follows the release of Corvil’s flagship capture and analytics appliance in October 2021, which delivers sustained 100Gbps real-time processing.
“Many firms have been using Corvil Analytics to ingest market data and order flow, decoding it, normalising it and then streaming it to external time series databases,” says Donal O’Sullivan, Managing Director of Product Management at Pico. “But we found that there was a gap in terms of extracting maximum value from that data; people didn’t always know what to do with it. This new product consumes those streams, performs all sorts of multi-dimensional analysis, and makes the data available in a very consumable, flexible form.”
“If you are an execution broker and you use a smart order router for example, an order might be split into thousands of executions, coming back from different markets,” explains Roland Hamann, Pico’s Chief Technology Officer and Head of APAC. “And if you have three or four different algorithms that are working that order, it can get very complicated to reconstruct which part of that order got executed, where, when and how. This software does all the analytics and the merging together of the data behind the scenes for you. And it also allows you to enrich the data, with additional reference data for example.”
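The order reconstruction Hamann describes can be illustrated with a small sketch. This is not Pico's implementation or schema; the record fields, identifiers, and figures below are hypothetical, showing only the general idea of merging child executions from multiple venues and algorithms back into a parent-order view:

```python
from collections import defaultdict

# Hypothetical child-execution records. In the product described above, such
# records would be decoded and normalised from nanosecond-timestamped streams;
# all field names and values here are illustrative.
executions = [
    {"parent_id": "ORD-1", "venue": "XNAS", "algo": "VWAP", "qty": 300, "price": 101.02},
    {"parent_id": "ORD-1", "venue": "ARCX", "algo": "VWAP", "qty": 200, "price": 101.05},
    {"parent_id": "ORD-1", "venue": "BATS", "algo": "POV",  "qty": 500, "price": 101.01},
]

def reconstruct(parent_id, fills):
    """Merge child executions back into a summary of the parent order."""
    children = [f for f in fills if f["parent_id"] == parent_id]
    total_qty = sum(f["qty"] for f in children)
    # Volume-weighted average execution price across all child fills.
    vwap = sum(f["qty"] * f["price"] for f in children) / total_qty
    by_venue = defaultdict(int)
    for f in children:
        by_venue[f["venue"]] += f["qty"]
    return {"qty": total_qty, "avg_px": round(vwap, 4), "venues": dict(by_venue)}

print(reconstruct("ORD-1", executions))
```

A real system must also handle cancels, amendments, and partial fills arriving out of order, which is where the "behind the scenes" merging Hamann mentions becomes genuinely hard.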
Artificial intelligence also features, says O’Sullivan. “By inferring the relationship between trading activity and servers in the trading environment, it builds a model of that environment, in order to report against it. We also use baselining and neural net algorithms across a wide range of metrics, such as order response times, message microbursts, network latency and so on, to spot anomalies or deviations and flag them up, either on a dashboard or via other alerting mechanisms.”
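The baselining O'Sullivan refers to can be sketched in its simplest form: compare each new observation against a rolling statistical baseline and flag large deviations. Pico's product layers neural networks on top of this; the snippet below shows only the elementary rolling z-score idea, with made-up latency figures:

```python
import statistics

def flag_anomalies(samples, window=20, threshold=3.0):
    """Flag indices whose value deviates more than `threshold` standard
    deviations from the rolling baseline of the previous `window` samples."""
    anomalies = []
    for i in range(window, len(samples)):
        baseline = samples[i - window:i]
        mean = statistics.mean(baseline)
        stdev = statistics.stdev(baseline)
        if stdev and abs(samples[i] - mean) > threshold * stdev:
            anomalies.append(i)
    return anomalies

# Illustrative order response times in microseconds, with a spike at the end.
latencies = [50 + (i % 3) for i in range(30)] + [400]
print(flag_anomalies(latencies))  # the final spike is flagged: [30]
```

The same mechanism applies to any of the metrics mentioned, from message microbursts to network latency, with the flagged indices feeding a dashboard or alerting pipeline.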
The architecture of the data warehouse enables it to run across platforms, says O’Sullivan. “It will run on bare metal, or on servers in your data centre, or even in AWS or Google Cloud, if you want to deploy it on a cloud environment.”
The system uses the Hadoop Distributed File System (HDFS) to store data, with full support for the Kafka message bus, enabling query and extraction of the data via API as well as through Pico’s own dashboards.
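A consumer of such a Kafka feed would typically decode normalised records carrying nanosecond timestamps. The sketch below is purely illustrative; the topic payloads, event names, and field names are assumptions, not Pico's actual schema, and the messages are stubbed in place of a live Kafka consumer:

```python
import json
from datetime import datetime, timezone

# Stubbed payloads standing in for messages from a Kafka topic; the field
# names ("ts_ns", "event", "venue") are hypothetical.
raw_messages = [
    b'{"event": "order_ack", "ts_ns": 1700000000123456789, "venue": "XNAS"}',
    b'{"event": "fill", "ts_ns": 1700000000123999999, "venue": "XNAS"}',
]

def decode(message):
    """Parse a record and split its nanosecond timestamp into a UTC
    second-resolution datetime plus a sub-second nanosecond offset."""
    record = json.loads(message)
    ns = record["ts_ns"]
    record["time"] = datetime.fromtimestamp(ns // 1_000_000_000, tz=timezone.utc)
    record["ns_offset"] = ns % 1_000_000_000
    return record

for msg in raw_messages:
    rec = decode(msg)
    print(rec["event"], rec["time"].isoformat(), rec["ns_offset"])
```

Keeping the nanosecond remainder separate matters here: standard datetime types resolve only to microseconds, which would silently discard the precision that nanosecond-timestamped instrumentation provides.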