About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Pico Launches Corvil Electronic Trading Data Warehouse for Real-Time Execution Analytics

Trading technology and infrastructure specialist Pico has launched the Corvil Electronic Trading Data Warehouse, which aims to provide visibility into transaction execution quality by correlating client trading behaviour with execution path and counterparty performance.

The solution, offered as a standalone software product, streams nanosecond-timestamped data from Corvil network instrumentation to deliver real-time visibility at a granular level, from individual trades to aggregate transaction outcomes, providing clients with a lens into technology stack performance and its impact on trading. The launch follows the release of Corvil’s new flagship 100Gbps capture and analytics appliance in October 2021, which delivers sustained 100Gbps real-time processing.

“Many firms have been using Corvil Analytics to ingest market data and order flow, decoding it, normalising it and then streaming it to external time series databases,” says Donal O’Sullivan, Managing Director of Product Management at Pico. “But we found that there was a gap in terms of extracting maximum value from that data; people didn’t always know what to do with it. This new product consumes those streams, performs all sorts of multi-dimensional analysis, and makes the data available in a very consumable, flexible form.”

“If you are an execution broker and you use a smart order router for example, an order might be split into thousands of executions, coming back from different markets,” explains Roland Hamann, Pico’s Chief Technology Officer and Head of APAC. “And if you have three or four different algorithms that are working that order, it can get very complicated to reconstruct which part of that order got executed, where, when and how. This software does all the analytics and the merging together of the data behind the scenes for you. And it also allows you to enrich the data, with additional reference data for example.”
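The order reconstruction Hamann describes, merging thousands of child executions back into a view of the parent order, can be sketched as follows. This is an illustrative outline only; the field names (`parent_id`, `venue`, `qty`, `px`, `ts_ns`) and the summary statistics are assumptions, not Corvil's actual schema or analytics.

```python
from collections import defaultdict

# Hypothetical child executions streamed back from several venues after a
# smart order router split one parent order. Field names are illustrative.
executions = [
    {"parent_id": "ORD-1", "venue": "XNAS", "qty": 300, "px": 100.02, "ts_ns": 1_000},
    {"parent_id": "ORD-1", "venue": "ARCX", "qty": 200, "px": 100.05, "ts_ns": 1_450},
    {"parent_id": "ORD-1", "venue": "BATS", "qty": 500, "px": 100.01, "ts_ns": 2_100},
]

def reconstruct(execs):
    """Merge child fills back into per-parent-order summaries:
    total filled quantity, volume-weighted average price, venues hit,
    and the nanosecond timestamps of the first and last fills."""
    orders = defaultdict(list)
    for e in execs:
        orders[e["parent_id"]].append(e)
    summary = {}
    for oid, fills in orders.items():
        filled = sum(f["qty"] for f in fills)
        vwap = sum(f["qty"] * f["px"] for f in fills) / filled
        summary[oid] = {
            "filled_qty": filled,
            "vwap": round(vwap, 4),
            "venues": sorted({f["venue"] for f in fills}),
            "first_fill_ns": min(f["ts_ns"] for f in fills),
            "last_fill_ns": max(f["ts_ns"] for f in fills),
        }
    return summary
```

In practice the same grouping would also fold in the enrichment Hamann mentions, joining each fill against reference data before aggregation.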

Artificial intelligence also features, says O’Sullivan. “By inferring the relationship between trading activity and servers in the trading environment, it builds a model of that environment, in order to report against it. We also use baselining and neural net algorithms across a wide range of metrics, such as order response times, message microbursts, network latency and so on, to spot anomalies or deviations and flag them up, either on a dashboard or via other alerting mechanisms.”
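The baselining O’Sullivan describes can be illustrated with a minimal sketch: flag any metric sample that deviates from a rolling baseline by more than a set number of standard deviations. This is a simplified stand-in under assumed parameters, not the neural-net approach the product actually uses.

```python
import statistics

def flag_anomalies(samples, window=20, threshold=3.0):
    """Flag indices of samples (e.g. order response times in microseconds)
    that deviate more than `threshold` standard deviations from a rolling
    baseline built over the preceding `window` observations."""
    flagged = []
    for i in range(window, len(samples)):
        baseline = samples[i - window:i]
        mu = statistics.fmean(baseline)
        sigma = statistics.pstdev(baseline)
        # Skip flat baselines (zero variance) to avoid division by zero.
        if sigma and abs(samples[i] - mu) / sigma > threshold:
            flagged.append(i)
    return flagged
```

A real deployment would run this kind of check continuously across many metrics at once, latency, microbursts, response times, and route the flagged indices to a dashboard or alerting channel.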

The architecture of the data warehouse enables it to run across platforms, says O’Sullivan. “It will run on bare metal, or on servers in your data centre, or even in AWS or Google Cloud, if you want to deploy it on a cloud environment.”

The system uses the Hadoop Distributed File System (HDFS) to store data, with full support for the Kafka message bus, enabling query and extraction of the data via API as well as through Pico’s own dashboards.
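Assuming records arrive off the Kafka bus as JSON (the actual Corvil message schema is not described in the article), a downstream consumer might decode each record and derive latency figures from its nanosecond timestamps. All field names here are hypothetical.

```python
import json

def decode_execution(raw: bytes) -> dict:
    """Decode one execution record from a hypothetical JSON payload and
    derive a venue round-trip latency in microseconds from the
    nanosecond send/acknowledge timestamps."""
    msg = json.loads(raw)
    msg["venue_rtt_us"] = (msg["ack_ts_ns"] - msg["send_ts_ns"]) / 1_000
    return msg

# Example payload with illustrative field names and values.
record = decode_execution(
    b'{"order_id": "ORD-1", "send_ts_ns": 1000000, "ack_ts_ns": 1084500}'
)
```

In a live setup this function would sit inside a Kafka consumer loop, with decoded records written on to the warehouse or queried back out through the API.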
