The knowledge platform for the financial technology industry

A-Team Insight Blogs

Pico Launches Corvil Electronic Trading Data Warehouse for Real-Time Execution Analytics


Trading technology and infrastructure specialist Pico has launched the Corvil Electronic Trading Data Warehouse, which aims to provide visibility into transaction execution quality by correlating client trading behaviour with execution path and counterparty performance.

The solution, offered as a standalone software product, streams nanosecond-timestamped data from Corvil network instrumentation to deliver real-time visibility at a granular level, from individual trades to aggregate transaction outcomes, giving clients a lens into technology stack performance and its impact on trading. The launch follows the release of Corvil's new flagship 100Gbps capture and analytics appliance in October 2021, which delivers sustained 100Gbps real-time processing.

“Many firms have been using Corvil Analytics to ingest market data and order flow, decoding it, normalising it and then streaming it to external time series databases,” says Donal O’Sullivan, Managing Director of Product Management at Pico. “But we found that there was a gap in terms of extracting maximum value from that data; people didn’t always know what to do with it. This new product consumes those streams, performs all sorts of multi-dimensional analysis, and makes the data available in a very consumable, flexible form.”

“If you are an execution broker and you use a smart order router for example, an order might be split into thousands of executions, coming back from different markets,” explains Roland Hamann, Pico’s Chief Technology Officer and Head of APAC. “And if you have three or four different algorithms that are working that order, it can get very complicated to reconstruct which part of that order got executed, where, when and how. This software does all the analytics and the merging together of the data behind the scenes for you. And it also allows you to enrich the data, with additional reference data for example.”
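The reconstruction Hamann describes can be sketched in miniature: child executions reported back from multiple venues are merged into per-parent-order summaries. This is an illustrative sketch only; the field names (`parent_id`, `venue`, `qty`, `px`) are hypothetical and not Pico's actual schema.

```python
from collections import defaultdict

# Hypothetical child-execution records, as a smart order router might
# report them back from different venues. Field names are illustrative.
fills = [
    {"parent_id": "ORD-1", "venue": "XNAS", "qty": 300, "px": 100.02},
    {"parent_id": "ORD-1", "venue": "ARCX", "qty": 500, "px": 100.05},
    {"parent_id": "ORD-1", "venue": "BATS", "qty": 200, "px": 100.01},
]

def reconstruct(fills):
    """Merge child executions back into per-parent-order summaries."""
    orders = defaultdict(lambda: {"qty": 0, "notional": 0.0, "venues": set()})
    for f in fills:
        o = orders[f["parent_id"]]
        o["qty"] += f["qty"]
        o["notional"] += f["qty"] * f["px"]
        o["venues"].add(f["venue"])
    # Report quantity, volume-weighted average price, and venues per order.
    return {
        pid: {
            "qty": o["qty"],
            "vwap": o["notional"] / o["qty"],
            "venues": sorted(o["venues"]),
        }
        for pid, o in orders.items()
    }

summary = reconstruct(fills)
```

In practice the product does this merging "behind the scenes" across thousands of executions and multiple working algorithms, and enriches the result with reference data; the sketch above only shows the core group-and-aggregate step.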

Artificial intelligence also features, says O’Sullivan. “By inferring the relationship between trading activity and servers in the trading environment, it builds a model of that environment, in order to report against it. We also use baselining and neural net algorithms across a wide range of metrics, such as order response times, message microbursts, network latency and so on, to spot anomalies or deviations and flag them up, either on a dashboard or via other alerting mechanisms.”
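The baselining approach O'Sullivan describes can be illustrated with a deliberately simple stand-in: compare new observations of a metric (say, order response times) against the mean and standard deviation of a learned baseline, and flag large deviations. The actual product uses neural net algorithms across many metrics; this z-score sketch, with made-up numbers, only conveys the baseline-and-deviate idea.

```python
import statistics

def flag_anomalies(baseline, observations, threshold=3.0):
    """Flag observations more than `threshold` standard deviations from
    the baseline mean -- a simplified stand-in for metric baselining."""
    mu = statistics.mean(baseline)
    sigma = statistics.stdev(baseline)
    return [x for x in observations if abs(x - mu) > threshold * sigma]

# Baseline of order response times in microseconds (illustrative values).
baseline = [50, 52, 49, 51, 50, 53, 48, 50, 51, 49]

# 120 us is far outside the baseline band and would be flagged.
anomalies = flag_anomalies(baseline, [50, 52, 120, 49])
```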

The architecture of the data warehouse enables it to run across platforms, says O’Sullivan. “It will run on bare metal, or on servers in your data centre, or even in AWS or Google Cloud, if you want to deploy it on a cloud environment.”

The system uses the Hadoop Distributed File System (HDFS) to store data, with full support for the Kafka message bus, enabling query and extraction of the data via API as well as via Pico’s own dashboards.
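On the consumption side, a downstream client reading off the Kafka bus would typically decode each message payload into a record before querying or storing it. The sketch below shows only that decode step; the field names (`ts_ns`, `order_id`, `latency_us`) are hypothetical, not the product's actual message schema, and a real client would wrap this in a Kafka consumer loop.

```python
import json

def decode_record(raw: bytes) -> dict:
    """Decode one JSON-encoded message as it might arrive off a Kafka topic.
    Field names here are illustrative, not the product's actual schema."""
    rec = json.loads(raw.decode("utf-8"))
    # Keep nanosecond timestamps as integers to avoid float rounding.
    rec["ts_ns"] = int(rec["ts_ns"])
    return rec

# A consumer (e.g. kafka-python's KafkaConsumer) would yield messages
# whose value is a bytes payload shaped like this:
raw = b'{"ts_ns": 1633046400000000123, "order_id": "ORD-1", "latency_us": 42}'
rec = decode_record(raw)
```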
