About A-Team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Pico Taps Intel ‘Ice Lake’ to Achieve Sustained 100Gbps Processing for Corvil Analytics


Trading infrastructure specialist Pico has teamed with Intel to achieve 100Gbps sustained real-time processing for its Corvil Analytics product, which is used by financial institutions to manage and analyse their network flow. The development comes hot on the heels of Intel Capital joining other investors in funding a $135 million Series C investment round for Pico, announced in December.

With early access to Intel’s 3rd Gen Intel Xeon Scalable processors (code-named ‘Ice Lake’), Pico says it immediately saw a 50% performance improvement with no code changes or optimisation. Further gains have since been achieved by leveraging the processor’s new capabilities and features.

“The performance improvements have been outstanding; we’ve never seen that much of a performance increase between two CPU generations before,” says Roland Hamann, Pico’s CTO. “We went from 42Gbps of sustained capture to over 100Gbps of sustained capture. And in lab tests, we’ve seen that we can actually do up to 200Gbps in bursts. On top of that, we are not just capturing the data but also decoding and analysing it in real time. There’s really nothing else out there on the market that does that.”

To get the most out of the new processor, there was close collaboration between Pico’s and Intel’s engineering teams, says Hamann. “Of particular note was the ability to select critical code threads for specialized processor treatment, thereby efficiently eliminating many prior performance bottlenecks,” he says.

“The biggest multinational banks have network backbones that now run at 100Gbps; their challenge up until now has been capturing and analysing the data on those networks,” says Hamann. “With the new unit, they can capture everything – the whole of the US equities market, the whole of the US futures market, including OPRA A and B feeds for gap detection – on one single appliance. They can then decode all this traffic, analyse it, and even make the data available via API for integration into something like Splunk or Kx’s kdb+, for further analysis.”

The company plans to roll out its eighth-generation appliances based upon Intel’s Ice Lake family of processors later this year. “We anticipate high demand for this, given the recent surge in trade and market data volumes, and increasing corporate network data rates,” says Hamann.
