About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Pico Taps Intel ‘Ice Lake’ to Achieve Sustained 100Gbps Processing for Corvil Analytics


Trading infrastructure specialist Pico has teamed with Intel to achieve 100Gbps sustained real-time processing for its Corvil Analytics product, which is used by financial institutions to manage and analyse their network flow. The development comes hot on the heels of Intel Capital joining other investors in funding a $135 million Series C investment round for Pico, announced in December.

With early access to Intel’s 3rd Gen Intel Xeon Scalable processors (code-named ‘Ice Lake’), Pico says it immediately experienced a 50% improvement in performance with no change in code or optimisation. Higher levels of performance improvement have since been achieved by leveraging the processor’s new capabilities and features.

“The performance improvements have been outstanding; we’ve never seen that much of a performance increase between two CPU generations before,” says Roland Hamann, Pico’s CTO. “We went from 42Gbps sustained capture to over 100Gbps sustained capture. And in lab tests, we’ve seen that we can actually do up to 200Gbps in bursts. On top of that, we are not just capturing the data but also decoding and analysing it in real time. There’s really nothing else out there on the market that does that.”

To get the most out of the new processor, there was close collaboration between Pico’s and Intel’s engineering teams, says Hamann. “Of particular note was the ability to select critical code threads for specialized processor treatment, thereby efficiently eliminating many prior performance bottlenecks,” he says.
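Hamann doesn’t spell out the mechanism, but “selecting critical code threads for specialized processor treatment” commonly maps to pinning latency-critical threads to dedicated CPU cores so they aren’t migrated or preempted by the scheduler. A minimal sketch of that idea, assuming a Linux host (the function and worker names are illustrative, not Pico’s code):

```python
import os
import threading

def pin_current_thread(core_id: int) -> None:
    # Restrict the calling thread to a single CPU core.
    # On Linux, pid 0 means "the calling thread". Linux-only API.
    os.sched_setaffinity(0, {core_id})

def capture_worker(core_id: int, results: list) -> None:
    # A latency-critical worker pins itself before doing any work,
    # so the hot loop always runs on the same core (warm caches,
    # no cross-core migration).
    pin_current_thread(core_id)
    # ... packet capture / decode loop would run here ...
    results.append(os.sched_getaffinity(0))

results: list = []
t = threading.Thread(target=capture_worker, args=(0, results))
t.start()
t.join()
print(results[0])  # {0}
```

In production systems this is usually combined with isolating those cores from the general scheduler (e.g. kernel boot parameters) so the pinned threads run uncontended.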

“The biggest multinational banks have network backbones that now run 100Gbps; their challenge up until now has been capturing and analysing the data on those networks,” says Hamann. “With the new unit, they can capture everything, the whole of the US equities market, the whole of the US futures market including OPRA A and B feeds for gap detection, on one single appliance. They can then decode all this traffic, analyse it, and even make the data available via API for integration into something like Splunk or Kx’s kdb+, for further analysis.”
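The gap detection Hamann refers to relies on the fact that OPRA publishes redundant A and B feeds carrying sequence-numbered messages: a consumer arbitrates the two copies and flags any sequence number that arrived on neither. A simplified sketch of that check (the function name and sample data are illustrative, not Corvil’s implementation):

```python
def detect_gaps(seq_numbers, start=1):
    """Return inclusive ranges of sequence numbers missing from a feed.

    With redundant A/B feeds, the two streams are arbitrated first
    (keep whichever copy of each sequence number arrives); a number
    absent from both feeds is a true gap.
    """
    expected = start
    gaps = []
    for seq in sorted(set(seq_numbers)):
        if seq > expected:
            gaps.append((expected, seq - 1))  # missing range, inclusive
        expected = max(expected, seq + 1)
    return gaps

# Arbitration: the union of messages seen on either feed.
feed_a = [1, 2, 3, 6, 7]   # feed A dropped 4 and 5
feed_b = [1, 2, 5, 6, 7]   # feed B dropped 3 and 4
print(detect_gaps(feed_a + feed_b))  # [(4, 4)] — only 4 was lost on both
```

Running this at 100Gbps line rate across every instrument is what makes doing it on a single appliance notable.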

The company plans to roll out its eighth-generation appliances based upon Intel’s Ice Lake family of processors later this year. “We anticipate high demand for this, given the recent surge in trade and market data volumes, and increasing corporate network data rates,” says Hamann.
