A-Team Insight Blogs

Pico Taps Intel ‘Ice Lake’ to Achieve Sustained 100Gbps Processing for Corvil Analytics

Trading infrastructure specialist Pico has teamed with Intel to achieve 100Gbps sustained real-time processing for its Corvil Analytics product, which is used by financial institutions to manage and analyse their network flow. The development comes hot on the heels of Intel Capital joining other investors in funding a $135 million Series C investment round for Pico, announced in December.

With early access to Intel’s 3rd Gen Xeon Scalable processors (code-named ‘Ice Lake’), Pico says it immediately saw a 50% performance improvement with no code changes or optimisation. Further gains have since been achieved by leveraging the processor’s new capabilities and features.

“The performance improvements have been outstanding; we’ve never seen that much of a performance increase between two CPU generations before,” says Roland Hamann, Pico’s CTO. “We went from 42Gbps of sustained capture to over 100Gbps, and in lab tests we’ve seen that we can actually do up to 200Gbps in bursts. On top of that, we are not just capturing the data but also decoding and analysing it in real time. There’s really nothing else out there on the market that does that.”

Getting the most out of the new processor required close collaboration between Pico’s and Intel’s engineering teams, says Hamann. “Of particular note was the ability to select critical code threads for specialized processor treatment, thereby efficiently eliminating many prior performance bottlenecks,” he says.
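
Hamann doesn’t spell out the mechanism here, but one common way to give critical code threads dedicated processor treatment on Linux is to pin them to reserved cores so the scheduler keeps other work off the hot path. The sketch below is a minimal illustration of that general technique under assumed conditions, not Pico’s implementation: the core numbers, thread roles and Linux-only os.sched_setaffinity call are all hypothetical.

```python
import os
import threading
import time

# Illustrative only: pin latency-critical worker threads to dedicated cores.
# os.sched_setaffinity is Linux-specific; with pid 0 the underlying
# sched_setaffinity(2) call applies to the calling thread.

CRITICAL_CORES = {2, 3}      # assumed cores reserved for capture/decode work
HOUSEKEEPING_CORES = {0, 1}  # assumed cores for everything else

def capture_and_decode():
    os.sched_setaffinity(0, CRITICAL_CORES)
    while True:
        # hot path: pull packets, decode protocols, update analytics
        time.sleep(0.001)  # stand-in for real work

def housekeeping():
    os.sched_setaffinity(0, HOUSEKEEPING_CORES)
    while True:
        # background work: logging, stats export, serving API queries
        time.sleep(1)

if __name__ == "__main__":
    threading.Thread(target=capture_and_decode, daemon=True).start()
    threading.Thread(target=housekeeping, daemon=True).start()
    time.sleep(5)  # let the demo threads run briefly
```

In practice the reserved cores would also be isolated from general scheduling (for example via the isolcpus boot parameter), so the pinned threads rarely lose the CPU at all.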

“The biggest multinational banks have network backbones that now run at 100Gbps; their challenge up until now has been capturing and analysing the data on those networks,” says Hamann. “With the new unit, they can capture everything, the whole of the US equities market, the whole of the US futures market including OPRA A and B feeds for gap detection, on one single appliance. They can then decode all this traffic, analyse it, and even make the data available via API for integration into something like Splunk or Kx’s kdb+, for further analysis.”
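
The integration pattern Hamann describes, pulling decoded analytics over an API and pushing them into downstream tools, might look roughly like the sketch below. This is a hypothetical illustration, not a documented Pico or Corvil API: the appliance endpoint, host names and token are placeholders, and only Splunk’s standard HTTP Event Collector path and header format are real conventions.

```python
import requests

# Hypothetical sketch: pull decoded flow analytics from an appliance API and
# forward each record to Splunk's HTTP Event Collector (HEC).
APPLIANCE_URL = "https://corvil-appliance.example.com/api/analytics/latest"  # placeholder
SPLUNK_HEC_URL = "https://splunk.example.com:8088/services/collector/event"  # standard HEC path
SPLUNK_TOKEN = "00000000-0000-0000-0000-000000000000"                        # placeholder token

def forward_latest_analytics():
    # Pull the most recent decoded/analysed records (assumed to be a JSON list).
    records = requests.get(APPLIANCE_URL, timeout=10).json()

    for record in records:
        # Wrap each record in the HEC event envelope and post it to Splunk.
        payload = {"event": record, "sourcetype": "corvil:analytics"}
        resp = requests.post(
            SPLUNK_HEC_URL,
            json=payload,
            headers={"Authorization": f"Splunk {SPLUNK_TOKEN}"},
            timeout=10,
        )
        resp.raise_for_status()

if __name__ == "__main__":
    forward_latest_analytics()
```

A kdb+ integration would follow the same pull-and-forward shape, writing the records into a q process instead of posting them to HEC.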

The company plans to roll out its eighth-generation appliances, based on Intel’s Ice Lake family of processors, later this year. “We anticipate high demand for this, given the recent surge in trade and market data volumes, and increasing corporate network data rates,” says Hamann.
