The knowledge platform for the financial technology industry

A-Team Insight Blogs

Pico Taps Intel ‘Ice Lake’ to Achieve Sustained 100Gbps Processing for Corvil Analytics

Trading infrastructure specialist Pico has teamed with Intel to achieve 100Gbps sustained real-time processing for its Corvil Analytics product, which is used by financial institutions to manage and analyse their network flow. The development comes hot on the heels of Intel Capital joining other investors in funding a $135 million Series C investment round for Pico, announced in December.

With early access to Intel’s 3rd Gen Intel Xeon Scalable processors (code-named ‘Ice Lake’), Pico says it immediately saw a 50% performance improvement with no code changes or optimisation. Further gains have since been achieved by leveraging the processor’s new features.

“The performance improvements have been outstanding; we’ve never seen that much of a performance increase between two CPU generations before,” says Roland Hamann, Pico’s CTO. “We went from 42Gbps sustained capture to over 100Gbps sustained capture. And in lab tests, we’ve seen that we can actually do up to 200Gbps in bursts. On top of that, we are not just capturing the data but also decoding and analysing it in real time. There’s really nothing else out there on the market that does that.”
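To put the quoted capture rates in perspective, a quick back-of-envelope sketch (the 42Gbps, 100Gbps, and 200Gbps figures are from the article; the per-day arithmetic below is our own illustration):

```python
# Back-of-envelope data volumes for the sustained capture rates quoted
# in the article (42 Gbps previous generation, 100 Gbps sustained,
# 200 Gbps in bursts).

GBPS = 1e9  # bits per second per "Gbps"

def bytes_per_day(gbps: float) -> float:
    """Raw data volume captured in one day at a given line rate."""
    bits_per_day = gbps * GBPS * 86_400  # seconds in a day
    return bits_per_day / 8

# 100 Gbps sustained is 12.5 GB/s, or roughly 1.08 PB of raw data per day.
print(f"{bytes_per_day(100) / 1e15:.2f} PB/day")
```

At those volumes, capturing is only half the problem; decoding and analysing the stream in real time, as Hamann describes, is what keeps the data from simply piling up.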

To get the most out of the new processor, there was close collaboration between Pico’s and Intel’s engineering teams, says Hamann. “Of particular note was the ability to select critical code threads for specialized processor treatment, thereby efficiently eliminating many prior performance bottlenecks,” he says.

“The biggest multinational banks have network backbones that now run at 100Gbps; their challenge up until now has been capturing and analysing the data on those networks,” says Hamann. “With the new unit, they can capture everything: the whole of the US equities market and the whole of the US futures market, including OPRA A and B feeds for gap detection, on a single appliance. They can then decode all this traffic, analyse it, and even make the data available via API for integration into something like Splunk or Kx’s kdb+, for further analysis.”
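Gap detection across redundant A and B feed lines is typically done by arbitrating on sequence numbers: every message is published on both lines, duplicates are dropped, and a sequence number is flagged as gapped only once both lines have advanced past it without delivering it. A minimal generic sketch of that technique (an illustration of A/B arbitration in general, not Pico’s implementation; the class and method names are our own):

```python
class ABArbitrator:
    """Deduplicate two redundant feed lines and flag sequence gaps.

    A sequence number is declared gapped only once BOTH lines have
    advanced past it without delivering it. Generic illustration only.
    """

    def __init__(self):
        self.delivered = set()        # sequence numbers handed downstream
        self.high = {"A": 0, "B": 0}  # highest sequence seen per line
        self.reported = 0             # gaps at or below this seq are settled

    def on_message(self, line, seq):
        """Process one message; return (is_new, newly_detected_gaps)."""
        is_new = seq not in self.delivered
        if is_new:
            self.delivered.add(seq)
        self.high[line] = max(self.high[line], seq)
        # Every seq up to the slower line's high-water mark should have
        # arrived by now; anything still missing there is a confirmed gap.
        safe = min(self.high.values())
        gaps = [s for s in range(self.reported + 1, safe + 1)
                if s not in self.delivered]
        self.reported = max(self.reported, safe)
        return is_new, gaps
```

In practice a production arbitrator would also buffer out-of-order messages and request retransmission for confirmed gaps, but the dedup-and-watermark core is the same idea.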

The company plans to roll out its eighth-generation appliances based upon Intel’s Ice Lake family of processors later this year. “We anticipate high demand for this, given the recent surge in trade and market data volumes, and increasing corporate network data rates,” says Hamann.
