Pico Taps Intel ‘Ice Lake’ to Achieve Sustained 100Gbps Processing for Corvil Analytics

Trading infrastructure specialist Pico has teamed with Intel to achieve 100Gbps sustained real-time processing for its Corvil Analytics product, which financial institutions use to monitor and analyse their network traffic. The development comes hot on the heels of Intel Capital joining other investors in a $135 million Series C funding round for Pico, announced in December.

With early access to Intel’s 3rd Gen Intel Xeon Scalable processors (code-named ‘Ice Lake’), Pico says it immediately experienced a 50% improvement in performance with no change in code or optimisation. Further gains have since been achieved by leveraging the processor’s new capabilities and features.

“The performance improvements have been outstanding; we’ve never seen that much of a performance increase between two CPU generations before,” says Roland Hamann, Pico’s CTO. “We went from 42Gbps of sustained capture to over 100Gbps. And in lab tests, we’ve seen that we can actually do up to 200Gbps in bursts. On top of that, we are not just capturing the data but also decoding and analysing it in real time. There’s really nothing else out there on the market that does that.”
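
To put those figures in perspective, here is a quick back-of-envelope calculation of the packet rates a capture engine must sustain at these line rates. The 200-byte average packet size is purely an assumption for illustration; real market-data packet sizes vary by feed and protocol.

```python
GBPS = 1e9  # bits per second in one gigabit per second

def packets_per_second(rate_gbps: float, avg_packet_bytes: int) -> float:
    """Packets per second a capture engine must handle at a given line rate."""
    return rate_gbps * GBPS / (avg_packet_bytes * 8)

# The 200-byte average packet size below is an illustrative assumption.
for label, rate in [("old sustained", 42), ("new sustained", 100), ("burst", 200)]:
    mpps = packets_per_second(rate, 200) / 1e6
    print(f"{label:>13}: {rate:>3} Gbps -> {mpps:,.1f}M packets/sec")
```

At 100Gbps and that assumed packet size, the engine is handling roughly 62.5 million packets per second, each of which, per Hamann, is also decoded and analysed rather than merely written to disk.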

Getting the most out of the new processor required close collaboration between Pico’s and Intel’s engineering teams, says Hamann. “Of particular note was the ability to select critical code threads for specialized processor treatment, thereby efficiently eliminating many prior performance bottlenecks,” he says.
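
Hamann doesn’t say exactly what that treatment involves, but pinning latency-critical threads to dedicated cores, so the scheduler never migrates them and their caches stay warm, is one common technique of this kind. The sketch below is purely hypothetical and is not Pico’s implementation; the core number and loop body are illustrative, and the affinity call is Linux-only.

```python
import os
import threading

# Hypothetical sketch: pin a latency-critical capture thread to a
# dedicated core so the scheduler never migrates it. Core 2 is an
# arbitrary choice; in practice the core would also be isolated from
# general scheduling (e.g. via the isolcpus kernel boot parameter).
CRITICAL_CORE = {2}

def capture_loop():
    # Linux-only call; pid 0 targets the calling thread at the
    # kernel level, so only this thread is pinned.
    os.sched_setaffinity(0, CRITICAL_CORE)
    while True:
        ...  # poll the NIC, decode, analyse in real time

t = threading.Thread(target=capture_loop, daemon=True)
t.start()
```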

“The biggest multinational banks have network backbones that now run at 100Gbps; their challenge up until now has been capturing and analysing the data on those networks,” says Hamann. “With the new unit, they can capture everything: the whole of the US equities market, the whole of the US futures market including OPRA A and B feeds for gap detection, on a single appliance. They can then decode all this traffic, analyse it, and even make the data available via API for integration into something like Splunk or KX’s kdb+ for further analysis.”
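
Gap detection across paired lines such as OPRA’s A and B feeds typically works by arbitrating two redundant, sequence-numbered streams: a message missing from one line is recovered from the other, while a sequence number absent from both indicates genuine loss. The sketch below is a simplified, hypothetical illustration; the tuple layout and message names are invented, not OPRA’s wire format.

```python
import heapq

def arbitrate(feed_a, feed_b):
    """Merge two redundant, sequence-numbered feeds: yield each
    sequence once and report sequences missing from both lines.
    Assumes each feed is individually in sequence order."""
    seen = set()
    expected = None
    for seq, payload in heapq.merge(feed_a, feed_b):
        if seq in seen:
            continue  # already delivered on the other line
        if expected is not None and seq > expected:
            print(f"gap: sequences {expected}..{seq - 1} lost on both lines")
        seen.add(seq)
        expected = seq + 1
        yield seq, payload

# Message 3 arrives only on line B; message 5 arrives on neither line.
line_a = [(1, "m1"), (2, "m2"), (4, "m4"), (6, "m6")]
line_b = [(1, "m1"), (2, "m2"), (3, "m3"), (6, "m6")]
messages = list(arbitrate(line_a, line_b))  # reports the gap at 5
```

A production arbitrator would also bound the seen-set and trigger retransmission requests on a detected gap, but the sequencing logic is the same.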

The company plans to roll out its eighth-generation appliances based upon Intel’s Ice Lake family of processors later this year. “We anticipate high demand for this, given the recent surge in trade and market data volumes, and increasing corporate network data rates,” says Hamann.
