Analysis: Inside Supermicro’s Hyper-Speed HFT Server

Super Micro Computer last month introduced a new line of “Hyper-Speed” servers, pitching them at high-frequency trading applications. The company says they work by “maximising processing power and precisely tuning hardware and firmware to attain up to 30% lower latency.” Just as importantly, the servers feature “high reliability as a primary design focus.” So what are these servers, and how does Supermicro boost their performance?

* They are based on dual multi-core Intel Xeon E5-2600 (Sandy Bridge) processors. The 2U model is pitched at space-constrained co-location deployments. Crucially, they are “designed for maximum airflow and custom heat sinks provide optimal thermal distribution for mission critical reliability” – in other words, they provide superior cooling for the chips and other components.

* This cooling is required because the microprocessors in the Supermicro servers are overclocked – they run at higher clock frequencies than standard parts, which means they execute code faster, and that translates to lower latency.

* As an example, Supermicro tested network performance by processing messages over the UDP and TCP protocols, and found that overclocking reduced latency by 31% and jitter by 73% (a rough sketch of this kind of round-trip measurement follows this list).

* Intel generally frowns upon overclocking of its microprocessors because it can shorten their life. But thanks to the cooling capabilities of the Supermicro servers, the chips carry the same warranty as standard products. Supermicro also tests all of the components in its servers – including RAM and network interface cards – to ensure they will operate reliably with an overclocked microprocessor.

* Just to be clear, the performance of the Supermicro servers is not down to microprocessor overclocking alone, but also to other hardware and firmware optimisations incorporated into them.
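
To make the latency and jitter numbers above more tangible, here is a minimal sketch of the kind of UDP round-trip measurement referenced in the list – emphatically not Supermicro’s actual test harness. The echo server address, port and sample count are illustrative assumptions, and error handling is omitted for brevity.

```c
/*
 * Minimal UDP round-trip latency and jitter measurement sketch.
 * Assumes a UDP echo service is already running at ECHO_HOST:ECHO_PORT
 * (both hypothetical); error handling is omitted for brevity.
 * Build with: cc -O2 rtt.c -o rtt -lm
 */
#include <arpa/inet.h>
#include <math.h>
#include <netinet/in.h>
#include <stdio.h>
#include <sys/socket.h>
#include <time.h>
#include <unistd.h>

#define SAMPLES   10000
#define ECHO_HOST "192.168.1.10"   /* hypothetical echo server */
#define ECHO_PORT 9000

/* Current time in microseconds from a monotonic clock. */
static double now_us(void)
{
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return ts.tv_sec * 1e6 + ts.tv_nsec / 1e3;
}

int main(void)
{
    int fd = socket(AF_INET, SOCK_DGRAM, 0);
    struct sockaddr_in srv = { .sin_family = AF_INET,
                               .sin_port   = htons(ECHO_PORT) };
    inet_pton(AF_INET, ECHO_HOST, &srv.sin_addr);

    char msg[64] = "ping", buf[64];
    double sum = 0.0, sumsq = 0.0;

    for (int i = 0; i < SAMPLES; i++) {
        double t0 = now_us();
        sendto(fd, msg, sizeof msg, 0, (struct sockaddr *)&srv, sizeof srv);
        recv(fd, buf, sizeof buf, 0);          /* block until the echo returns */
        double rtt = now_us() - t0;
        sum   += rtt;
        sumsq += rtt * rtt;
    }

    double mean   = sum / SAMPLES;
    double jitter = sqrt(sumsq / SAMPLES - mean * mean); /* std deviation as a jitter proxy */
    printf("mean RTT %.2f us, jitter %.2f us over %d samples\n", mean, jitter, SAMPLES);

    close(fd);
    return 0;
}
```

A production benchmark would presumably pin the process to a core, take far more samples, report percentiles rather than a simple mean, and run over a tuned or kernel-bypass network stack, but the structure is the same: timestamp, send, wait for the echo, record the round trip.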

My Take:

For applications that need the fastest execution of serial code – data feed handlers are an example – overclocking should be beneficial. And Supermicro’s implementation seemingly overcomes the reliability issues often cited against overclocking.
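
To see why, consider that a purely serial code path is a fixed number of CPU cycles long, so its wall-clock latency is roughly the cycle count divided by the clock frequency, and it falls almost in proportion as the clock rises. The cycle count and clock speeds in the sketch below are purely hypothetical, chosen only to illustrate the arithmetic:

```c
/*
 * Back-of-envelope illustration: latency of a serial code path versus clock
 * frequency. The cycle count and clock speeds are hypothetical assumptions,
 * not Supermicro's figures.
 */
#include <stdio.h>

int main(void)
{
    const double path_cycles = 27000.0;  /* hypothetical serial code path length */
    const double stock_ghz   = 2.7;      /* hypothetical stock clock */
    const double oc_ghz      = 3.5;      /* hypothetical overclocked clock, ~30% higher */

    double stock_ns = path_cycles / stock_ghz;   /* cycles / GHz = nanoseconds */
    double oc_ns    = path_cycles / oc_ghz;

    printf("stock: %.0f ns, overclocked: %.0f ns, reduction: %.0f%%\n",
           stock_ns, oc_ns, 100.0 * (1.0 - oc_ns / stock_ns));
    return 0;
}
```

In practice cache misses, memory latency and I/O do not speed up with the core clock, which is why real-world gains depend on how much of the critical path is genuinely compute-bound.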

For some, this approach will have merit compared with turning to specialist processors, such as FPGAs, to offload certain processing. It means traditional programming techniques can continue to be used rather than requiring specialist skills, and existing mainstream code can be deployed as-is – and simply runs faster.

It’s worth noting that Dell recently introduced its Dell Processor Acceleration Technology, an alternative approach to boosting the processing power of microprocessors, also aimed at HFT applications.
