
Cantor Evaluating Calxeda ARM Chips for 10x Breakthrough


“I think the Calxeda-ARM machine is an exciting step … I’m evaluating carefully how it can impact the metrics I care about,” says Niall Dalton, director of high frequency trading at Cantor Fitzgerald. He is referring to today’s announcement by Calxeda of their very low power microprocessors based on the ARM architecture – and HP’s plan to build servers based on them.

ARM-based chips run on very low power and are widely used by manufacturers of consumer devices, such as mobile phones. Austin, Texas-based Calxeda, however, is building its chips for highly parallel server designs.

The initial EnergyCore processor – or Server on a Chip – from Calxeda includes four ARM cores, 4MB of L2 cache memory, an 80 gigabit per second interconnect and system/power management functions – all requiring just 1.5 watts of power.

HP will build servers with 288 EnergyCores in a 4U appliance. “A single rack of HP’s Calxeda servers delivers the throughput of some 700 traditional servers and dramatically simplifies the infrastructure needed to hook them all together and manage the cluster,” claims Calxeda co-founder and CEO Barry Evans.
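Taken at face value, those figures imply striking density and power numbers. A rough back-of-envelope sketch, assuming a standard 42U rack, the 1.5 watt per-node figure above (which covers the processor rather than a fully populated node) and a nominal 250 watts for a traditional server, none of which the announcement specifies:

```python
# Back-of-envelope check of the density and power claims above.
WATTS_PER_NODE = 1.5              # Calxeda's EnergyCore figure (SoC only)
NODES_PER_4U = 288                # HP's planned 4U appliance
CHASSIS_PER_RACK = 42 // 4        # ten 4U chassis in a 42U rack (assumption)
TRADITIONAL_SERVER_WATTS = 250    # illustrative figure, not from the article

nodes_per_rack = NODES_PER_4U * CHASSIS_PER_RACK               # 2,880 nodes
rack_power_kw = nodes_per_rack * WATTS_PER_NODE / 1000         # ~4.3 kW of processors
traditional_power_kw = 700 * TRADITIONAL_SERVER_WATTS / 1000   # ~175 kW

print(f"EnergyCore nodes per rack: {nodes_per_rack}")
print(f"Rack processor power: {rack_power_kw:.1f} kW")
print(f"700 traditional servers: {traditional_power_kw:.0f} kW")
```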

“Companies in our industry are constrained by space and power, yet our appetite for analysis is insatiable,” says Cantor’s Dalton, who continues: “We need a 10x breakthrough and this could be it. We are evaluating the Calxeda technology in hyperscale throughput computing for data and simulation intensive applications. The Calxeda Linux platform enables rapid porting of our software, enabling us to quickly leverage the energy-efficient ARM cores and Calxeda’s scalable communications fabric to scale our applications to new heights.”
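Dalton’s “hyperscale throughput computing” point is essentially a scale-out argument: data- and simulation-intensive jobs decompose into many independent work units that can be farmed across a large number of modest, energy-efficient cores. A minimal, hypothetical sketch of that pattern, using Python’s standard multiprocessing pool as a local stand-in for a cluster scheduler and a toy Monte Carlo pricing task that is in no way Cantor’s or Calxeda’s actual software:

```python
import math
import random
from multiprocessing import Pool

def simulate_chunk(args):
    """Price a toy European call over one chunk of Monte Carlo paths.

    Purely illustrative -- stands in for any data- or simulation-intensive
    unit of work that can run independently on a small, low-power node.
    """
    seed, n_paths = args
    rng = random.Random(seed)
    s0, k, r, sigma, t = 100.0, 105.0, 0.01, 0.2, 1.0  # hypothetical parameters
    payoff_sum = 0.0
    for _ in range(n_paths):
        z = rng.gauss(0.0, 1.0)
        s_t = s0 * math.exp((r - 0.5 * sigma**2) * t + sigma * math.sqrt(t) * z)
        payoff_sum += max(s_t - k, 0.0)
    return payoff_sum, n_paths

if __name__ == "__main__":
    # On a many-node fabric these chunks would be distributed across nodes;
    # here a local process pool approximates "many small cores".
    chunks = [(seed, 50_000) for seed in range(32)]
    with Pool() as pool:
        results = pool.map(simulate_chunk, chunks)
    total_payoff = sum(p for p, _ in results)
    total_paths = sum(n for _, n in results)
    price = math.exp(-0.01 * 1.0) * total_payoff / total_paths  # discount at r over t
    print(f"Estimated option price over {total_paths} paths: {price:.4f}")
```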

For financial markets applications, it looks like Calxeda’s performance/power footprint could be a winner for firms that need to mine data to develop pre-trade models and post-trade simulations as fast as possible. And where those systems sit in outsourced managed environments, possibly in proximity and co-location centres, the operational costs related to space and power can be considerable.
 

