
A-Team Insight Blogs

STAC Benchmarks IBM’s Hadoop

STAC – aka the Securities Technology Analysis Center – has benchmarked IBM’s proprietary Platform Symphony implementation of Hadoop MapReduce against the standard open source offering to compare their respective performance. On average, IBM’s implementation completed jobs 7.3 times faster than the standard and reduced total processing time by a factor of six.
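
Those two figures measure different things: 7.3x is the average of the individual per-job speedups, while the factor of six compares total elapsed processing time across the whole workload. The sketch below, which uses made-up job durations rather than STAC’s data, shows how the two aggregates can diverge when smaller jobs speed up more than larger ones.

```java
// Illustrative only: hypothetical job durations (seconds), not STAC's measurements.
// Shows why the mean per-job speedup can differ from the reduction in total time.
public class SpeedupExample {
    public static void main(String[] args) {
        double[] openSource = {120.0, 40.0, 10.0};   // hypothetical open source Hadoop job times
        double[] symphony   = {20.0,  5.0,  1.0};    // hypothetical Symphony job times

        double ratioSum = 0.0, totalOpen = 0.0, totalSym = 0.0;
        for (int i = 0; i < openSource.length; i++) {
            ratioSum  += openSource[i] / symphony[i]; // per-job speedup
            totalOpen += openSource[i];
            totalSym  += symphony[i];
        }

        double meanPerJobSpeedup  = ratioSum / openSource.length; // mean of per-job ratios
        double totalTimeReduction = totalOpen / totalSym;         // ratio of total elapsed times

        System.out.printf("Mean per-job speedup:  %.1fx%n", meanPerJobSpeedup);   // 8.0x here
        System.out.printf("Total time reduction:  %.1fx%n", totalTimeReduction);  // 6.5x here
    }
}
```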

Better known for its benchmarking of low-latency trading platforms, STAC leveraged the Statistical Workload Injector for MapReduce (SWIM), developed by the University of California at Berkeley. SWIM provides a large set of diverse MapReduce jobs based on production Hadoop traces obtained from Facebook, along with information to enable characterisation of each job. STAC says it undertook the benchmarking because many financial markets firms are deploying Hadoop.
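
For illustration only, the record sketched below shows the kind of per-job information a trace-driven benchmark of this sort needs to carry – arrival gap plus input, shuffle and output sizes – so that each job can be regenerated synthetically. The field names and tab-separated layout are assumptions for this example, not SWIM’s actual schema.

```java
// Hypothetical record shape for a trace-driven MapReduce workload -- not SWIM's
// actual file format, just an illustration of the per-job characteristics such
// a trace needs so each job can be re-created synthetically.
public record TraceJob(
        String jobId,
        long submitGapMs,     // wait after the previous job submission
        long inputBytes,      // bytes read by the map phase
        long shuffleBytes,    // bytes moved from map to reduce
        long outputBytes) {   // bytes written by the reduce phase

    /** Parse one tab-separated line, e.g. "job42\t1500\t1073741824\t268435456\t134217728". */
    static TraceJob parse(String line) {
        String[] f = line.split("\t");
        return new TraceJob(f[0], Long.parseLong(f[1]), Long.parseLong(f[2]),
                Long.parseLong(f[3]), Long.parseLong(f[4]));
    }
}
```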

The hardware environment for the testbed consisted of 17 IBM compute servers and one master server communicating over gigabit Ethernet. STAC compared Hadoop version 1.0.1 to Symphony version 5.2. Both systems ran Red Hat Linux and used largely default configurations.

IBM attributes the superior performance of its offering in part to its scheduling speed. IBM’s Hadoop implementation is API-compatible with the open source offering but runs on the Symphony grid middleware that IBM gained through its acquisition of Platform Computing, which closed in January of this year.
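
API compatibility means that a job written against the stock org.apache.hadoop.mapreduce classes – for example the canonical WordCount below, reproduced here purely as an illustration rather than anything IBM- or STAC-specific – should run unchanged whichever runtime schedules it.

```java
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

// The canonical WordCount job, written against the standard Hadoop MapReduce API.
// Code like this is unaware of the scheduler underneath it, which is what makes
// an API-compatible runtime a drop-in replacement at the job level.
public class WordCount {

    public static class TokenizerMapper extends Mapper<Object, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        protected void map(Object key, Text value, Context context)
                throws IOException, InterruptedException {
            StringTokenizer itr = new StringTokenizer(value.toString());
            while (itr.hasMoreTokens()) {
                word.set(itr.nextToken());
                context.write(word, ONE);   // emit (word, 1) for each token
            }
        }
    }

    public static class IntSumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context context)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable v : values) {
                sum += v.get();             // total occurrences of this word
            }
            context.write(key, new IntWritable(sum));
        }
    }

    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = new Job(conf, "word count");
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenizerMapper.class);
        job.setCombinerClass(IntSumReducer.class);
        job.setReducerClass(IntSumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));    // input directory
        FileOutputFormat.setOutputPath(job, new Path(args[1]));  // output directory
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```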

For more information on STAC’s IBM Hadoop benchmark, see here.

Related content

WEBINAR

Recorded Webinar: The Role of Data Fabric and Data Mesh in Modern Trading Infrastructures

The demands on trading infrastructure are intensifying. Increasing data volumes, the necessity for real-time processing, and stringent regulatory requirements are exposing the limitations of legacy data architectures. In response, firms are re-evaluating their data strategies to improve agility, scalability, and governance. Two architectural models central to this conversation are Data Fabric and Data Mesh. This...

BLOG

Slaying the Monolith: A Pragmatist’s Guide to Modernising Trading Architecture

For decades, trading technology has been haunted by large, intricate, all-in-one applications that power core business functions, aka the monolith. While once a necessity, these systems have become a source of immense friction. They are brittle, expensive to maintain, and notoriously slow to change, creating a chasm between business demands for agility and IT’s capacity...

EVENT

TradingTech Summit New York

Our TradingTech Briefing in New York is aimed at senior-level decision makers in trading technology, electronic execution and trading architecture, and offers a day packed with insight from practitioners and from innovative suppliers happy to share their experiences in dealing with the enterprise challenges facing our marketplace.

GUIDE

Enterprise Data Management

The current financial crisis has highlighted that financial institutions do not have a sufficient handle on their data and has prompted many of these institutions to re-evaluate their approaches to data management. Moreover, the increased regulatory scrutiny of the financial services community during the past year has meant that data management has become a key...