About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

STAC Benchmarks IBM’s Hadoop


STAC – aka the Securities Technology Analysis Center – has benchmarked IBM’s proprietary Platform Symphony implementation of Hadoop MapReduce against the standard open source offering. On average, IBM’s implementation performed jobs 7.3 times faster than the standard, reducing total processing time by a factor of six.

Better known for its benchmarking of low-latency trading platforms, STAC leveraged the Statistical Workload Injector for MapReduce (SWIM), developed by the University of California at Berkeley. SWIM provides a large set of diverse MapReduce jobs based on production Hadoop traces obtained from Facebook, along with information to enable characterisation of each job. STAC says it undertook the benchmarking because many financial markets firms are deploying Hadoop.
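For readers unfamiliar with the programming model being benchmarked: a MapReduce job splits work into a map phase that emits key/value pairs and a reduce phase that aggregates them by key, which is what lets Hadoop distribute jobs like those in the SWIM workload across a cluster. The following is a minimal illustrative sketch in plain Python of the classic word-count job – not the Hadoop API itself, just the shape of the computation:

```python
from itertools import groupby
from operator import itemgetter

def map_phase(records):
    # Map step: emit a (word, 1) pair for every word in every input record.
    for record in records:
        for word in record.split():
            yield (word.lower(), 1)

def reduce_phase(pairs):
    # Shuffle/sort step: group intermediate pairs by key,
    # then reduce each group by summing its counts.
    counts = {}
    for key, group in groupby(sorted(pairs), key=itemgetter(0)):
        counts[key] = sum(count for _, count in group)
    return counts

lines = ["hadoop runs mapreduce jobs", "symphony runs mapreduce jobs faster"]
print(reduce_phase(map_phase(lines)))
```

In a real Hadoop or Symphony deployment, the map and reduce functions run on many nodes in parallel, and the framework handles partitioning, scheduling and the shuffle between phases – the scheduling being exactly where IBM claims its advantage.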

The hardware environment for the testbed consisted of 17 IBM compute servers and one master server communicating over gigabit Ethernet. STAC compared Hadoop version 1.0.1 to Symphony version 5.2. Both systems ran Red Hat Linux and used largely default configurations.

IBM attributes the superior performance of its offering in part to its scheduling speed. IBM’s Hadoop is API-compatible with the open source offering but runs on the Symphony grid middleware, which IBM gained through its acquisition of Platform Computing, completed in January of this year.

For more information on STAC’s IBM Hadoop benchmark, see here.


Related content

WEBINAR

Recorded Webinar: From Data to Alpha: AI Strategies for Taming Unstructured Data

Unstructured data and text now account for the majority of information flowing through financial markets organisations, spanning research content, corporate disclosures, communications, alternative data, and internal documents. While AI has created new opportunities to extract signals, many firms are discovering that value is constrained not by models, but by the quality of the content, architecture,...

BLOG

When Margin Moves Upstream: How TT is Reworking Trading Decisions After the OpenGamma Deal

More than a month after completing its acquisition of OpenGamma, Trading Technologies is beginning to articulate how the deal is intended to change the way firms think about margin, capital efficiency, and trading decision-making. Rather than positioning margin as a downstream risk or treasury concern, TT is now framing capital efficiency as a front-office variable...

EVENT

Data Management Summit New York City

Now in its 15th year, the Data Management Summit NYC brings together the North American data management community to explore how data strategy is evolving to drive business outcomes and speed to market in changing times.

GUIDE

Enterprise Data Management, 2010 Edition

The global regulatory community has become increasingly aware of the data management challenge within financial institutions, as it struggles with its own challenge of better tracking systemic risk across financial markets. The US regulator in particular is seemingly keen to kick off a standardisation process and also wants the regulatory community to begin collecting additional...