About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

ScaleOut Pushes Hadoop Towards Low-Latency for Real-Time Analytics


OK, so the headline is a tad extreme, but bear with me. Recent developments combining in-memory technologies and Hadoop/MapReduce from ScaleOut Software point to a future where big data analytics and real-time processing, as it’s defined in the financial markets, could meet.

ScaleOut has just released ScaleOut hServer V2, an in-memory data grid (IMDG) that it claims can boost Hadoop performance by 20x, making it suitable for processing ‘live data’ to deliver ‘real-time analytics’.

“To minimise execution time, ScaleOut hServer employs numerous optimisations to minimise data motion during the execution of MapReduce applications, and it can automatically cache HDFS data sets within the IMDG (a feature introduced with ScaleOut hServer V1). In addition, ScaleOut hServer’s memory capacity and throughput can be scaled by adding servers to the IMDG’s cluster. The product automatically rebalances the data set and execution workload when servers are added or removed,” says the company in a statement.
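The caching idea in the quote above can be sketched in a few lines of Java. This is a toy illustration, not ScaleOut’s actual API: the class and method names below are hypothetical, and real IMDGs distribute the cache across a server cluster rather than a single process. The point it shows is why iterative MapReduce analytics speed up when a data set is pulled from HDFS once and then served from memory on every subsequent pass.

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;
import java.util.function.Supplier;

// Toy sketch of IMDG-style caching (hypothetical names, not ScaleOut's API):
// the first access loads a data set from slow HDFS-like storage; repeated
// analytics passes over the same key are then served entirely from memory.
public class GridCacheSketch {
    private final Map<String, List<String>> cache = new HashMap<>();
    private int storageReads = 0; // counts simulated trips to backing storage

    // Return the data set for 'key', loading from storage only on first access.
    public List<String> get(String key, Supplier<List<String>> loadFromStorage) {
        return cache.computeIfAbsent(key, k -> {
            storageReads++;
            return loadFromStorage.get();
        });
    }

    public int storageReads() {
        return storageReads;
    }

    public static void main(String[] args) {
        GridCacheSketch grid = new GridCacheSketch();
        Supplier<List<String>> load = () -> List.of("tick1", "tick2", "tick3");
        // Two analytics passes over the same data set: only one storage read.
        grid.get("trades-2013-06", load);
        grid.get("trades-2013-06", load);
        System.out.println(grid.storageReads()); // prints 1
    }
}
```

A production grid adds the behaviour the quote describes on top of this pattern: partitioning the cached set across servers and rebalancing it as servers join or leave.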

As well as boosting performance of a Hadoop deployment, hServer also incorporates Map/Reduce logic so that a Hadoop distribution is not actually required – though the company suggests its offering is not a direct replacement for Hadoop.

Nevertheless, “ScaleOut hServer is designed to be compatible with most Java-based Hadoop Map/Reduce applications developed for the standard Hadoop distributions, requiring only a one-line code change to execute applications using ScaleOut hServer.”

The big picture here is that ScaleOut – as well as other companies pushing in-memory technology – is recognising that the batch-oriented nature of Hadoop has limitations for real-time applications, such as those found in the financial markets.

While ScaleOut is today looking to boost Hadoop performance so that applications that used to take hours or minutes to execute now run in minutes or seconds, the performance trajectory could well follow that of the low-latency space, where milliseconds gave way to microseconds, and now nanoseconds.

The deployment of multi-core and multi-socket servers, GPU technologies and advances in memory will all benefit data grid vendors like ScaleOut, as well as Hadoop and other big data analytics offerings.

