In-Memory Heats Up for Low Latency and Big Data

Last week I was in Orlando for SAP’s SAPPHIRE NOW event, where the main focus was not on the company’s highly successful business and data warehouse applications – revenues in 2012 were around $20 billion – but on a product introduced in late 2010 and said to be its fastest selling ever. Mysteriously named HANA – some say in a nod to the company’s founder and current chairman Hasso Plattner, while others suggest it stands for High Performance Analytics Appliance – the product is an in-memory database, and it is a key player in a technology space that is heating up fast, with uses spanning both low-latency trading and big data analytics.

Performance is the driver of the move to in-memory. Accessing data in a server’s RAM (Random Access Memory) is about 100,000 times faster than accessing it from a hard disk. Since most trading applications – especially those implementing intelligent trading approaches – need to access some data to augment what might be contained in a price update, keeping that data in RAM is going to reduce tick-to-trade latency.
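
By way of illustration, below is a minimal sketch in Java of the pattern described above: reference data pre-loaded into an in-memory map so that a price update can be enriched without touching disk on the tick-to-trade path. The class, field and symbol names are purely illustrative and not drawn from any particular product.

import java.util.HashMap;
import java.util.Map;

// Minimal sketch: enrich a price tick from an in-memory reference data map,
// so no disk access happens on the tick-to-trade path. Names are illustrative.
public class InMemoryEnrichment {

    // Reference data record; in practice this would carry far more fields.
    record RefData(String isin, double tickSize, int lotSize) {}

    // Pre-loaded at startup (e.g. from a database or file), then kept hot in RAM.
    private static final Map<String, RefData> REF_DATA = new HashMap<>();

    static {
        REF_DATA.put("ACME", new RefData("US000000AC11", 0.01, 100));
    }

    // Called on every price update: a RAM lookup, roughly nanoseconds,
    // versus milliseconds if the same data had to be fetched from disk.
    static double notionalForTick(String symbol, double price, int quantity) {
        RefData ref = REF_DATA.get(symbol);
        if (ref == null) {
            throw new IllegalStateException("Unknown symbol: " + symbol);
        }
        return price * quantity * ref.lotSize();
    }

    public static void main(String[] args) {
        System.out.println(notionalForTick("ACME", 101.25, 5)); // 50625.0
    }
}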

Truth be told, in-memory is nothing new. Just about all traditional disk-oriented databases – including those that came to SAP in its 2010 acquisition of Sybase, and financial markets-oriented offerings from Kx Systems – cache data in memory as part of a data retrieval process that is hard disk centric. HANA, though, is designed so that RAM is its primary data store, with hard disk (and non-volatile flash memory) for persistence.
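
To make that architectural distinction concrete, here is a minimal sketch, again in Java and with illustrative names, of a RAM-primary store: reads and writes are served entirely from memory, and disk appears only as an append-only journal kept for persistence and recovery. It is a sketch of the general pattern, not of how HANA itself is built.

import java.io.BufferedWriter;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Minimal sketch of the RAM-primary pattern: all reads and writes are served
// from an in-memory map; disk is only an append-only log kept for persistence
// and recovery. File name and key/value types are illustrative.
public class RamPrimaryStore implements AutoCloseable {

    private final Map<String, String> store = new ConcurrentHashMap<>();
    private final BufferedWriter journal;

    RamPrimaryStore(Path journalPath) throws IOException {
        this.journal = Files.newBufferedWriter(
                journalPath, StandardOpenOption.CREATE, StandardOpenOption.APPEND);
    }

    // Write path: update RAM (the primary store), then journal to disk for durability.
    void put(String key, String value) throws IOException {
        store.put(key, value);
        journal.write(key + "=" + value);
        journal.newLine();
        journal.flush(); // a real engine would batch or group-commit here
    }

    // Read path: RAM only; the disk journal is never consulted.
    String get(String key) {
        return store.get(key);
    }

    @Override
    public void close() throws IOException {
        journal.close();
    }

    public static void main(String[] args) throws IOException {
        try (RamPrimaryStore db = new RamPrimaryStore(Path.of("journal.log"))) {
            db.put("EURUSD", "1.3050");
            System.out.println(db.get("EURUSD")); // served from RAM
        }
    }
}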

Even as an in-memory database, HANA is not that new, or unique. One of the earliest in-memory databases to enjoy commercial success (especially in the financial markets) was TimesTen, spun out of Hewlett-Packard in 1996 (and acquired by Oracle in 2005).

What is new is that technology advances have made in-memory approaches more usable and cost effective. 64-bit processor architectures can address memory far beyond the 4 GB ceiling of 32-bit systems, servers can now pack in many terabytes of RAM, network protocols like RDMA can connect servers together with very low latency so in-memory deployments can scale out, and – last but not least – RAM is getting cheaper and cheaper.

SAP isn’t the only company with an in-memory offering, as several vendors have been turned on to its promise. Those include Software AG’s Terracotta unit (now under the stewardship of former Tibco Software exec Robin Gilthorpe), Tibco with ActiveSpaces, GigaSpaces Technologies with XAP, McObject’s eXtremeDB, GridGain, ScaleOut Software, and the new EMC/VMware Pivotal venture, which has sucked in GemFire as a key component.

Interestingly, in-memory is being explored not only by the low-latency world, but also by those looking to leverage big data approaches, such as Hadoop, and finding that performance is an issue. Data warehouse vendor Teradata recently introduced its Intelligent Memory offering, which adds an in-memory component to its hard disk/flash product. Teradata determined that its users’ implementations make 90% of data queries against just 20% of the data stored, so Intelligent Memory uses algorithms to keep what it terms ‘hot’ data in RAM for fast retrieval.
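
Teradata has not published the detail of those algorithms, but a simple least-recently-used cache gives a flavour of the general ‘hot data’ idea: recently touched rows stay in RAM while cold rows age out to the slower tier. The Java sketch below is illustrative only and is not Teradata’s (or any vendor’s) implementation.

import java.util.LinkedHashMap;
import java.util.Map;

// Illustrative only: a tiny LRU cache standing in for the general 'hot data'
// idea. Vendors' actual algorithms are more sophisticated and not public; this
// just shows hot rows staying in RAM while cold ones age out.
public class HotDataCache<K, V> extends LinkedHashMap<K, V> {

    private final int maxHotEntries;

    public HotDataCache(int maxHotEntries) {
        // accessOrder = true: iteration order tracks recency of access, not insertion.
        super(16, 0.75f, true);
        this.maxHotEntries = maxHotEntries;
    }

    // Evict the least recently used entry once the hot set exceeds its budget;
    // in a real system the evicted row would remain available on disk or flash.
    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > maxHotEntries;
    }

    public static void main(String[] args) {
        HotDataCache<String, String> hot = new HotDataCache<>(2);
        hot.put("row-1", "a");
        hot.put("row-2", "b");
        hot.get("row-1");          // touch row-1, so it stays hot
        hot.put("row-3", "c");     // row-2 is now the coldest and is evicted
        System.out.println(hot.keySet()); // [row-1, row-3]
    }
}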

This hot data approach to cutting big data down to (less) size underpins most of the in-memory offerings, including HANA, though the implementation differs from product to product, and is certainly a ‘devil in the detail’ aspect to be understood when implementing this technology. Watch this space …
