
In-Memory Heats Up for Low Latency and Big Data


Last week I was in Orlando for SAP’s SAPPHIRE NOW event, where the main focus was not on the company’s highly successful business and data warehouse applications – revenues in 2012 were around $20 billion – but on a product introduced in late 2010 and said to be its fastest selling ever. Mysteriously named HANA – some say in a nod to the company’s founder and current chairman Hasso Plattner, while others suggest it stands for High Performance Analytics Appliance – the product is an in-memory database, and it is a key player in a technology space that is heating up fast, with relevance to both low latency and big data applications.

Performance is the driver of the move to in-memory. Accessing data in a server’s RAM (Random Access Memory) is about 100,000 times faster than accessing it from a hard disk. Since most trading applications – especially those implementing intelligent trading approaches – need to access some data to augment what might be contained in a price update, keeping that data in RAM is going to reduce tick-to-trade latency.
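As a rough illustration of why this matters, here is a minimal Python sketch, my own rather than anything from SAP, of the enrichment pattern described above: reference data held in a dict in RAM versus re-read from disk on every tick. The file name, symbol and loop count are invented, and on a modern machine the operating system’s file cache narrows the gap well below the 100,000x figure, but the structural difference is the point.

import json
import time

# Reference data persisted on disk (written once here so the example is
# self-contained; the file name and symbol are invented).
with open("refdata.json", "w") as f:
    json.dump({"VOD.L": {"lot_size": 1000, "tick_size": 0.01}}, f)

# In-memory copy: loaded once, after which every lookup is a RAM hash-table hit.
with open("refdata.json") as f:
    refdata_in_memory = json.load(f)

def enrich_from_memory(tick):
    return {**tick, **refdata_in_memory[tick["symbol"]]}

def enrich_from_disk(tick):
    # Re-reads the file on every tick: the disk-centric pattern.
    with open("refdata.json") as f:
        refdata = json.load(f)
    return {**tick, **refdata[tick["symbol"]]}

tick = {"symbol": "VOD.L", "price": 72.34}

for fn in (enrich_from_memory, enrich_from_disk):
    start = time.perf_counter()
    for _ in range(10_000):
        fn(tick)
    print(fn.__name__, f"{time.perf_counter() - start:.4f}s")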

Truth be told, in-memory is nothing new. Just about all traditional disk-oriented databases – including those that came to SAP in its 2010 acquisition of Sybase, and financial markets-oriented offerings from Kx Systems – cache data in memory as part of a data retrieval process that is hard disk centric. HANA, though, is designed so that RAM is its primary data store, with hard disk (and non-volatile flash memory) for persistence.
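A minimal sketch of that division of labour, under my own assumptions rather than anything HANA-specific: reads and writes are served from a dict in RAM, while an append-only log on disk exists purely so the data survives a restart. Names such as InMemoryStore and store.log are hypothetical.

import json
import os

class InMemoryStore:
    """Keeps the primary copy of the data in RAM; the log exists only for persistence."""

    def __init__(self, log_path="store.log"):
        self.log_path = log_path
        self.data = {}                          # primary store: a dict in RAM
        if os.path.exists(log_path):            # rebuild RAM state after a restart
            with open(log_path) as log:
                for line in log:
                    key, value = json.loads(line)
                    self.data[key] = value

    def put(self, key, value):
        self.data[key] = value                  # write lands in RAM first
        with open(self.log_path, "a") as log:   # then is appended to disk for durability
            log.write(json.dumps([key, value]) + "\n")

    def get(self, key):
        return self.data.get(key)               # reads never touch the disk

store = InMemoryStore()
store.put("VOD.L", {"last": 72.34})
print(store.get("VOD.L"))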

Even as an in-memory database, HANA is not that new, or unique. One of the earliest in-memory databases to enjoy commercial success (especially in the financial markets) was the TimesTen offering, spun out of Hewlett-Packard in 1996 (and acquired by Oracle in 2005).

What is new is that technology advances have made in-memory approaches more usable and cost effective. 64-bit processor architectures can now address much larger memory address spaces, servers can now pack in many terabytes of RAM, network protocols like RDMA can now connect servers together with very low latency so in-memory can scale out, and – last but not least – RAM is getting cheaper and cheaper.
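A quick back-of-the-envelope calculation (mine, not the article’s) shows why the move to 64-bit addressing matters for RAM-resident data:

# Addressable memory with 32-bit versus 64-bit pointers.
GIB = 2**30
addressable_32bit = 2**32            # 4 GiB: too small for a serious in-memory database
addressable_64bit = 2**64            # 16 EiB: addressability is no longer the constraint
print(addressable_32bit // GIB, "GiB addressable with 32-bit pointers")
print(addressable_64bit // 2**60, "EiB addressable with 64-bit pointers")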

SAP isn’t the only company with an in-memory offering, as several vendors have been turned on to its promise. Those include Software AG’s Terracotta unit (now under the stewardship of former Tibco Software exec Robin Gilthorpe), Tibco with ActiveSpaces, GigaSpaces Technologies with XAP, McObject with eXtremeDB, GridGain, ScaleOut Software, and the new EMC/VMware Pivotal venture, which has sucked in GemFire as a key component.

Interestingly, in-memory is being explored not only by the low-latency world, but also by those looking to leverage big data approaches, such as Hadoop, and finding performance is an issue. Data warehouse vendor Teradata recently introduced its Intelligent Memory offering, which adds an in-memory component to its hard disk/flash product. Teradata determined that its users typically run 90% of their queries against just 20% of the data they store, and it uses algorithms to keep what it terms ‘hot’ data in RAM for fast retrieval.
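The general shape of such a hot-data scheme can be sketched as follows. This is a hedged illustration of the 90/20 idea, not Teradata’s actual algorithm, and the HotDataCache class, capacity and promotion threshold are invented for the example.

from collections import Counter

class HotDataCache:
    """Keeps frequently queried keys ('hot' data) in a RAM tier in front of a slower store."""

    def __init__(self, cold_store, hot_capacity=10, promote_after=3):
        self.cold_store = cold_store        # stands in for the disk/flash tier
        self.hot = {}                       # the in-memory tier
        self.hits = Counter()               # query frequency per key
        self.hot_capacity = hot_capacity
        self.promote_after = promote_after  # invented threshold for the sketch

    def get(self, key):
        self.hits[key] += 1
        if key in self.hot:                 # served from RAM
            return self.hot[key]
        value = self.cold_store[key]        # served from the slow tier
        if self.hits[key] >= self.promote_after:
            if len(self.hot) >= self.hot_capacity:
                # Evict the least frequently hit resident key to make room.
                coldest = min(self.hot, key=lambda k: self.hits[k])
                del self.hot[coldest]
            self.hot[key] = value           # key is now 'hot': keep it in RAM
        return value

cold = {f"row{i}": i for i in range(1000)}
cache = HotDataCache(cold)
for _ in range(5):
    cache.get("row7")                       # queried repeatedly, so it gets promoted
print("row7 held in the RAM tier:", "row7" in cache.hot)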

This hot data approach to cutting big data down to a more manageable size underpins most of the in-memory offerings, including HANA, though the implementation differs from product to product, and it is certainly a ‘devil in the detail’ aspect to understand when implementing this technology. Watch this space …

