About A-Team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Latency – All About A, B and C (for Compute)


I often describe latency as the time it takes to move data from point A to point B, and/or the time taken to process that data at points A and B. I think it’s true to say that the majority of content on this site is about moving data from A to B. But processing data – the C, or compute, element of latency – is increasingly a focus.
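To make the distinction concrete, here is a minimal sketch in Python of timing just the compute element at point B, separate from any propagation delay. The message format and the `compute_at_b` function are hypothetical, purely for illustration:

```python
import time

def compute_at_b(message: str) -> float:
    # Hypothetical processing step at point B: parse a price field
    # out of a simple key=value message.
    return float(message.split("=")[1])

# Simulate a message that has already arrived from point A.
message = "price=101.25"

start = time.perf_counter_ns()
price = compute_at_b(message)
compute_ns = time.perf_counter_ns() - start

# End-to-end latency = propagation (A -> B) + compute at A and B;
# this measurement captures only the compute portion at B.
print(f"parsed price {price} in {compute_ns} ns")
```

In a real system the propagation leg would be measured separately (for example, with synchronised timestamps at A and B), so the two components can be attributed and optimised independently.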

The computing in low latency processing takes many forms. It can be related to data manipulation, such as the conversion of message formats; or data management, such as working with a time series database; or numerical processing, such as calculating an order price or size.
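Each of those three forms can be sketched in a few lines. The following Python fragment is illustrative only – the message layout, tick store, and sizing rule are hypothetical stand-ins, not any particular vendor's API:

```python
from statistics import mean

# 1. Data manipulation: convert a pipe-delimited message into a dict.
def to_dict(raw: str) -> dict:
    return dict(field.split("=") for field in raw.split("|"))

# 2. Data management: fetch the last n ticks from an in-memory time series
#    (a real system would query a time series database such as kdb+).
def last_n(ticks: list, n: int) -> list:
    return ticks[-n:]

# 3. Numerical processing: size an order from a moving average of prices.
def order_size(prices: list, capital: float) -> int:
    return int(capital / mean(prices))

msg = to_dict("sym=ABC|px=100.5|qty=200")
ticks = [100.0, 100.5, 101.0, 100.8]
size = order_size(last_n(ticks, 3), 10_000.0)
```

Trivial as each step looks, in a trading path every one of them contributes to the compute element of latency, which is why they are targets for optimisation.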

With the latency related to moving data – propagation latency – now well understood, architects and developers are increasingly focused on the latency introduced by trading applications themselves, and minimising this compute element is very much the goal of their efforts.

Tackling this application latency is very much a requirement for “Intelligent Trading” – making the right trade in a timely manner, which does not always mean being the fastest.

Reducing application latency is not just about software. The hardware platform upon which applications run plays a crucial role, even though the software geeks often wince at solving a challenge through faster hardware.

As an example, recent news from DataDirect Networks on its STAC-M3 benchmark – processing tick histories managed by Kx Systems’ kdb+ database running on its SFA12K-40 hybrid flash/spinning disk ‘Big Data’ platform – demonstrates the role of hardware in directly boosting application performance.

We’ll be covering this topic increasingly within the Low-Latency.com community. It will also be a big focus at our May 1 Low-Latency Summit, taking place in New York City.
