Latency – All About A, B and C (for Compute)

I often describe latency as the time it takes to move data from point A to point B, and/or the time taken to process that data at points A and B. I think it’s true to say that the majority of content on this site is about moving data from A to B. But processing data – the C, or compute, element of latency – is increasingly a focus.
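
To make that decomposition concrete, here’s a back-of-envelope sketch in Python. The numbers are invented purely for illustration, not measurements from any system:

    # Back-of-envelope decomposition of end-to-end latency (made-up numbers):
    #   total = compute at A + propagation from A to B + compute at B
    compute_a_us = 5.0      # e.g. building and encoding an order at point A
    propagation_us = 50.0   # e.g. wire and switch time between A and B
    compute_b_us = 20.0     # e.g. decoding and acting on the message at point B
    total_us = compute_a_us + propagation_us + compute_b_us
    print(f"end-to-end: {total_us} microseconds")  # 75.0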

The computing in low-latency processing takes many forms. It can be related to data manipulation, such as the conversion of message formats; or data management, such as working with a time series database; or numerical processing, such as calculating an order price or size.
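
As a minimal Python sketch of timing two such compute steps – a message-format conversion and a simple order-size calculation – consider the following. The functions and figures are illustrative, not drawn from any particular trading system:

    import json
    import time

    def convert_message(raw: bytes) -> dict:
        # data manipulation: convert a wire format (JSON here) into an internal structure
        return json.loads(raw)

    def order_size(notional: float, price: float) -> int:
        # numerical processing: size an order from a notional amount and a price
        return int(notional / price)

    raw = b'{"symbol": "ABC", "price": 101.25}'
    start = time.perf_counter()
    msg = convert_message(raw)
    size = order_size(1_000_000.0, msg["price"])
    elapsed_us = (time.perf_counter() - start) * 1e6
    print(f"converted and sized {size} shares in {elapsed_us:.1f} microseconds")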

With the latency related to moving data – propagation latency – well understood, architects and developers are increasingly focused on the latency introduced by trading applications themselves, and minimising this compute element is very much the goal of that work.

Tackling this application latency is very much a requirement for “Intelligent Trading” – making the right trade in a timely manner, even if not always being the fastest.

Reducing application latency is not just about software. The hardware platform upon which applications run plays a crucial role, even though software geeks often wince at solving a challenge with faster hardware.

As an example, recent news from DataDirect Networks relating to the STAC-M3 benchmark – which involved processing tick histories managed by Kx Systems’ kdb+ database running on DDN’s SFA12K-40 hybrid flash/spinning disk ‘Big Data’ platform – demonstrates the role of hardware in directly boosting application performance.
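
For a sense of why the storage platform matters here, consider the shape of a typical tick-history query. This is illustrative Python only – not the STAC-M3 benchmark itself, and not kdb+ code – but on billions of rows, the rate at which storage can feed this kind of aggregation dominates the runtime:

    # A toy volume-weighted average price (VWAP) over a handful of ticks.
    # A real tick-history run would stream millions of (price, size) rows
    # from storage, which is where the disk platform earns its keep.
    trades = [(101.20, 300), (101.25, 500), (101.30, 200)]
    notional = sum(price * size for price, size in trades)
    volume = sum(size for _, size in trades)
    print(f"VWAP: {notional / volume:.4f}")  # 101.2450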

We’ll be covering this topic increasingly within the Low-Latency.com community. It will also be a big focus at our May 1 Low-Latency Summit, taking place in New York City.
