Summit Report – Latency Measurement Vendors – Nasdaq Wants You!

If you work for a vendor of latency measurement technology, then you should have attended the Low-Latency Summit in New York City the other week. During the pre-lunch keynote, Nasdaq OMX principal technologist Dominick Paniscotti outlined efforts at the exchange group to measure – and so reduce – latency, and also put out a call for “innovative latency management products” to help it in those endeavours.

On Paniscotti’s wish list are products offering:

– High precision time stamping of network packets (see the sketch after this list).
– Real-time ultra-low latency aggregation devices.
– Flexible/programmable correlation algorithms.
– Standardisation of network capture device protocols … allowing interoperability between different vendors.
– Lower cost per message capture and linear scalability.
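On the first of those, high-precision timestamping is typically requested from the kernel or NIC via socket options. Below is a minimal, Linux-only Python sketch using software kernel timestamps (SO_TIMESTAMPNS) – an illustration only, not any vendor's product; the numeric constant comes from the Linux headers and the port is arbitrary. Hardware NIC timestamping (SO_TIMESTAMPING) follows the same recvmsg pattern but stamps packets in the adapter itself, closer to the wire.

```python
# Minimal Linux-only sketch: per-packet kernel receive timestamps via
# SO_TIMESTAMPNS. Hardware NIC timestamping (SO_TIMESTAMPING) uses the
# same recvmsg pattern. Constant and port are illustrative assumptions.
import socket
import struct

SO_TIMESTAMPNS = 35   # value from <asm-generic/socket.h> on Linux
SCM_TIMESTAMPNS = SO_TIMESTAMPNS

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.SOL_SOCKET, SO_TIMESTAMPNS, 1)
sock.bind(("0.0.0.0", 5000))

while True:
    # The ancillary data carries a struct timespec stamped by the kernel
    # when the packet arrived, not when user space got around to reading it.
    data, ancdata, msg_flags, addr = sock.recvmsg(2048, 1024)
    for cmsg_level, cmsg_type, cmsg_data in ancdata:
        if cmsg_level == socket.SOL_SOCKET and cmsg_type == SCM_TIMESTAMPNS:
            sec, nsec = struct.unpack("qq", cmsg_data[:16])
            print(f"{len(data)} bytes from {addr} at {sec}.{nsec:09d}")
```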

Paniscotti noted during the presentation that Nasdaq uses a combination of home-grown and vendor products to measure latency. In the latter category, it has deployed latency monitoring from both Corvil and Correlix to provide independent latency statistics for various deployments around the world.

Importantly, he noted how latency has been reduced at Nasdaq in recent times. Specifically, at the beginning of 2009, average latency on Nasdaq was 600 microseconds, whereas today, around 70 microseconds is common.

But he emphasised that the challenge is not just in reducing latency, but also jitter, to ensure higher determinism. Whereas the exchange has focused in the past on average latencies, in the future it plans to focus much more on the outlier statistics it experiences, with a view to reducing those too.
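To make the averages-versus-outliers point concrete, here's an illustrative Python snippet (all numbers hypothetical) showing how a distribution with a healthy-looking mean can still hide a poor tail:

```python
# Illustrative only: why tail percentiles matter more than averages.
# Sample latencies are hypothetical values in microseconds.
import statistics

def latency_profile(samples_us):
    ordered = sorted(samples_us)
    pick = lambda p: ordered[min(len(ordered) - 1, int(p / 100 * len(ordered)))]
    return {
        "mean": statistics.mean(ordered),
        "p50": pick(50),
        "p99": pick(99),
        "p99.9": pick(99.9),
        "stdev (jitter proxy)": statistics.pstdev(ordered),
    }

# Mostly 70us with a 1% spike to 600us.
samples = [70] * 990 + [600] * 10
for name, value in latency_profile(samples).items():
    print(f"{name:>20}: {value:.1f} us")
```

Here the mean comes out around 75 microseconds, yet one packet in a hundred waits 600 – exactly the sort of outlier a determinism-focused programme has to chase down.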

Jitter, he noted, can be introduced by many different components of a system, and reducing it has driven technology innovation. For example, a network switch that was considered a good performer just 18 months ago would now be considered a poor one.

Paniscotti also commented that latency measurement tools need to keep pace with overall latency and jitter reduction, and must not themselves be a source of either. Whereas in the past, software taps and tools such as tcpdump were commonly used, they are no longer accurate enough and can introduce latency of their own.
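The observer-effect point is easy to demonstrate: merely taking a timestamp costs time. A hypothetical micro-check in Python (results vary wildly by host and language, and a hardware tap avoids the problem entirely, but the principle holds):

```python
# Hypothetical micro-check: software instrumentation is never free, because
# even reading a clock takes time. Results vary by host; illustrative only.
import time

N = 1_000_000
t0 = time.perf_counter_ns()
for _ in range(N):
    time.perf_counter_ns()
t1 = time.perf_counter_ns()
print(f"~{(t1 - t0) / N:.0f} ns per timestamp call")
```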

And there are additional challenges. Synchronisation of timestamps across distributed systems is a must-have, and is often the cause of incorrect measurements. There's also the need to store latency information for long periods for historic analysis: a Big Data problem in a low-latency world.
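To see why synchronisation is the weak point, consider the classic NTP-style offset estimate. In the sketch below (hypothetical timestamps in microseconds), a server clock running 500 microseconds fast makes a naive one-way measurement read 550 microseconds when the true wire latency is only 50:

```python
# NTP-style clock offset and round-trip estimation between two hosts.
# t1: client send, t2: server receive, t3: server send, t4: client receive.
# All timestamps are hypothetical, in microseconds.

def offset_and_rtt(t1, t2, t3, t4):
    offset = ((t2 - t1) + (t3 - t4)) / 2  # estimated server-minus-client skew
    rtt = (t4 - t1) - (t3 - t2)           # round trip, excluding server hold time
    return offset, rtt

# Server clock is 500us fast; true one-way latency is 50us each way.
t1, t2, t3, t4 = 0, 550, 560, 110
offset, rtt = offset_and_rtt(t1, t2, t3, t4)
print(f"naive one-way: {t2 - t1} us")         # reads 550us, which is wrong
print(f"estimated offset: {offset:.0f} us")   # recovers the 500us skew
print(f"corrected one-way: {t2 - t1 - offset:.0f} us, rtt: {rtt} us")
```

The same arithmetic underpins PTP-style time synchronisation; without some version of it, sub-100 microsecond one-way figures across machines are meaningless.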
