The knowledge platform for the financial technology industry

A-Team Insight Blogs

Summit Report – Latency Measurement Vendors – Nasdaq Wants You!


If you work for a vendor of latency measurement technology, then you should have attended the Low-Latency Summit in New York City the other week. During the pre-lunch keynote, Nasdaq OMX principal technologist Dominick Paniscotti outlined efforts at the exchange group to measure – and so reduce – latency, and also put out a call for “innovative latency management products” to help them in those endeavors.

On Paniscotti’s wish list are products offering:

– High precision time stamping of network packets.
– Real-time ultra-low latency aggregation devices.
– Flexible/programmable correlation algorithms.
– Standardisation of network capture device protocols … allowing interoperability between different vendors.
– Lower cost per message capture and linear scalability.

Paniscotti noted during the presentation that Nasdaq uses a combination of home-grown and vendor products to measure latency. In the latter category, it has deployed latency monitoring from both Corvil and Correlix to provide independent latency statistics for various deployments around the world.

Importantly, he noted how latency has been reduced at Nasdaq in recent times. Specifically, at the beginning of 2009, average latency on Nasdaq was 600 microseconds, whereas today, around 70 microseconds is common.

But he emphasised that the challenge is not just in reducing latency, but also jitter, to ensure higher determinism. Whereas the exchange has focused in the past on average latencies, in the future it plans to focus much more on the outlier statistics it experiences, with the aim of reducing those too.
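Averages can mask exactly the outliers Paniscotti is concerned with. As a minimal illustration (not from the presentation; the sample data is invented), a handful of slow packets barely registers in the mean-adjacent median but dominates the 99th percentile:

```python
import statistics

# Hypothetical latency samples in microseconds: mostly ~70us, plus two outliers.
samples = [68, 71, 69, 72, 70, 69, 450, 71, 70, 68, 73, 900, 69, 70, 71, 72]

mean = statistics.mean(samples)
q = statistics.quantiles(samples, n=100)  # 99 percentile cut points
p50, p99 = q[49], q[98]

print(f"mean={mean:.1f}us  p50={p50:.1f}us  p99={p99:.1f}us")
```

The median stays near 70us while the mean and p99 are dragged far higher, which is why outlier-focused reporting exposes jitter that average figures hide.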

Jitter, he noted, can be introduced by many different components of a system, and reducing it has driven technology innovation. For example, a network switch that was considered a good performer just 18 months ago would now be considered a poor one.

Paniscotti also commented that latency measurement tools need to keep pace with overall latency and jitter reduction, and must not be a source of either. Whereas in the past, software taps and tools such as tcpdump were commonly used, they are no longer accurate enough, and can introduce latency themselves.

And there are additional challenges. Synchronisation of timestamps across distributed systems is a must-have, and often the cause of incorrect measurements. There is also the need to store latency information for long periods for historic analysis: a Big Data problem in a low-latency world.
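To see why synchronisation matters at these scales, consider a toy sketch (all numbers invented) of a one-way latency measurement taken from timestamps on two hosts. A clock offset larger than the latency itself makes the result meaningless:

```python
# Toy illustration: one-way latency measured from timestamps on two
# different hosts is only as good as the synchronisation of their clocks.

true_latency_us = 70.0    # actual one-way network latency (assumed)
clock_offset_us = -120.0  # host B's clock runs 120us behind host A's (assumed)

send_ts_a = 1_000_000.0  # packet departure, stamped on host A's clock
recv_ts_b = send_ts_a + true_latency_us + clock_offset_us  # arrival, on B's clock

measured = recv_ts_b - send_ts_a
print(f"measured one-way latency: {measured:.0f}us")  # prints -50us: clearly wrong
```

With a 120us offset, a genuine 70us latency is reported as a negative number; at the 70us latencies quoted above, clock error must be held well below the quantity being measured, which is what drives hardware timestamping and precision time protocols.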

