Summit Report – Latency Measurement Vendors – Nasdaq Wants You!

If you work for a vendor of latency measurement technology, then you should have attended the Low-Latency Summit in New York City the other week. During the pre-lunch keynote, Nasdaq OMX principal technologist Dominick Paniscotti outlined efforts at the exchange group to measure – and so reduce – latency, and also put out a call for “innovative latency management products” to help them in those endeavors.

On Paniscotti’s wish list are products offering:

– High precision time stamping of network packets.
– Real-time ultra-low latency aggregation devices.
– Flexible/programmable correlation algorithms (see the sketch after this list).
– Standardisation of network capture device protocols … allowing interoperability between different vendors.
– Lower cost per message capture and linear scalability.
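
On the correlation point, here is a minimal sketch of what such an algorithm does at its core: match the same message observed at two capture points and difference the timestamps. The capture records, field layout and message ids below are hypothetical, for illustration only, not any vendor's format.

```python
# Minimal sketch: correlate the same message observed at two capture
# points and compute per-message latency. All records are hypothetical.

# Each record: (message_id, timestamp in nanoseconds since epoch)
captures_gateway = [("ord-1", 1_000_000_000), ("ord-2", 1_000_050_000)]
captures_engine = [("ord-1", 1_000_070_000), ("ord-2", 1_000_115_000)]

def correlate(point_a, point_b):
    """Join two capture streams on message id; latency = later minus earlier."""
    first_seen = dict(point_a)
    return {msg_id: ts_b - first_seen[msg_id]
            for msg_id, ts_b in point_b if msg_id in first_seen}

for msg_id, ns in correlate(captures_gateway, captures_engine).items():
    print(f"{msg_id}: {ns / 1000:.1f} us")  # ord-1: 70.0 us, ord-2: 65.0 us
```

In production the join key would come from protocol fields deep in the packet, which is why the wish list pairs this item with high-precision timestamping and standardised capture protocols.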

Paniscotti noted during the presentation that Nasdaq uses a combination of home-grown and vendor products to measure latency. In the latter category, it has deployed latency monitoring from both Corvil and Correlix to provide independent latency statistics for various deployments around the world.

Importantly, he noted how latency has been reduced at Nasdaq in recent times. Specifically, at the beginning of 2009, average latency on Nasdaq was 600 microseconds, whereas today, around 70 microseconds is common.

But he emphasised that the challenge is not just reducing latency, but also jitter, to ensure higher determinism. Whereas the exchange has focused in the past on average latencies, in the future it plans to focus much more on the outlier statistics it experiences, with a view to reducing those too.
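
To see why that shift from averages to outliers matters, here is a minimal sketch using made-up latency samples: a single slow message drags the mean away from the typical case, and only percentile statistics reveal both the typical experience and the tail.

```python
# Minimal sketch: mean vs. outlier (percentile) latency statistics.
# The sample values are illustrative, not Nasdaq data.
samples_us = [68, 70, 71, 69, 72, 70, 69, 71, 70, 950]  # one outlier

def percentile(data, pct):
    """Nearest-rank percentile of a list of numbers."""
    ordered = sorted(data)
    rank = max(1, round(pct / 100 * len(ordered)))
    return ordered[rank - 1]

mean_us = sum(samples_us) / len(samples_us)
print(f"mean   : {mean_us:.0f} us")                 # 158 us: describes nobody
print(f"median : {percentile(samples_us, 50)} us")  # 70 us: the typical case
print(f"p99    : {percentile(samples_us, 99)} us")  # 950 us: the outlier tail
```

The mean here describes neither the typical message nor the worst one, which is exactly the case for tracking outlier statistics directly.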

Jitter, he noted, can be introduced by many different components of a system, and its reduction has driven technology innovation. For example, a network switch that was considered a good performer just 18 months ago would now be considered a bad one.

Paniscotti also commented that latency measurement tools need to keep pace with overall latency and jitter reduction, and must not be a source of either. Whereas in the past software taps and tools such as tcpdump were commonly used, they are no longer accurate enough, and can introduce latency themselves.
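
That point, that the instrument must not perturb the measurement, can be seen even in the cheapest possible software instrument. A minimal sketch, assuming only the Python standard library, times back-to-back reads of a high-resolution clock to expose the instrument's own cost and jitter:

```python
import time

# Minimal sketch: measure the overhead and jitter of software timestamping
# itself, by timing back-to-back reads of a high-resolution clock.
samples = []
for _ in range(100_000):
    t0 = time.perf_counter_ns()
    t1 = time.perf_counter_ns()
    samples.append(t1 - t0)  # cost of one clock read, in nanoseconds

samples.sort()
print(f"min: {samples[0]} ns")
print(f"p50: {samples[len(samples) // 2]} ns")
print(f"p99: {samples[int(len(samples) * 0.99)]} ns")  # the tool's own jitter
```

If the observer's own p99 is a meaningful fraction of the latency being measured, as it is for software taps against today's double-digit-microsecond exchange latencies, the measurement is suspect; hence the wish-list demand for hardware timestamping of network packets.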

And there are additional challenges. Synchronisation of timestamps across distributed systems is a must have, and is often the cause of incorrect measurements. There's also the need to store latency information for long periods for historic analysis: a Big Data problem in a low-latency world.
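
On the synchronisation point: subtracting timestamps taken on two machines measures clock offset as much as latency. A minimal sketch of the standard NTP-style offset estimate from a request/response exchange; the four timestamps are illustrative values, and the correction assumes a symmetric network path:

```python
# Minimal sketch: NTP-style clock-offset estimation between two hosts.
# t1: request sent (client clock)    t2: request received (server clock)
# t3: response sent (server clock)   t4: response received (client clock)
# Values below are illustrative microsecond timestamps.
t1, t2, t3, t4 = 100.0, 180.0, 181.0, 161.0

offset = ((t2 - t1) + (t3 - t4)) / 2  # server clock minus client clock
round_trip = (t4 - t1) - (t3 - t2)    # network delay, both directions

print(f"estimated offset : {offset:.1f} us")      # 50.0 us
print(f"round-trip delay : {round_trip:.1f} us")  # 60.0 us
# A naive one-way latency (t2 - t1 = 80 us) is wrong by the 50 us offset;
# corrected: t2 - t1 - offset = 30 us, assuming a symmetric path.
```

A naive one-way figure here would be off by well over the 70 microseconds Nasdaq now sees end to end, which is why unsynchronised clocks so often produce incorrect measurements.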
