About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Beware Latency Monitoring on the Cheap

Interesting to read a couple of reports of late regarding a “breakthrough” in the world of network latency monitoring – apparently boffins at a couple of universities have come up with an inexpensive way to measure network delays at the tens-of-microseconds level – and they reckon that Wall Street is going to be very interested in what they’re up to. I guess I am always a bit wary of doing things on the cheap … you get what you pay for, as the saying goes.

The Lossy Difference Aggregator, or LDA, is the product of work carried out at the University of California, San Diego and Purdue University in Indiana. It proposes adding functionality to network routers to provide a “good estimate” of delay by sampling arrival and departure times of packets flowing through a router. For you geeks, some more detail is here.
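For the geeks, here’s a rough sketch of the core idea as I understand it from the published research – this is my own illustrative Python, not the actual router implementation. Each side of a link hashes packets into buckets and keeps only a running timestamp sum and packet count per bucket; buckets whose counts disagree (i.e. where loss occurred) are thrown away, and the mean delay is recovered from the clean ones:

```python
import math
from collections import defaultdict

NUM_BUCKETS = 64  # illustrative size; the real scheme tunes this to expected loss

def bucket(pkt_id: int) -> int:
    # Sender and receiver must hash each packet to the same bucket.
    return hash(pkt_id) % NUM_BUCKETS

class LDASide:
    """One end of the link: per-bucket timestamp sums and packet counts."""
    def __init__(self):
        self.ts_sum = defaultdict(float)
        self.count = defaultdict(int)

    def record(self, pkt_id: int, timestamp: float):
        b = bucket(pkt_id)
        self.ts_sum[b] += timestamp
        self.count[b] += 1

def estimate_mean_delay(sender: LDASide, receiver: LDASide) -> float:
    # A bucket is usable only if both sides saw the same number of packets;
    # any loss makes its timestamp sums incomparable, so it is discarded.
    total_delay, total_pkts = 0.0, 0
    for b in range(NUM_BUCKETS):
        if sender.count[b] == receiver.count[b] and sender.count[b] > 0:
            total_delay += receiver.ts_sum[b] - sender.ts_sum[b]
            total_pkts += sender.count[b]
    return total_delay / total_pkts if total_pkts else math.nan
```

The appeal is obvious: constant memory per bucket regardless of traffic volume, no per-packet state, so it’s cheap enough to bolt onto a router. The catch, as discussed below, is that it yields an average over an interval, not per-packet measurements.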

I started wondering whether this approach was likely one day to impact the business of those vendors pushing passive approaches to latency monitoring to the financial markets – companies like Correlix, Corvil, Trading Systems Associates and Endace (though the latter appears to have refocused on other verticals). And while I think this research has merit, I don’t think LDA is ready for Wall Street. Because while it’s good, it’s not quite good enough.

For one thing, current passive approaches – and yes, they cost a bit – can analyse delays in the nanosecond range – tens to hundreds, depending on what’s being measured. And another factor, pointed out by TS-A’s Henry Young, is that LDA just measures network delays at the Data Link Layer (Layer 2), and so does not take into account reliable messaging protocols that sit at the Transport Layer (Layer 4) or latencies that occur higher in the ‘stack’ – which for trading applications is probably where most latency (and related jitter) occurs.

I can see LDA being useful for some general-purpose router diagnostics, and Cisco Systems, which provided a grant to part-fund this research, might look to do that one day. But methinks they won’t be pushing this for more serious latency monitoring. For that, they’re more likely to recommend Corvil, in which they have an equity stake.
