
A-Team Insight Blogs

Beware Latency Monitoring on the Cheap


Interesting to read a couple of reports of late regarding a “breakthrough” in the world of network latency monitoring – apparently boffins at a couple of universities have come up with an inexpensive way to measure network delays at the tens-of-microseconds level – and they reckon that Wall Street is going to be very interested in what they’re up to. I guess I am always a bit wary of doing things on the cheap … you get what you pay for, as the saying goes.

The Lossy Difference Aggregator, or LDA, comes out of work carried out at the University of California, San Diego and Purdue University in Indiana. It proposes adding functionality to network routers to provide a “good estimate” of delay by sampling arrival and departure times of packets flowing through a router. For you geeks, some more detail is here.
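For the geeks who want a feel for the idea without reading the paper: the core trick is that sender and receiver each hash packets into a handful of counter “banks”, keeping only a timestamp sum and a packet count per bank. Banks whose counts match on both sides saw no loss, so their sum difference divided by the count gives a mean delay estimate. The sketch below is my own illustrative toy (names, bank count and loss rate are all assumptions, not taken from the paper), simulated in Python rather than on a router:

```python
import random

NUM_BANKS = 1024  # illustrative; the real scheme tunes this to expected loss

def bank_of(packet_id: int) -> int:
    # Stand-in for the hash function both sides agree on.
    return hash(packet_id) % NUM_BANKS

class LDASide:
    """One end of the link: per-bank timestamp sums and packet counts."""
    def __init__(self):
        self.sums = [0.0] * NUM_BANKS
        self.counts = [0] * NUM_BANKS

    def record(self, packet_id: int, timestamp: float) -> None:
        b = bank_of(packet_id)
        self.sums[b] += timestamp
        self.counts[b] += 1

def estimate_delay(sender: LDASide, receiver: LDASide) -> float:
    # Use only banks untouched by loss, i.e. counts agree on both sides.
    total_diff, total_count = 0.0, 0
    for b in range(NUM_BANKS):
        if sender.counts[b] == receiver.counts[b] and sender.counts[b] > 0:
            total_diff += receiver.sums[b] - sender.sums[b]
            total_count += sender.counts[b]
    return total_diff / total_count if total_count else float("nan")

# Simulate 10,000 packets, ~50 microseconds one-way delay, 1% packet loss.
random.seed(1)
tx, rx = LDASide(), LDASide()
for pid in range(10_000):
    t_send = pid * 1e-4
    tx.record(pid, t_send)
    if random.random() > 0.01:  # lost packets never reach the receiver
        rx.record(pid, t_send + random.uniform(40e-6, 60e-6))

print(f"estimated mean delay: {estimate_delay(tx, rx) * 1e6:.1f} us")
```

Note how loss is handled: a lost packet makes its bank's counts disagree, so that whole bank is discarded and the estimate comes from the surviving banks – which is exactly why the estimate is statistical ("good") rather than exact.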

I started wondering whether this approach was likely to one day impact the business of those vendors pushing passive approaches to latency monitoring to the financial markets – companies like Correlix, Corvil, Trading Systems Associates and Endace (though the latter appears to have re-focused on other verticals). And while I think this research has merit, I don’t think LDA is ready for Wall Street. Because while it’s good, it’s not quite good enough.

For one thing, current passive approaches – and yes, they cost a bit – can analyse delays in the nanosecond range – tens to hundreds, depending on what’s being measured. Another factor, pointed out by TS-A’s Henry Young, is that LDA measures network delays only at the Data Link Layer (Layer 2), and so does not take into account reliable messaging protocols that fall into the Transport Layer (Layer 4), or latencies that occur higher in the ‘stack’ – which for trading applications is probably where most latency (and related jitter) occurs.

I can see LDA being useful for general-purpose router diagnostics, and Cisco Systems, which provided a grant to part-fund this research, might look to do that one day. But methinks they won’t be pushing this for more serious latency monitoring. For that, they’re more likely to recommend Corvil, in which they have an equity stake.
