
Beware Latency Monitoring on the Cheap

Interesting to read a couple of reports of late regarding a “breakthrough” in the world of network latency monitoring – apparently boffins at a couple of universities have come up with an inexpensive way to measure network delays at the tens-of-microseconds level – and they reckon that Wall Street is going to be very interested in what they’re up to. I guess I am always a bit wary of doing things on the cheap … you get what you pay for, as the saying goes.

The Lossy Difference Aggregator, or LDA, is the subject of work carried out at the University of California, San Diego and Purdue University in Indiana. It proposes adding functionality to network routers to provide a “good estimate” of delay by sampling the arrival and departure times of packets flowing through them. For you geeks, some more detail is here.
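For the really curious, here’s a minimal sketch of how I understand the LDA trick to work (my own toy Python, not the researchers’ code, and the traffic numbers are invented): both ends keep per-bucket sums of timestamps and packet counts, and any bucket touched by a lost packet is simply thrown away.

```python
import random

NUM_BUCKETS = 64

def bucket_of(packet_id: int) -> int:
    # Sender and receiver hash each packet identically to pick a bucket.
    return hash(packet_id) % NUM_BUCKETS

class LdaEndpoint:
    """One side of a Lossy Difference Aggregator: per-bucket
    (timestamp sum, packet count) accumulators."""
    def __init__(self):
        self.ts_sum = [0.0] * NUM_BUCKETS
        self.count = [0] * NUM_BUCKETS

    def record(self, packet_id: int, timestamp: float) -> None:
        b = bucket_of(packet_id)
        self.ts_sum[b] += timestamp
        self.count[b] += 1

def mean_delay(tx: LdaEndpoint, rx: LdaEndpoint) -> float:
    # A lost packet corrupts its bucket's sums, so only buckets whose
    # counts match on both sides are trusted ("lossy" aggregation).
    delay_sum, n = 0.0, 0
    for b in range(NUM_BUCKETS):
        if tx.count[b] == rx.count[b] and tx.count[b] > 0:
            delay_sum += rx.ts_sum[b] - tx.ts_sum[b]
            n += tx.count[b]
    return delay_sum / n if n else float("nan")

# Toy traffic: ~50 us one-way delay, 1% loss, and synchronised clocks
# at both ends (an assumption the LDA design itself relies on).
tx, rx = LdaEndpoint(), LdaEndpoint()
t = 0.0
for pkt in range(10_000):
    t += 100e-6                    # a packet departs every 100 us
    tx.record(pkt, t)
    if random.random() > 0.01:     # 1% of packets never arrive
        rx.record(pkt, t + random.gauss(50e-6, 5e-6))

print(f"estimated mean delay: {mean_delay(tx, rx) * 1e6:.1f} us")
```

The published design layers several banks with different packet-sampling rates on top of this to cope with higher loss, but the single-bank version above captures the core idea.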

I started wondering whether this approach was likely one day to impact the business of those vendors pushing passive approaches to latency monitoring to the financial markets – companies like Correlix, Corvil, Trading Systems Associates and Endace (though the latter appears to have refocused on other verticals). And while I think this research has merit, I don’t think LDA is ready for Wall Street. It’s good, but it’s not quite good enough.

For one thing, current passive approaches – and yes, they cost a bit – can analyse delays in the nanosecond range – tens to hundreds of nanoseconds, depending on what’s being measured. Another factor, pointed out by TS-A’s Henry Young, is that LDA measures network delays only at the Data Link Layer (Layer 2), and so does not take into account reliable messaging protocols that sit at the Transport Layer (Layer 4), or latencies that occur higher up the ‘stack’ – which for trading applications is probably where most latency (and related jitter) occurs.
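To make that concrete, here’s a back-of-envelope illustration (the figures are mine and purely illustrative): a packet that gets dropped and retransmitted by TCP looks perfectly healthy to a Layer 2 monitor, while the application sits waiting out the retransmission timeout.

```python
# Hypothetical numbers: why a Layer 2 view can understate what a
# trading application actually experiences.
wire_delay = 50e-6   # per-packet network delay an LDA-style probe would see
rto = 200e-3         # a typical minimum TCP retransmission timeout (~200 ms)

# The first copy of the packet is dropped; TCP resends after the RTO.
app_latency = rto + wire_delay

print(f"Layer 2 delay per delivered packet: {wire_delay * 1e6:>10.0f} us")
print(f"Latency the application observes  : {app_latency * 1e6:>10.0f} us")
# The retransmitted packet still shows ~50 us at Layer 2, but the
# application waited roughly 200,050 us for its data.
```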

I can see LDA being useful for some general-purpose router diagnostics, and Cisco Systems, which provided a grant to part-fund this research, might look to do that one day. But methinks they won’t be pushing this for more serious latency monitoring. For that, they’re more likely to recommend Corvil, in which they have an equity stake.
