About A-Team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Beware Latency Monitoring on the Cheap


Interesting to read a couple of recent reports about a “breakthrough” in the world of network latency monitoring. Apparently, boffins at a couple of universities have come up with an inexpensive way to measure network delays at the tens-of-microseconds level, and they reckon that Wall Street is going to be very interested in what they’re up to. I guess I am always a bit wary of doing things on the cheap … you get what you pay for, as the saying goes.

The Lossy Difference Aggregator, or LDA, comes out of work carried out at the University of California, San Diego and Purdue University in Indiana. It proposes adding functionality to network routers to provide a “good estimate” of delay by sampling the arrival and departure times of packets flowing through a router. For you geeks, some more detail is here.
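For the geeks who want a feel for the trick: each end of a link keeps just a handful of hashed “banks”, each holding a running timestamp sum and a packet count, so no per-packet state is needed; at estimation time, any bank whose counts don’t match on both sides has been spoiled by packet loss and is simply thrown away. Here’s my own back-of-envelope Python mock-up of that idea — an illustration, not the researchers’ code, and the bank count is an arbitrary assumption:

```python
# Rough sketch of the LDA idea (my own illustrative mock-up, not the
# researchers' implementation; NUM_BANKS is an assumed parameter).
NUM_BANKS = 8  # a lost packet only spoils the single bank it hashes to

def make_banks():
    # each bank accumulates [sum of timestamps, packet count]
    return [[0.0, 0] for _ in range(NUM_BANKS)]

def record(banks, packet_id, timestamp):
    # hash each packet to one bank; no per-packet state is kept
    b = hash(packet_id) % NUM_BANKS
    banks[b][0] += timestamp
    banks[b][1] += 1

def estimate_delay(send_banks, recv_banks):
    # only banks whose packet counts match on both sides (i.e. saw no
    # loss) are usable; the average delay over those banks is
    # (sum of receive times - sum of send times) / packet count
    delay_sum, count = 0.0, 0
    for (s_sum, s_cnt), (r_sum, r_cnt) in zip(send_banks, recv_banks):
        if s_cnt == r_cnt and s_cnt > 0:
            delay_sum += r_sum - s_sum
            count += s_cnt
    return delay_sum / count if count else None
```

The appeal is the tiny footprint: a few counters per router port instead of timestamping every packet, which is why it’s cheap — and also why it yields a “good estimate” rather than the per-packet precision the passive vendors sell.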

I started wondering whether this approach was likely to one day impact the business of those vendors pushing passive approaches to latency monitoring to the financial markets – companies like Correlix, Corvil, Trading Systems Associates and Endace (though the latter appears to have re-focused away to other verticals). And while I think this research has merit, I don’t think LDA is ready for Wall Street. Because while it’s good, it’s not quite good enough.

For one thing, current passive approaches – and yes, they cost a bit – can analyse delays in the nanosecond range – tens to hundreds, depending on what’s being measured. Another factor, pointed out by TS-A’s Henry Young, is that LDA measures network delays only at the Data Link Layer (Layer 2), and so does not take into account reliable messaging protocols that fall into the Transport Layer (Layer 4), or latencies that occur higher in the ‘stack’ – which for trading applications is probably where most latency (and related jitter) occurs.

I can see LDA being useful for some general-purpose router diagnostics, and Cisco Systems, which provided a grant to part-fund this research, might look to do that one day. But methinks they won’t be pushing this for more serious latency monitoring. For that, they’re more likely to recommend Corvil, in which they have an equity stake.
