The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

Latency Monitoring Revs Up to Nanoseconds

“The Value of a Millisecond” – the title of a widely quoted April 2008 white paper by my esteemed industry colleague Larry Tabb – is now as obsolete a discussion as is the Renault F1 car that graced its cover. In the world of low latency – just as in F1 – three years of innovation has reset the performance metrics that matter. Winners, and losers, and all that.

Last week’s announcement by Corvil that it is enabling its customer Nomura to report latency to nanosecond accuracy highlighted the increasing need to monitor at this level of granularity. As trading firms’ execution platforms and messaging middleware offerings begin operating in the single-digit microsecond range, being ahead of that in terms of measurement is becoming an imperative. As I say above, winners and losers, and all that.

To recap from last week: Nomura has deployed Corvil in its equities DMA operations – its NXT Direct system – to monitor trade performance and to validate latency. NXT Direct has been operating at below three microseconds of latency (so says Nomura, without getting at all granular on what that number actually means), and so measuring the latency of the individual components that make up the platform, including how that latency varies, is now seemingly a conversation conducted in nanoseconds.

As would be expected, Nomura’s platform is co-located with major US equity trading venues, and provides per-client, per-venue analysis, including the ability to view latencies for single orders, acknowledgments, fills and all other order types. All good stuff to know should a client complain about performance being a microsecond or two on the slow side.

For Nomura, Corvil has deployed the latest version of its CorvilNet offering, which has seen a software upgrade to increase the resolution of measurement and to optimise performance by leveraging Intel multi-core technology, as already deployed in Corvil’s hardware platforms.

But while the software within CorvilNet can measure down to a single nanosecond, the hardware timestamping on the currently installed network interface card has a resolution of 10 nanoseconds. Keeping up with the software will therefore require a future hardware upgrade to produce more accurate timestamps.

“There has been an insatiable drive by our customers from milliseconds to microseconds and now to nanoseconds,” says Donal O’Sullivan, Corvil’s VP of product management. “Now with our latest release, Corvil customers can detect if someone inserts a 10m cable instead of a 5m cable by looking at the latency reports.”

[I am told that a metre of cable equates to three to four nanoseconds, so I think that claim flies.]
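To sanity-check that cable claim, here is a back-of-envelope sketch (entirely my own illustration, not anything Corvil ships) that converts a shift in measured latency into an implied length of extra cable, using the 3-4 nanoseconds per metre rule of thumb quoted above. The function name and the sample latency figures are hypothetical.

```python
# Illustrative only: infer extra cable length from a latency increase,
# using the article's rule of thumb of roughly 3-4 ns of propagation
# delay per metre of cable (3.5 ns/m taken as the midpoint).

NS_PER_METRE = 3.5

def extra_cable_metres(baseline_ns: float, observed_ns: float) -> float:
    """Estimate the extra cable length implied by a latency increase."""
    return (observed_ns - baseline_ns) / NS_PER_METRE

# Swapping a 5 m cable for a 10 m one adds ~17.5 ns at 3.5 ns/m,
# which a nanosecond-resolution monitor can resolve:
print(extra_cable_metres(baseline_ns=2850.0, observed_ns=2867.5))  # → 5.0
```

At 17.5 nanoseconds, the difference is invisible to millisecond-class monitoring but stands out clearly once the measurement floor drops below 10 nanoseconds.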

The focus on increased granularity also comes from the more general deployment of the latest technologies, such as 10 gigabit Ethernet, InfiniBand and RDMA transports. These have been adopted by messaging middleware vendors such as IBM, Informatica/29West, NYSE Technologies, and now Tibco Software.

Recently published performance tests by IBM and Informatica show server-to-server latencies of single-digit microseconds (and Tibco’s imminent release of FTL is likely to compete on that level). Measurement of latency variances – so-called jitter – across such middleware is going to require nanosecond resolution to make sense.
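To see why jitter measurement needs that resolution, consider what coarse timestamps do to a set of microsecond-scale latency samples. The sketch below (my own illustration, with made-up sample values) computes jitter as the standard deviation of latencies, then repeats the calculation after quantising the same samples to a 1-microsecond clock tick:

```python
# Illustration of why timestamp resolution matters for jitter:
# quantising nanosecond-scale latencies to a coarse clock tick
# destroys the very variance you are trying to measure.
import statistics

latencies_ns = [4012, 4019, 4008, 4025, 4011, 4017]  # hypothetical samples

def jitter(samples):
    """Jitter as the population standard deviation of latency samples."""
    return statistics.pstdev(samples)

def quantise(samples, tick_ns):
    """Round each sample to the nearest clock tick of tick_ns."""
    return [round(s / tick_ns) * tick_ns for s in samples]

print(jitter(latencies_ns))                   # real jitter, a few ns
print(jitter(quantise(latencies_ns, 1000)))   # with 1 µs timestamps: 0.0
```

With microsecond timestamps every sample collapses to the same tick and the reported jitter is zero, even though the underlying spread is several nanoseconds.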

Coming soon, I would expect, will be latency measurement between applications running on the same server and communicating via shared memory. A number of middleware offerings already support this, with reported latencies in the few-hundred-nanosecond range – and likely only to fall with hardware advances.

TS-Associates’ Application Tap is an add-in server card that supports such intra-server latency measurement, albeit with some minor code changes, also at 10-nanosecond resolution. “The New Paradigm of Nanometrics” is the name of a report from that company on its approach. Nice try, but I still prefer Larry’s, outdated as it is.
