About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Latency Monitoring Revs Up to Nanoseconds

“The Value of a Millisecond” – the title of a widely quoted April 2008 white paper by my esteemed industry colleague Larry Tabb – is now as obsolete a discussion as the Renault F1 car that graced its cover. In the world of low latency – just as in F1 – three years of innovation have reset the performance metrics that matter. Winners, and losers, and all that.

Last week’s announcement by Corvil that it is enabling its customer Nomura to report latency to nanosecond accuracy highlighted the increasing need to monitor at this level of granularity. As trading firms’ execution platforms and messaging middleware offerings begin operating in the single-digit microsecond range, being ahead of that in terms of measurement is becoming an imperative. As I say above, winners and losers, and all that.

To recap from last week, Nomura has deployed Corvil in its equities DMA operations – its NXT Direct system – to monitor trade performance and to validate latency. NXT Direct has been operating at below three microseconds latency (so says Nomura, without getting at all granular on what that number actually means), and so latency measurement of individual components that comprise the platform, including how that latency varies, is now seemingly a matter of talk in nanoseconds.

As would be expected, Nomura’s platform is co-located with major US equity trading venues, and provides per-client, per-venue analysis, including the ability to view latencies for individual orders, acknowledgments, fills and all other order events. All good stuff to know should a client complain about performance being a microsecond or two on the slow side.

For Nomura, Corvil has deployed the latest version of its CorvilNet offering, which has seen a software upgrade to increase the resolution of measurement and to optimise performance by leveraging Intel multi-core technology, as already deployed in Corvil’s hardware platforms.

But while the software within CorvilNet can measure down to a single nanosecond, the hardware timestamping on the currently installed network interface cards has a resolution of 10 nanoseconds. Keeping up with the software will therefore require a hardware upgrade down the road to produce more accurate timestamps.
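That resolution gap matters in practice: events closer together than the timestamp grid can become indistinguishable. A minimal sketch of the effect – function names and values here are hypothetical, purely for illustration:

```python
def quantize_ns(event_ns: int, resolution_ns: int = 10) -> int:
    """Round an event time down to the hardware timestamp grid."""
    return (event_ns // resolution_ns) * resolution_ns

# Two events 4 ns apart can land on the same 10 ns tick, so their
# measured separation collapses to zero...
same_tick = quantize_ns(1007) - quantize_ns(1003)   # -> 0

# ...while the same 4 ns gap straddling a tick boundary reads as 10 ns.
straddle = quantize_ns(1012) - quantize_ns(1008)    # -> 10
```

In other words, with 10 ns timestamps a 4 ns interval can be reported as anywhere from 0 to 10 ns, which is exactly why the card hardware has to catch up with the software’s single-nanosecond resolution.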

“There has been an insatiable drive by our customers from milliseconds to microseconds and now to nanoseconds,” says Donal O’Sullivan, Corvil’s VP of product management. “Now with our latest release, Corvil customers can detect if someone inserts a 10m cable instead of a 5m cable by looking at the latency reports.”

[I am told that a metre of cable equates to three to four nanoseconds, so I think that claim flies.]
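The arithmetic behind that sanity check is simple: signals travel through cable at some fraction of the speed of light (the cable’s velocity factor). The 0.7 figure below is an assumption – real cables vary – but any plausible value puts the 5 m-versus-10 m difference well above a 10 ns timestamp resolution:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def cable_delay_ns(length_m: float, velocity_factor: float = 0.7) -> float:
    """One-way propagation delay through a cable, in nanoseconds.

    velocity_factor is an assumed typical value; actual cables
    range roughly from 0.6 to 0.8.
    """
    return length_m / (velocity_factor * C) * 1e9

# Swapping a 5 m cable for a 10 m one adds roughly 24 ns at vf=0.7 --
# comfortably detectable against 10 ns hardware timestamps.
extra = cable_delay_ns(10) - cable_delay_ns(5)
```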

The focus on increased granularity also comes from the more general deployment of the latest technologies, such as 10 gigabit Ethernet, InfiniBand and RDMA transports. These have been adopted by messaging middleware vendors such as IBM, Informatica/29West, NYSE Technologies, and now Tibco Software.

Recently published performance tests by IBM and Informatica show server-to-server latencies of single-digit microseconds (and Tibco’s imminent release of FTL is likely to compete on that level). Measurement of latency variance – so-called jitter – across such middleware is going to require nanosecond measurement to make sense.
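To see why, consider what a jitter figure actually is: a spread statistic over per-message latency samples. A minimal sketch (not Corvil’s method – the sample values below are hypothetical) shows that on a single-digit-microsecond path, the interesting variation lives in the tens of nanoseconds:

```python
import statistics

def jitter_stats_ns(latencies_ns):
    """Summarise per-message latency samples, all in nanoseconds."""
    return {
        "min": min(latencies_ns),
        "max": max(latencies_ns),
        "mean": statistics.mean(latencies_ns),
        "stdev": statistics.pstdev(latencies_ns),  # one common jitter measure
    }

# Hypothetical samples from a ~5 microsecond server-to-server hop.
samples = [5100, 5140, 5090, 5230, 5110]
stats = jitter_stats_ns(samples)
# The min-to-max spread here is 140 ns -- invisible to a tool that
# only resolves microseconds.
```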

Coming soon, I would expect, will be latency measurement between applications running on the same server, and communicating via shared memory. A number of middleware offerings already support this, and reported latencies are in the few hundred nanosecond range – only to reduce with hardware advances.

TS-Associates’ Application Tap is an add-in server card that supports such intra-server latency measurement, albeit with some minor code changes, also at 10 nanosecond resolution. “The New Paradigm of Nanometrics” is the name of a report from that company on its approach. Nice try, but I still prefer Larry’s title, outdated as it is.
