
Latency Monitoring Revs Up to Nanoseconds

“The Value of a Millisecond” – the title of a widely quoted April 2008 white paper by my esteemed industry colleague Larry Tabb – is now as obsolete a discussion as is the Renault F1 car that graced its cover. In the world of low latency – just as in F1 – three years of innovation has reset the performance metrics that matter. Winners, and losers, and all that.

Last week’s announcement by Corvil that it is enabling its customer Nomura to report latency to nanosecond accuracy highlighted the increasing need to monitor at this level of granularity. As trading firms’ execution platforms and messaging middleware offerings begin operating in the single-digit microsecond range, being ahead of that in terms of measurement is becoming an imperative. As I say above, winners and losers, and all that.

To recap from last week, Nomura has deployed Corvil in its equities DMA operations – its NXT Direct system – to monitor trade performance and to validate latency. NXT Direct has been operating at below three microseconds of latency (so says Nomura, without getting at all granular on what that number actually means), so measuring the latency of the individual components that make up the platform – including how that latency varies – is now, seemingly, a conversation conducted in nanoseconds.

As would be expected, Nomura’s platform is co-located with major US equity trading venues, and provides per-client, per-venue analysis, including the ability to view latencies for single orders, acknowledgements, fills and all other order types. All good stuff to know should a client complain about performance being a microsecond or two on the slow side.

For Nomura, Corvil has deployed the latest version of its CorvilNet offering, which has seen a software upgrade to increase the resolution of measurement and to optimise performance by leveraging Intel multi-core technology, as already deployed in Corvil’s hardware platforms.

But while the software within CorvilNet can measure down to a single nanosecond, the hardware timestamping resolution of the currently installed network interface card is 10 nanoseconds. So keeping up with the software will require a hardware upgrade in the future to produce more accurate timestamps.

“There has been an insatiable drive by our customers from milliseconds to microseconds and now to nanoseconds,” says Donal O’Sullivan, Corvil’s VP of product management. “Now with our latest release, Corvil customers can detect if someone inserts a 10m cable instead of a 5m cable by looking at the latency reports.”

[I am told that a metre of cable equates to three to four nanoseconds of delay, so I think that claim flies.]
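
A quick back-of-the-envelope check of that claim, using the three-to-four nanoseconds per metre figure quoted above and the 10 nanosecond hardware timestamping resolution mentioned earlier (the sketch below is mine, not Corvil’s):

```python
# Back-of-the-envelope check: does swapping a 5m cable for a 10m one show up
# at 10 nanosecond timestamping resolution? Uses the 3-4 ns/metre figure
# quoted above; actual propagation delay depends on the cable's velocity factor.
NS_PER_METRE = (3.0, 4.0)   # quoted propagation delay range
HW_RESOLUTION_NS = 10       # current NIC hardware timestamping resolution

for ns_per_m in NS_PER_METRE:
    extra_ns = (10 - 5) * ns_per_m          # added one-way delay from the longer cable
    detectable = extra_ns > HW_RESOLUTION_NS
    print(f"{ns_per_m} ns/m: extra delay {extra_ns:.0f} ns, "
          f"detectable at {HW_RESOLUTION_NS} ns resolution: {detectable}")
```

At 15 to 20 extra nanoseconds, the longer cable sits comfortably above the timestamping resolution.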

The focus on increased granularity also comes from the more general deployment of the latest technologies, such as 10 gigabit Ethernet, InfiniBand and RDMA transports. These have been adopted by messaging middleware vendors such as IBM, Informatica/29West, NYSE Technologies, and now Tibco Software.

Recently published performance tests by IBM and Informatica show server-to-server latencies of single-digit microseconds (and Tibco’s imminent release of FTL is likely to compete on that level). Measurement of latency variances – so-called jitter – across such middleware is going to require nanosecond measurement to make sense.
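
For illustration only – the figures below are invented, not vendor-published – here is a minimal sketch of how jitter might be summarised from nanosecond-resolution send/receive timestamp pairs. With hop latencies in the single-digit microseconds, the deviations of interest are a few hundred nanoseconds, which is exactly where coarser measurement stops making sense.

```python
# Minimal jitter summary from nanosecond send/receive timestamp pairs.
# The sample values are invented for illustration; real monitoring products
# capture timestamps in hardware and report far richer statistics.
import statistics

# (send_ns, receive_ns) pairs for one middleware hop, captured at two taps
timestamp_pairs = [
    (1_000_000, 1_004_120),
    (2_000_000, 2_004_480),
    (3_000_000, 3_003_950),
    (4_000_000, 4_005_210),
    (5_000_000, 5_004_300),
]

latencies_ns = [rx - tx for tx, rx in timestamp_pairs]
median_ns = statistics.median(latencies_ns)
jitter_ns = [abs(l - median_ns) for l in latencies_ns]   # deviation from the median

print(f"median latency : {median_ns:.0f} ns")
print(f"max jitter     : {max(jitter_ns):.0f} ns")
print(f"mean jitter    : {statistics.mean(jitter_ns):.0f} ns")
```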

Coming soon, I would expect, will be latency measurement between applications running on the same server and communicating via shared memory. A number of middleware offerings already support this, with reported latencies in the few-hundred-nanosecond range – a figure that will only come down as hardware advances.
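
As a sketch of how such intra-server measurement works in principle – the single-slot layout, busy-wait loops and sample count below are my own illustrative choices, not any vendor’s method – the following times a round trip through shared memory between two processes and halves it for a one-way estimate. An interpreted language will report microseconds rather than the few hundred nanoseconds achievable natively, but the technique is the same.

```python
# Illustrative shared-memory ping-pong latency measurement between two
# processes, using only the Python standard library (3.8+). One-way latency
# is estimated as half the measured round-trip time.
import struct
import time
from multiprocessing import Process
from multiprocessing.shared_memory import SharedMemory

SEQ = struct.Struct("<q")          # one 8-byte sequence number per direction
PING, PONG = 0, SEQ.size           # byte offsets of the two slots

def echo(name: str, n: int) -> None:
    """Echo each new sequence number from the ping slot into the pong slot."""
    shm = SharedMemory(name=name)
    for seq in range(1, n + 1):
        while SEQ.unpack_from(shm.buf, PING)[0] != seq:   # busy-wait for next ping
            pass
        SEQ.pack_into(shm.buf, PONG, seq)
    shm.close()

def measure(name: str, n: int) -> None:
    """Time round trips through shared memory; one-way latency is roughly half."""
    shm = SharedMemory(name=name)
    samples = []
    for seq in range(1, n + 1):
        t0 = time.perf_counter_ns()
        SEQ.pack_into(shm.buf, PING, seq)
        while SEQ.unpack_from(shm.buf, PONG)[0] != seq:   # busy-wait for the echo
            pass
        samples.append(time.perf_counter_ns() - t0)
    samples.sort()
    median_rtt = samples[len(samples) // 2]
    print(f"median round trip {median_rtt} ns, ~one-way {median_rtt // 2} ns")
    shm.close()

if __name__ == "__main__":
    n = 50_000
    shm = SharedMemory(create=True, size=2 * SEQ.size)
    shm.buf[:2 * SEQ.size] = bytes(2 * SEQ.size)          # zero both slots
    worker = Process(target=echo, args=(shm.name, n))
    worker.start()
    measure(shm.name, n)
    worker.join()
    shm.close()
    shm.unlink()
```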

TS-Associates’ Application Tap is an add-in server card that supports such intra-server latency measurement – albeit with some minor code changes – also down to 10 nanoseconds. “The New Paradigm of Nanometrics” is the name of a report from that company on its approach. Nice try, but I still prefer Larry’s title, outdated as it is.
