
When Is Lower Latency Worth The Effort?

Shaving response times by nanoseconds can produce value in high-frequency trading, but the cost, in resources and time, of achieving an improvement of that size can be too high when trading more complex types of securities, according to providers of low-latency services and market access platforms.

“High frequency traders are responding at a level of 200 nanoseconds,” says David Snowdon, chief technology officer and co-founder of Metamako, a Sydney-based low latency technology company. “If you want to get it down to 190, 193 or 195 nanoseconds, to get those last few nanoseconds out of the system, you have to measure very accurately what time events happen on your network, so you can then understand what your response time was.”
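(As an illustrative aside, not part of Snowdon's remarks: the measurement he describes boils down to differencing hardware-captured timestamps for each event pair. A minimal Python sketch, using made-up nanosecond values, might look like this.)

    # Illustrative sketch only: derive per-event response times from hypothetical
    # hardware-captured timestamps, expressed as integer nanoseconds since a
    # common epoch. The values below are invented for demonstration.
    captures = [
        (1_000_000_000, 1_000_000_205),  # (market data hits the wire, order leaves the NIC)
        (2_000_000_000, 2_000_000_198),
        (3_000_000_000, 3_000_000_211),
    ]

    # Response time for each event is simply egress minus ingress.
    response_times_ns = [egress - ingress for ingress, egress in captures]
    print(response_times_ns)  # [205, 198, 211]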

Firms also should look at variance in their response times around the 200 nanosecond level, according to Snowdon. “Being able to measure that variance is extremely important to exchanges, to guarantee that they’re providing fair access to the market,” he says.
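(Again purely as an illustration, with invented numbers: the variance Snowdon refers to can be summarised with simple jitter statistics over a set of measured response times, as in this short Python sketch.)

    import statistics

    # Hypothetical response-time samples in nanoseconds, clustered around 200 ns.
    samples_ns = [198, 201, 200, 205, 197, 199, 203, 202, 200, 196]

    mean_ns = statistics.mean(samples_ns)    # average response time
    stdev_ns = statistics.stdev(samples_ns)  # spread (jitter) around the mean
    worst_ns = max(samples_ns)               # worst-case sample observed

    print(f"mean={mean_ns:.1f} ns, stdev={stdev_ns:.2f} ns, worst={worst_ns} ns")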

While the frontiers of speed can still be pushed, as Snowdon suggests, a certain level of speed and a certain low level of latency have become a given in the industry, and one that need not be improved upon, according to Dan Hubscher, director of strategy at Object Trading, a direct market access platform provider.

“Speed is still important in that for anyone who has a strategy that depended on speed, they can’t get slower and still be profitable,” he says. “They still have to maintain that minimum level. The problem for most traders is that they’ve reached a commercial limit, where it doesn’t pay. It doesn’t return dividends to get it any faster.”

Furthermore, trying to lower latency when dealing with asset classes other than equities requires clearing additional hurdles, according to Hubscher. “Latency arbitrage on multiple exchanges doesn’t really exist in futures,” he says. “Trading a wider array of products across many more geographies — different types of derivatives and asset classes — pushed the game into one of scale, bringing in cost control.

“When you’re scaling up to different destinations, especially if you still need some degree of low latency, managing pre-trade risk, positions and exposures … is harder if you’re constantly adding new things that aren’t familiar.”

