About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

When Is Lower Latency Worth The Effort?


Shaving nanoseconds off response times can produce value in high-frequency trading, but the cost, in resources and time, of achieving that scale of latency improvement can be too high when trading more complex types of securities, according to low-latency services and market access platform providers.

“High-frequency traders are responding at a level of 200 nanoseconds,” says David Snowdon, chief technology officer and co-founder of Metamako, a Sydney-based low-latency technology company. “If you want to get it down to 190, 193 or 195 nanoseconds — to get those last few nanoseconds out of the system — you have to measure very accurately what time events happen on your network, so you can then understand what your response time was.”

Firms also should look at variance in their response times around the 200 nanosecond level, according to Snowdon. “Being able to measure that variance is extremely important to exchanges, to guarantee that they’re providing fair access to the market,” he says.
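Snowdon's two measurements, the response time itself and the variance around it, can be sketched in a few lines. The example below is purely illustrative, not vendor code: `response_stats` and its inputs are hypothetical, and it assumes you already have hardware-captured timestamps, in nanoseconds, for each market-data event arriving and the matching order leaving the network.

```python
import statistics

def response_stats(event_ns, response_ns):
    """Wire-to-wire latency statistics from paired nanosecond timestamps.

    event_ns: timestamp of each market-data event hitting the network
    response_ns: timestamp of the matching order leaving the network
    """
    latencies = [r - e for e, r in zip(event_ns, response_ns)]
    return {
        "mean": statistics.mean(latencies),      # typical response time
        "jitter": statistics.pstdev(latencies),  # variance around the mean
        "worst": max(latencies),                 # slowest observed response
    }

# Hypothetical captures: three events answered in ~200 ns each
stats = response_stats([0, 1_000, 2_000], [200, 1_201, 2_199])
```

A low mean with high jitter tells a different story than a low mean with tight jitter, which is why, as Snowdon notes, exchanges care about the spread as much as the headline number.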

While the frontiers of speed can still be pushed, as Snowdon notes, a certain baseline of speed and low latency has become a given in the industry, and one that many firms no longer need to improve upon, as Dan Hubscher, director of strategy at Object Trading, a direct market access platform, explains.

“Speed is still important in that for anyone who has a strategy that depended on speed, they can’t get slower and still be profitable,” he says. “They still have to maintain that minimum level. The problem for most traders is that they’ve reached a commercial limit, where it doesn’t pay. It doesn’t return dividends to get it any faster.”

Furthermore, trying to lower latency when dealing with asset classes other than equities requires clearing additional hurdles, according to Hubscher. “Latency arbitrage on multiple exchanges doesn’t really exist in futures,” he says. “Trading a wider array of products across many more geographies — different types of derivatives and asset classes — pushed the game into one of scale, bringing in cost control.

“When you’re scaling up to different destinations, especially if you still need some degree of low latency, managing pre-trade risk, positions and exposures … is harder if you’re constantly adding new things that aren’t familiar.”
