
When Is Lower Latency Worth The Effort?


Shaving response times by nanoseconds can produce value in high-frequency trading, but the cost, in resources and time, of achieving improvements of that size can be too high when trading more complex types of securities, according to providers of low-latency services and market access platforms.

“High frequency traders are responding at a level of 200 nanoseconds,” says David Snowdon, chief technology officer and co-founder of Metamako, a Sydney-based low latency technology company. “If you want to get it down to 190, 193 or 195 nanoseconds — get those last few nanoseconds out of the system, you have to measure very accurately what time events happen on your network, so you can then understand what your response time was.”

Firms also should look at variance in their response times around the 200 nanosecond level, according to Snowdon. “Being able to measure that variance is extremely important to exchanges, to guarantee that they’re providing fair access to the market,” he says.
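As a rough illustration of the kind of measurement Snowdon describes, the sketch below computes per-event response times and their spread from pairs of captured timestamps. It is a minimal example, not Metamako's tooling; the data layout, function name and nanosecond figures are assumptions made for illustration only.

```python
# Illustrative sketch only: the timestamp capture method and field layout are
# assumptions, not any vendor's actual tooling. Given pairs of hardware
# timestamps (market-data packet in, order packet out) in nanoseconds,
# compute each response time and the variance (jitter) around the mean.
from statistics import mean, pstdev

def response_stats(events):
    """events: iterable of (ingress_ns, egress_ns) timestamp pairs."""
    latencies = [egress - ingress for ingress, egress in events]
    return {
        "mean_ns": mean(latencies),
        "jitter_ns": pstdev(latencies),  # spread around the mean response time
        "worst_ns": max(latencies),
    }

# Example: responses clustered around the ~200 nanosecond level quoted above.
sample = [(0, 201), (1_000, 1_198), (2_000, 2_205), (3_000, 3_194)]
print(response_stats(sample))
```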

While the frontiers of speed can still be pushed back, as Snowdon notes, a certain level of speed and low latency has become a given in the industry, and one that does not always need to be improved upon, as Dan Hubscher, director of strategy at direct market access platform provider Object Trading, explains.

“Speed is still important in that for anyone who has a strategy that depended on speed, they can’t get slower and still be profitable,” he says. “They still have to maintain that minimum level. The problem for most traders is that they’ve reached a commercial limit, where it doesn’t pay. It doesn’t return dividends to get it any faster.”

Furthermore, trying to lower latency when dealing with asset classes other than equities requires clearing additional hurdles, according to Hubscher. “Latency arbitrage on multiple exchanges doesn’t really exist in futures,” he says. “Trading a wider array of products across many more geographies — different types of derivatives and asset classes — pushed the game into one of scale, bringing in cost control.

“When you’re scaling up to different destinations, especially if you still need some degree of low latency, managing pre-trade risk, positions and exposures … is harder if you’re constantly adding new things that aren’t familiar.”
