About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Q&A – From Yesterday’s London Low-Latency Summit


Industry experts in the morning panels at yesterday’s Low-Latency Summit in London responded to a number of questions posed by moderator Pete Harris of IntelligentTradingTechnology.com. Here is some of the wisdom shared …

Q: Trading firms want to be smarter, leveraging technology to be wiser with respect to their trading strategies.  Do you agree?  Where is the focus?

Frederic Ponzo, GreySpark Partners: The latency numbers are getting smaller, but the constants are still there.  And it’s across asset classes now.

Dan Solak, Thomson Reuters: People are looking for the best way, rather than simply the fastest way.  Today, the biggest requirement is normalisation: once you move beyond a single-vendor environment, you have to normalise those services.
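The normalisation Solak describes can be sketched as mapping each vendor's field names onto a common record. The schema and the per-vendor mappings below are hypothetical, purely for illustration; real feed handlers deal with far messier differences (encodings, symbology, timestamp semantics).

```python
from dataclasses import dataclass

@dataclass
class Tick:
    """A vendor-neutral quote record (hypothetical schema)."""
    symbol: str
    bid: float
    ask: float
    ts_ns: int

# Hypothetical per-vendor field mappings; real feeds differ far more widely.
FIELD_MAPS = {
    "vendor_a": {"symbol": "sym", "bid": "bid_px", "ask": "ask_px", "ts_ns": "recv_ns"},
    "vendor_b": {"symbol": "RIC", "bid": "BID", "ask": "ASK", "ts_ns": "TIMESTAMP"},
}

def normalise(vendor: str, raw: dict) -> Tick:
    """Translate one raw vendor message into the common Tick schema."""
    m = FIELD_MAPS[vendor]
    return Tick(
        symbol=str(raw[m["symbol"]]),
        bid=float(raw[m["bid"]]),
        ask=float(raw[m["ask"]]),
        ts_ns=int(raw[m["ts_ns"]]),
    )
```

Downstream strategy code then consumes only `Tick`, regardless of which vendor the message arrived from.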

Louis Lovas, OneMarketData: If you look at the extremes, Warren Buffett vs Getco and Knight, you have either end of the spectrum there with most people somewhere in between.

James Andrews, Informatica: Connectivity to and from the markets is less important than it was.  We are seeing push to other, ancillary areas; a push into back and middle office.  Also, firms are no longer seeking lowest latency at any cost.  Stability is way more important than raw speed.

Q: FX is a big, fluid market, but one that’s traditionally thought of as less latency sensitive.  But I see Redline has FPGA systems for FX.  So, why?

Ponzo: It’s true to say that the most effort in low latency reduction today is in the foreign exchange market.  It’s a simple market. Anyone can price an FX pair and do it quickly.  The other reason is that it’s highly liquid, which means spreads are tight.  If you’re out of the market, you’ll get picked off right away.

Lovas: If you can get access to the data, you can make faster trading decisions.

Andrews: We are seeing massive low latency growth in the FX area. There is spend on co-lo and on FPGAs.  They’re at the cutting edge.

Solak: Remember there is a correlation with equities.  Multi-venue, multi-country means multi-currency trading.

Q: What’s the impact of jitter on trading algos?

Benjamin Stephens, Nomura: We route institutional orders to exchanges.  With a passive order, if we have jitter, we may post an order to an exchange that’s not receiving orders as expected.  But if we’re talking about aggressive orders, there is more impact.  We may miss a fill.

We need systems to identify the root cause of slippage.  Monitoring our network has been the biggest step toward identifying problems.  The next step is to optimise our feed handlers.

David Quarrell, OnX: We need a holistic view of network and servers.  Only then can you understand what’s causing slippage.  So there’s a lot of behaviour to analyse.

Mohammad Darwish, AdvancedIO: Focus on network to application.  You need low latency and sustained high bandwidth.  You need to know when events are going to happen.

Henry Young, TS-Associates: As we see it, the impact of jitter is most prevalent among market makers in the U.S. markets.  They are shifting huge amounts of liquidity around and they need to hit as fast as possible.  To them, jitter equals risk.

Q: So what are firms doing to address jitter?  What more could be done?

Darwish: You want to minimise latency, and you want to sustain bandwidth.  The operating system in financial markets tends to be Linux or Windows, whereas in the military it’s VxWorks, which is a true real-time OS.

Quarrell: When reducing latency, don’t go to your IT department.  Build your stack from vendor offerings.  But remember you’re a bank, a trader.  Not a technologist.  So you need outside help.

In a FIX benchmarking test we looked at the distribution of 2 million messages.  Commercial engines generated a bell-curve distribution.  An open source engine’s distribution looked like a broken molar: very uneven.  That said, 300 microseconds of latency is not bad.
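The benchmarking approach Quarrell describes, looking at the shape of a latency distribution rather than a single average, can be sketched as below. The summary statistics and sample data are illustrative only; jitter is reported here as standard deviation, with tail behaviour captured by the 99th percentile.

```python
import statistics

def latency_stats(latencies_us):
    """Summarise a latency distribution (microseconds).

    A mean alone hides jitter; the spread (stddev) and the
    p99 tail distinguish a smooth bell curve from an uneven,
    multi-modal ("broken molar") distribution.
    """
    ordered = sorted(latencies_us)
    n = len(ordered)
    return {
        "mean": statistics.fmean(ordered),
        "p50": ordered[n // 2],
        "p99": ordered[min(n - 1, int(n * 0.99))],
        "jitter": statistics.pstdev(ordered),
    }

# Illustrative samples: a tight cluster near 300 us vs an uneven tail.
smooth = [300 + (i % 7) for i in range(2000)]
uneven = [300, 150, 900, 310, 2500, 305] * 300
print(latency_stats(smooth))
print(latency_stats(uneven))
```

Two engines with the same mean can report very different `jitter` and `p99` values, which is exactly the difference the benchmark exposed.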

Stephens: We’ve been running FPGA projects for three years.  We look at how to move as much processing as possible out of the critical path.  We use FPGA for risk checks in low latency, and we’re about to start processing FIX that way.  We’re moving away from Java for low-latency checks.

We are focusing on making the system scalable.  We can save $500K a year on one exchange connection by optimising capacity.  If we have a dedicated host, a dedicated rack in a co-lo facility, we can’t justify that for a single client on performance grounds.  Maybe for a group of clients we can share the infrastructure and justify it.  But we really want to scale our low-latency platform to all of Nomura’s trading globally.  That cuts down on the number of developers, ports, co-location facilities and exchange gateways.  As for our IT department, well, we upgraded our IT department!
