
A-Team Insight Blogs

Q&A – From Yesterday’s London Low-Latency Summit


Industry experts in the morning panels at yesterday’s Low-Latency Summit in London responded to a number of questions posed by moderator Pete Harris of IntelligentTradingTechnology.com. Here is some of the wisdom shared …

Q: Trading firms want to be smarter, leveraging technology to make wiser trading decisions. Do you agree? Where is the focus?

Frederic Ponzo, GreySpark Partners: The latency numbers are getting smaller, but the constants are still there.  And it’s across asset classes now.

Dan Solak, Thomson Reuters: People are looking for the best way, rather than simply the fastest way. Today, the biggest requirement is normalisation: once you get beyond a single-vendor environment, you have to normalise those services.
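Solak's normalisation point boils down to mapping each vendor's message format onto a single internal schema before any downstream logic sees it. A minimal sketch, assuming two hypothetical vendor formats (all field names, price scalings, and timestamp units below are illustrative, not any real vendor's API):

```python
from dataclasses import dataclass

@dataclass
class Tick:
    """Common internal quote schema, independent of any vendor."""
    symbol: str
    bid: float
    ask: float
    ts_ns: int  # event time, nanoseconds since epoch

def normalise_vendor_a(msg: dict) -> Tick:
    # Hypothetical vendor A: decimal prices, nanosecond timestamps.
    return Tick(msg["sym"], msg["bid"], msg["ask"], msg["ts"])

def normalise_vendor_b(msg: dict) -> Tick:
    # Hypothetical vendor B: integer prices scaled by 1e4,
    # millisecond timestamps - both converted to the common schema.
    return Tick(
        symbol=msg["ric"],
        bid=msg["bidPx"] / 1e4,
        ask=msg["askPx"] / 1e4,
        ts_ns=msg["time_ms"] * 1_000_000,
    )
```

Once every feed handler emits `Tick`, strategy code and analytics are written once rather than per vendor, which is the scalability benefit the panel is describing.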

Louis Lovas, OneMarketData: If you look at the extremes, Warren Buffett vs Getco and Knight, you have either end of the spectrum there with most people somewhere in between.

James Andrews, Informatica: Connectivity to and from the markets is less important than it was. We are seeing a push into other, ancillary areas, notably the back and middle office. Also, firms are no longer seeking the lowest latency at any cost. Stability is far more important than raw speed.

Q: FX is a big, fluid market, but one that’s traditionally thought of as less latency sensitive.  But I see Redline has FPGA systems for FX.  So, why?

Ponzo: It’s true to say that most of the effort in latency reduction today is in the foreign exchange market. It’s a simple market: anyone can price an FX pair and do it quickly. The other reason is that it’s highly liquid, which means spreads are tight. If you’re out of the market, you’ll get picked off right away.

Lovas: If you can get access to the data, you can make faster trading decisions.

Andrews: We are seeing massive low latency growth in the FX area. There is spend on co-lo and on FPGAs.  They’re at the cutting edge.

Solak: Remember there is a correlation with equities.  Multi-venue, multi-country means multi-currency trading.

Q: What’s the impact of jitter on trading algos?

Benjamin Stephens, Nomura: We route institutional orders to exchanges.  With a passive order, if we have jitter, we may post an order to an exchange that’s not receiving orders as expected.  But if we’re talking about aggressive orders, there is more impact.  We may miss a fill.

We need systems to identify the root cause of slippage. Monitoring our network has been the biggest step toward identifying problems. The next step is to optimise our feed handlers.

David Quarrell, OnX: We need a holistic view of network and servers.  Only then can you understand what’s causing slippage.  So there’s a lot of behaviour to analyse.

Mohammad Darwish, AdvancedIO: Focus on network to application.  You need low latency and sustained high bandwidth.  You need to know when events are going to happen.

Henry Young, TS-Associates: Impact of jitter as we see it is most prevalent among market-makers in the U.S. markets.  They are shifting huge amounts of liquidity around and they need to hit as fast as possible. To them, jitter equals risk.

Q: So what are firms doing to address jitter?  What more could be done?

Darwish: You want to minimise latency, and you want to sustain bandwidth. The operating system in financial markets tends to be Linux or Windows, whereas in the military it’s VxWorks, which is a true real-time OS.

Quarrell: When reducing latency, don’t go to your IT department. Build your stack from vendor offerings. But remember you’re a bank, a trader, not a technologist. So you need outside help.

In a FIX benchmarking test we looked at the latency distribution of two million messages. Commercial engines produced a bell-curve distribution. An open source engine’s distribution looked like a broken molar: very uneven. That said, 300 microseconds of latency is not bad.
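The distinction Quarrell draws, a tight bell curve versus a "broken molar", is about the shape of the latency distribution, not its mean. A minimal sketch of summarising such a distribution, where the spread (jitter) rather than the average is what the panellists flag as risk (function name and microsecond units are illustrative):

```python
import statistics

def latency_profile(latencies_us):
    """Summarise a latency distribution in microseconds.

    The mean tells you how fast you are; the percentile tail and
    standard deviation (jitter) tell you how predictable you are.
    """
    lat = sorted(latencies_us)
    n = len(lat)
    return {
        "mean": statistics.fmean(lat),
        "p50": lat[n // 2],
        "p99": lat[int(n * 0.99)],
        "jitter_stdev": statistics.stdev(lat),
    }
```

Two engines with the same mean can have very different p99 and jitter figures; on an aggressive order, it is the tail that costs you the fill.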

Stephens: We’ve been running FPGA projects for three years.  We look at how to move as much processing as possible out of the critical path.  We use FPGA for risk checks in low latency, and we’re about to start processing FIX that way.  We’re moving away from Java for low-latency checks.

We are focusing on making the system scalable. We can save $500K a year on one exchange connection by optimising capacity. If we have a dedicated host and a dedicated rack in a co-lo facility, we can’t justify that for a single client on performance alone. Maybe for a group of clients we can share the infrastructure and justify it. But we really want to scale our low-latency platform to all of Nomura’s trading globally. That cuts down on the number of developers, ports, co-location facilities and exchange gateways. As for our IT department, well, we upgraded our IT department!

