The knowledge platform for the financial technology industry

A-Team Insight Blogs

Q&A – From Yesterday’s London Low-Latency Summit


Industry experts in the morning panels at yesterday’s Low-Latency Summit in London responded to a number of questions posed by moderator Pete Harris of IntelligentTradingTechnology.com. Here is some of the wisdom shared …

Q: Trading firms want to be smarter, leveraging technology to be wiser with respect to their trading strategies.  Do you agree?  Where is the focus?

Frederic Ponzo, GreySpark Partners: The latency numbers are getting smaller, but the constants are still there.  And it’s across asset classes now.

Dan Solak, Thomson Reuters: People are looking for the best way, rather than simply the fastest way.  Today, the biggest requirement is normalisation: once you get beyond a single-vendor environment, you have to normalise those services.

Louis Lovas, OneMarketData: If you look at the extremes, Warren Buffett vs Getco and Knight, you have either end of the spectrum there with most people somewhere in between.

James Andrews, Informatica: Connectivity to and from the markets is less important than it was.  We are seeing a push into other, ancillary areas, including the back and middle office.  Also, firms are no longer seeking the lowest latency at any cost.  Stability is way more important than raw speed.

Q: FX is a big, fluid market, but one that’s traditionally thought of as less latency sensitive.  But I see Redline has FPGA systems for FX.  So, why?

Ponzo: It’s true to say that the most effort in latency reduction today is in the foreign exchange market.  It’s a simple market: anyone can price an FX pair and do it quickly.  The other reason is that it’s highly liquid, which means spreads are tight.  If you’re out of the market, you’ll get picked off right away.

Lovas: If you can get access to the data, you can make faster trading decisions.

Andrews: We are seeing massive low latency growth in the FX area. There is spend on co-lo and on FPGAs.  They’re at the cutting edge.

Solak: Remember there is a correlation with equities.  Multi-venue, multi-country means multi-currency trading.

Q: What’s the impact of jitter on trading algos?

Benjamin Stephens, Nomura: We route institutional orders to exchanges.  With a passive order, if we have jitter, we may post an order to an exchange that’s not receiving orders as expected.  But if we’re talking about aggressive orders, there is more impact.  We may miss a fill.

We need systems to identify the root cause of slippage.  Monitoring our network has been the biggest step toward identifying problems.  The next step is to optimise our feed handlers.
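The network monitoring Stephens describes comes down to watching the gaps between packet arrivals: a steady stream has near-constant gaps, while jitter shows up as spread in those gaps. A minimal sketch of that measurement (the timestamps and function names here are illustrative, not Nomura's actual tooling):

```python
import statistics

def jitter_stats(arrival_times_us):
    """Summarise jitter from a list of packet arrival timestamps
    (microseconds): compute the inter-arrival gaps, then their mean,
    spread (jitter), and worst stall."""
    gaps = [b - a for a, b in zip(arrival_times_us, arrival_times_us[1:])]
    return {
        "mean_gap_us": statistics.mean(gaps),
        "jitter_us": statistics.stdev(gaps),   # spread of the gaps
        "max_gap_us": max(gaps),               # worst single stall
    }

# Hypothetical capture: packets mostly 100us apart, with one 500us stall
stats = jitter_stats([0, 100, 200, 700, 800, 900])
```

A high `max_gap_us` relative to `mean_gap_us` is the kind of signal that would prompt a look at the feed handler or the network path for the root cause of slippage.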

David Quarrell, OnX: We need a holistic view of network and servers.  Only then can you understand what’s causing slippage.  So there’s a lot of behaviour to analyse.

Mohammad Darwish, AdvancedIO: Focus on network to application.  You need low latency and sustained high bandwidth.  You need to know when events are going to happen.

Henry Young, TS-Associates: The impact of jitter, as we see it, is most prevalent among market makers in the U.S. markets.  They are shifting huge amounts of liquidity around and they need to hit as fast as possible.  To them, jitter equals risk.

Q: So what are firms doing to address jitter?  What more could be done?

Darwish: You want to minimise latency, and you want to sustain bandwidth.  The operating system in financial markets tends to be Linux or Windows, whereas in the military it’s VxWorks, which is a true real-time O/S.

Quarrell: When reducing latency, don’t go to your IT department.  Build your stack from vendor offerings.  But remember, you’re a bank, a trader, not a technologist.  So you need outside help.

In a FIX benchmarking test, we looked at a distribution of two million messages.  Commercial engines generated a bell-curve distribution.  An open source distribution looked like a broken molar: very uneven.  That said, 300 microseconds of latency is not bad.
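Quarrell's point is that the shape of the latency distribution matters as much as its average: a tight bell curve and a "broken molar" can share the same mean while behaving very differently at the tail. A rough sketch of how such a benchmark might be summarised, using synthetic data (the 300-microsecond centre is taken from the quote; everything else is illustrative):

```python
import random
import statistics

def latency_profile(latencies_us):
    """Summarise a latency distribution with the tail percentiles
    that expose jitter a simple average would hide."""
    s = sorted(latencies_us)
    pct = lambda p: s[min(len(s) - 1, int(p / 100 * len(s)))]
    return {
        "p50": pct(50),
        "p99": pct(99),
        "p99.9": pct(99.9),
        "max": s[-1],
    }

# Synthetic sample: a bell curve centred on 300us, plus a small
# tail of multi-millisecond stalls (the "broken molar" lumps)
random.seed(7)
sample = [random.gauss(300, 20) for _ in range(100_000)]
sample += [random.uniform(1000, 5000) for _ in range(100)]
profile = latency_profile(sample)
```

On data like this the median sits near 300 microseconds while the maximum is an order of magnitude worse, which is exactly the gap a mean-only benchmark would conceal.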

Stephens: We’ve been running FPGA projects for three years.  We look at how to move as much processing as possible out of the critical path.  We use FPGA for risk checks in low latency, and we’re about to start processing FIX that way.  We’re moving away from Java for low-latency checks.
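The risk checks Stephens moves onto FPGA are pre-trade gates: simple limit tests that must run on every order before it leaves the box. A minimal Python sketch of that logic (the limit names and thresholds are hypothetical; in Stephens' setup these comparisons would be pipelined in hardware rather than run in software):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class RiskLimits:
    """Hypothetical per-client limits checked on the critical path."""
    max_order_qty: int
    max_notional: float
    restricted_symbols: frozenset

def pre_trade_check(symbol, qty, price, limits):
    """Reject an order before it reaches the exchange gateway.
    Returns (accepted, reason)."""
    if symbol in limits.restricted_symbols:
        return False, "restricted symbol"
    if qty > limits.max_order_qty:
        return False, "quantity limit"
    if qty * price > limits.max_notional:
        return False, "notional limit"
    return True, "accepted"

limits = RiskLimits(10_000, 1_000_000.0, frozenset({"XYZ"}))
ok, reason = pre_trade_check("ABC", 500, 101.5, limits)
```

Because each gate is a fixed comparison with no branching on external state, the whole check maps naturally onto FPGA logic, which is what keeps it off the software critical path.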

We are focusing on making the system scalable.  We can save $500K a year on one exchange connection by optimising capacity.  If we have a dedicated host and a dedicated rack in a co-lo facility, we can’t justify that for a single client on performance alone.  Maybe for a group of clients we can share the infrastructure and justify it.  But we really want to scale our low-latency platform to all of Nomura’s trading globally.  That cuts down on the number of developers, ports, co-location facilities, and exchange gateways.  As for our IT department, well, we upgraded our IT department!

