
A-Team Insight Blogs

Q&A: NovaSparks’ Yves Charles on Doing It All on FPGAs!


FPGAs are growing up and spreading their wings.  Only a couple of years ago, their use for market data handling was considered cutting edge, whereas now it’s almost mainstream.  But what about using them for more complex functionality? IntelligentTradingTechnology.com spoke to NovaSparks CEO Yves Charles on pushing the complexity, and the challenges involved.

Q: You just added Order Book building functionality to your market data handling infrastructure.  What does it do?

A: The FPGA Order Book building tracks all new, current, modified, canceled and executed orders to build a consolidated view of the open limit orders placed behind each symbol. In many cash equity markets, the job of tracking the messages and building the book is left up to the market data vendor. Market participants are often looking for a consolidated view of these orders showing the best N prices on both sides of the bid/ask spread with associated aggregated quantities. Clients can configure the depth of book that they want to track.

This is an added feature of the FPGA Feed Handlers and it is being rolled out across the major cash equity feeds in Europe and North America.
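For readers unfamiliar with book building, here is a minimal software sketch of the logic described above: apply order add and reduce (cancel/execute) messages, aggregate quantities per price level, and expose the best N levels on each side. NovaSparks does this in FPGA hardware; the class and method names below are purely illustrative, not their API.

```python
# Toy limit order book: track add/modify/cancel/execute messages and expose
# the best N price levels with aggregated quantities. Illustrative only.
from collections import defaultdict

class OrderBook:
    def __init__(self, depth=5):
        self.depth = depth                       # configurable depth of book
        self.orders = {}                         # order_id -> (side, price, qty)
        self.levels = {"bid": defaultdict(int),  # price -> aggregated quantity
                       "ask": defaultdict(int)}

    def add(self, order_id, side, price, qty):
        self.orders[order_id] = (side, price, qty)
        self.levels[side][price] += qty

    def reduce(self, order_id, qty):             # partial/full cancel or execution
        side, price, rem = self.orders[order_id]
        take = min(qty, rem)
        self.levels[side][price] -= take
        if self.levels[side][price] <= 0:
            del self.levels[side][price]
        if rem - take <= 0:
            del self.orders[order_id]
        else:
            self.orders[order_id] = (side, price, rem - take)

    def top(self):
        """Best N bids (highest first) and asks (lowest first)."""
        bids = sorted(self.levels["bid"].items(), reverse=True)[:self.depth]
        asks = sorted(self.levels["ask"].items())[:self.depth]
        return bids, asks

book = OrderBook(depth=2)
book.add(1, "bid", 100.1, 300)
book.add(2, "bid", 100.0, 500)
book.add(3, "ask", 100.2, 200)
book.reduce(1, 100)                              # partial execution
print(book.top())  # ([(100.1, 200), (100.0, 500)], [(100.2, 200)])
```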

Q: Like your market data feed handlers, the Order Book functionality runs on FPGA processors.  What kind of performance can be achieved?

A: All our products are designed with the FPGA Market Data Matrix architecture, which offers ultra-low and deterministic latency.  The line handling, parsing, normalisation and book building are done within one microsecond, 99% of the time.
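The "within one microsecond, 99% of the time" figure is a percentile statement. As a rough illustration of how such a number is derived from per-message timestamps, here is a sketch using made-up latency samples; the distribution and all names are assumptions, not measured NovaSparks data.

```python
# Sketch of reporting a latency figure like "under 1 microsecond, 99% of the
# time": take per-message in/out timestamp differences and compute the 99th
# percentile. The sample data below is invented for illustration.
import random

random.seed(0)
# Simulated per-message latencies in nanoseconds (placeholder distribution).
latencies_ns = [random.gauss(750, 80) for _ in range(100_000)]

def percentile(samples, pct):
    s = sorted(samples)
    idx = min(len(s) - 1, int(round(pct / 100 * (len(s) - 1))))
    return s[idx]

p50 = percentile(latencies_ns, 50)
p99 = percentile(latencies_ns, 99)
print(f"median: {p50:.0f} ns, 99th percentile: {p99:.0f} ns")
# A deterministic system is one where the 99th percentile stays close to the
# median even as the message rate rises.
```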

Q: One of the key performance benefits of using FPGAs is determinism.  Why is this so important?

A: Determinism is the ability to keep to the same latency regardless of the data rate, market bursts or number of downstream users.  This means market data vendors can begin to offer guaranteed service levels.  It also means the algo trader can depend on a steady reaction time to changing market signals.

With other systems, latency can spike during volatile markets, which is exactly when the algo trader wants the ultra-low latency.  What is so disturbing to the algo traders is that when a market burst happens, they have no idea what the real latency is: it may be 50 microseconds, or it could be more.  So they end up pulling out of the market for fear they will be picked off.

A pure FPGA architecture, which avoids all the system bottlenecks CPU systems have, can bring determinism.
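A toy queueing model makes the point concrete. In a CPU-based system, messages wait in a queue when they arrive faster than they can be serviced, so latency grows with the burst; a fully pipelined design passes every message through in a constant transit time. All numbers below are illustrative assumptions, not vendor figures.

```python
# Toy model of why queue-based (CPU) processing spikes under market bursts
# while a fixed-length hardware pipeline does not.

def cpu_latency(arrivals_ns, service_ns=500):
    """Single worker with a FIFO queue: latency = wait-in-queue + service."""
    free_at, out = 0, []
    for t in arrivals_ns:
        start = max(t, free_at)          # wait if the worker is busy
        free_at = start + service_ns
        out.append(free_at - t)          # queueing delay + service time
    return out

def fpga_latency(arrivals_ns, pipeline_ns=900):
    """Fully pipelined: one message per cycle, constant transit time."""
    return [pipeline_ns for _ in arrivals_ns]

# A burst: 50 messages arriving 100 ns apart, faster than the CPU can serve.
burst = [i * 100 for i in range(50)]
print("CPU  worst case:", max(cpu_latency(burst)), "ns")   # grows with burst
print("FPGA worst case:", max(fpga_latency(burst)), "ns")  # stays flat
```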

Q: This implementation seems to prove that FPGAs can be used for more complex processing than simple data feed handling.  What was your experience of developing this functionality?  Were there technical obstacles that you needed to overcome?

A: We do see this as a proof point of how FPGAs can be used for more complex and advanced functions than just simple parsing or decoding or networking functions.  This demonstrates to the market that FPGAs are viable platforms for a lot of different types of calculation functions in the trading cycle.

Without giving away any trade secrets, I can say that there are some very complex algorithms and mechanisms within the FPGAs that we have designed.  The core challenge is to do the entire process without having bottlenecks at any stage.

Regarding programming agility, in the last few years there haven’t been too many improvements in the use of compilers for FPGAs.  Compilers are very widely used in other applications where latency is not an issue, but when you are dealing with FPGAs and the driving point is latency, compilers reach their limits.

So we decided to build our own compilers rather than use anything off-the-shelf.  In our development team, most of the code that changes from one exchange feed-handler to another is written via our compilers.
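NovaSparks has not disclosed how its compilers work, but the general idea behind generating per-exchange feed-handler code is familiar: most of what differs between exchanges is the message layout, which can be described declaratively and compiled into parsing logic. The toy generator below emits a Python parser from a hypothetical field spec; their real compilers target FPGA logic, and every name here is an assumption for illustration.

```python
# Toy "feed-handler compiler": generate a message parser from a declarative
# field spec. NovaSparks' compilers emit FPGA logic; this emits Python.
import struct

# Hypothetical spec for one exchange's "add order" message layout.
ADD_ORDER_SPEC = [
    ("order_id", "Q"),   # u64
    ("side",     "c"),   # b"B" / b"S"
    ("price",    "I"),   # u32, price in ticks
    ("qty",      "I"),   # u32
]

def make_parser(spec):
    """Build a parser function from a field spec (the 'compiled' output)."""
    fmt = ">" + "".join(code for _, code in spec)   # big-endian wire format
    names = [name for name, _ in spec]
    size = struct.calcsize(fmt)
    def parse(buf):
        return dict(zip(names, struct.unpack(fmt, buf[:size])))
    return parse

parse_add = make_parser(ADD_ORDER_SPEC)
msg = struct.pack(">QcII", 42, b"B", 10010, 300)
print(parse_add(msg))  # {'order_id': 42, 'side': b'B', 'price': 10010, 'qty': 300}
```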

Q: What plans do you have to build further complex functionality to run on your FPGA matrix?

A: The FPGA Market Data Matrix is scalable by interconnecting FPGA boards and by interconnecting FPGA appliances.  This scalability enables us to tackle more features and more calculations, as well as to open up the matrix for clients’ direct use.

Q: Given the business landscape, do trading firms still need this cutting-edge latency?  Where is the market for your offerings in the future?

A: When it comes to deterministic latency, everyone is interested.  Plus the FPGA architecture offers smaller footprints, and lower power consumption.  As the features increase, the interest will be broader and broader.

When we speak about determinism, it’s not just of interest to high-frequency traders … every trader is interested.

We think it is only a matter of time before the world can be run on FPGAs!

