
A-Team Insight Blogs

Q&A: NovaSparks’ Yves Charles on Doing It All on FPGAs!


FPGAs are growing up and spreading their wings.  Only a couple of years ago, their use for market data handling was considered cutting edge, whereas now it’s almost mainstream.  But what about using them for more complex functionality? IntelligentTradingTechnology.com spoke to NovaSparks CEO Yves Charles on pushing the complexity, and the challenges involved.

Q: You just added Order Book building functionality to your market data handling infrastructure.  What does it do?

A: The FPGA Order Book building tracks all new, current, modified, canceled and executed orders to build a consolidated view of the open limit orders placed behind each symbol.  In many cash equity markets, the job of tracking the messages and building the book  is left up to the market data vendor.  Market participants are often looking for a consolidated view of these orders showing the best N prices on both sides of the bid/ask spread with associated aggregated quantities.   Clients can configure the depth of book that they want to track.
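In software terms, the book building described above can be sketched roughly as follows. This is a toy Python illustration with a hypothetical message interface (add/modify/cancel/execute and an order-ID keyed store are assumptions for the sketch); NovaSparks' actual implementation is pure FPGA hardware, not software.

```python
from collections import defaultdict

# Toy sketch of limit-order-book building: track open orders and publish
# the best N aggregated price levels per side. Message names are invented.
class OrderBook:
    def __init__(self, depth=5):
        self.depth = depth                     # N price levels published per side
        self.orders = {}                       # order_id -> (side, price, qty)
        self.levels = {"B": defaultdict(int),  # price -> aggregated open quantity
                       "S": defaultdict(int)}

    def add(self, order_id, side, price, qty):
        self.orders[order_id] = (side, price, qty)
        self.levels[side][price] += qty

    def modify(self, order_id, new_qty):
        side, price, qty = self.orders[order_id]
        self.levels[side][price] += new_qty - qty
        if self.levels[side][price] == 0:      # drop empty price levels
            del self.levels[side][price]
        self.orders[order_id] = (side, price, new_qty)

    def cancel(self, order_id):
        self.modify(order_id, 0)
        del self.orders[order_id]

    def execute(self, order_id, fill_qty):
        side, price, qty = self.orders[order_id]
        if fill_qty >= qty:
            self.cancel(order_id)              # fully filled
        else:
            self.modify(order_id, qty - fill_qty)

    def top_of_book(self):
        # Best N bids (highest price first) and asks (lowest price first),
        # each level as (price, aggregated quantity).
        bids = sorted(self.levels["B"].items(), reverse=True)[: self.depth]
        asks = sorted(self.levels["S"].items())[: self.depth]
        return bids, asks

book = OrderBook(depth=2)
book.add(1, "B", 100.0, 50)
book.add(2, "B", 100.0, 25)   # same price level: quantities aggregate
book.add(3, "S", 100.5, 40)
book.execute(1, 30)           # partial fill reduces the open quantity
bids, asks = book.top_of_book()
# bids -> [(100.0, 45)], asks -> [(100.5, 40)]
```

The sketch shows why the consolidated view is useful: consumers see aggregated quantity at each of the best N prices rather than the raw stream of individual order messages.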

This is an added feature of the FPGA Feed Handlers and it is being rolled out across the major cash equity feeds in Europe and North America.

Q: Like your market data feed handlers, the Order Book functionality runs on FPGA processors.  What kind of performance can be achieved?

A: All our products are designed with the FPGA Market Data Matrix architecture, which offers ultra-low and deterministic latency.  Line handling, parsing, normalisation and book building are done within one microsecond, 99% of the time.

Q: One of the key performance benefits of using FPGAs is determinism.  Why is this so important?

A: Determinism is the ability to keep to the same latency regardless of the data rate, market bursts or number of downstream users.  This means market data vendors can begin to offer guaranteed service levels.  It also means the algo trader can depend on a steady reaction time to changing market signals.

With other systems, latency can spike during volatile markets, which is exactly when the algo trader wants ultra-low latency.  What is so disturbing to algo traders is that when a market burst happens, they have no idea what the real latency is – it may be 50 microseconds – or it could be more.  So they end up pulling out of the market for fear they will be picked off.
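The jitter described here shows up in tail percentiles rather than averages. A toy Python illustration with made-up numbers (not measured data): two systems can share the same median latency yet differ enormously at the 99th percentile, and the tail is precisely what bites during a burst.

```python
# Made-up latency samples in microseconds -- illustrative only, not measurements.
# Both systems have the same median; only the tail reveals the jitter.
deterministic = [1.0] * 100
bursty = [1.0] * 90 + [5.0, 10.0, 20.0, 30.0, 40.0, 50.0, 50.0, 60.0, 70.0, 80.0]

def percentile(samples, p):
    """Nearest-rank percentile of a list of samples."""
    s = sorted(samples)
    return s[int(p / 100 * (len(s) - 1))]

p50_det, p99_det = percentile(deterministic, 50), percentile(deterministic, 99)
p50_bur, p99_bur = percentile(bursty, 50), percentile(bursty, 99)
# deterministic: p50 = 1.0, p99 = 1.0
# bursty:        p50 = 1.0, p99 = 70.0
```

A mean or median latency figure would rate both systems as equivalent; only the tail statistic captures the uncertainty that drives traders out of the market.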

A pure FPGA architecture, which avoids all the system bottlenecks CPU systems have, can bring determinism.

Q: This implementation seems to prove that FPGAs can be used for more complex processing than simple data feed handling.  What was your experience of developing this functionality?  Were there technical obstacles that you needed to overcome?

A: We do see this as a proof point of how FPGAs can be used for more complex and advanced functions than just simple parsing or decoding or networking functions.  This demonstrates to the market that FPGAs are viable platforms for a lot of different types of calculation functions in the trading cycle.

Without giving away any trade secrets, I can say that there are some very complex algorithms and mechanisms within the FPGAs that we have designed.  The core challenge is to do the entire process without having bottlenecks at any stage.

Regarding programming agility, in the last few years, there haven’t been too many improvements in the use of compilers for FPGAs.  Compilers are very widely used in other applications where latency is not an issue, but when you are dealing with FPGA and when the driving point is latency, compilers reach their limits.

So we decided to build our own compilers rather than use anything off-the-shelf.  In our development team, most of the code that changes from one exchange feed-handler to another is written via our compilers.

Q: What plans do you have to build further complex functionality to run on your FPGA matrix?

A: The FPGA Market Data Matrix is scalable both by interconnecting FPGA boards and by interconnecting FPGA appliances.  This scalability enables us to tackle more features and more calculations, as well as to open up the matrix for clients' direct use.

Q: Given the business landscape, do trading firms still need this cutting edge latency?  Where is the market for your offerings in the future?

A: When it comes to deterministic latency, everyone is interested.  Plus the FPGA architecture offers smaller footprints, and lower power consumption.  As the features increase, the interest will be broader and broader.

When we speak about determinism, it’s not just of interest to high frequency traders … every trader is interested.

We think it is only a matter of time before the world runs on FPGAs!

