
The Race to Zero – Three Rules for Winning

Deutsche Boerse Group, one of the world’s leading financial exchanges, recently developed a new ultra-low-latency trading infrastructure linking Frankfurt to five other key worldwide trading centres. The target for the Frankfurt-London link was 5 milliseconds (0.005 seconds).

Pushing the limits of how quickly computers can process instructions is the new battleground in finance. True zero latency may be physically impossible, but as technology continually shatters the benchmarks of the past, moving transaction speeds from milliseconds to microseconds (millionths of a second) and even to nanoseconds (billionths of a second), it is quickly closing the gap. Today, the industry average for a typical pre-trade risk check is about 125 microseconds (0.000125 seconds), and it is getting faster all the time.
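
To put those orders of magnitude in context, here is a minimal, hedged sketch that times a deliberately simplified pre-trade risk check, averaged over many iterations. The check, the limits and the numbers it produces are invented for illustration; real pre-trade checks and real latency measurement are far more involved.

```cpp
// Hedged sketch: timing a toy pre-trade risk check. The check and its limits
// are invented for illustration; production checks are far richer.
#include <chrono>
#include <cstdio>

struct Order {
    double price;
    long   quantity;
};

// Toy risk check: reject orders that breach simple size and notional limits.
static bool risk_check(const Order& o, long max_qty, double max_notional) {
    return o.quantity <= max_qty && o.price * o.quantity <= max_notional;
}

int main() {
    const Order order{101.25, 500};
    constexpr long iterations = 1'000'000;

    long passed = 0;
    const auto start = std::chrono::steady_clock::now();
    for (long i = 0; i < iterations; ++i)
        passed += risk_check(order, 1'000, 1'000'000.0) ? 1 : 0;
    const auto stop = std::chrono::steady_clock::now();

    const auto ns =
        std::chrono::duration_cast<std::chrono::nanoseconds>(stop - start).count();
    std::printf("%ld checks passed, average %.1f ns per check\n",
                passed, static_cast<double>(ns) / iterations);
    return 0;
}
```

On commodity hardware a trivial comparison like this costs nanoseconds; the 125-microsecond industry figure reflects the full path through a real trading system, not a single check in isolation.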

You don’t achieve such exponential performance breakthroughs by doing things the way you have always done them. And yet nobody in the financial services industry has the luxury of ripping and replacing mission-critical systems. Today’s environment requires a new set of rules.

Rule 1: Build on your existing systems to minimise business disruptions.

Compromising existing applications or structural components may be the fastest route to a big step up in performance, but that approach is a non-starter. The systems in use at today’s firms are well defined, secure and heavily relied upon. Companies need an additive approach: one that leverages their existing investments and prevents disruption to the business while making it easy to introduce new components that shave latency from transaction times.

Hardware and Software in Parallel. What companies can’t do is rely solely on off-the-shelf products or routine data centre upgrades to achieve their performance objectives. Faster hardware arranged in an efficient, modular architecture will surely be part of every firm’s ‘race to zero’ strategy, but hardware alone can’t do the job.

At its root, optimisation is a software problem, but even the most advanced, highly parallelised processing engine is limited by the hardware on which it is deployed. A system capable of processing 1 million messages per second may be running at only a fraction of its true potential if it isn’t deployed on multi-core servers with distributed management. The race to zero will be won by companies willing to harness the strengths of hardware and software simultaneously.
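
As a rough, hypothetical illustration of that point, the sketch below runs the same synthetic message-processing loop first on one thread and then on every available hardware thread and compares throughput. The per-message work, the message counts and the assumption that the workload parallelises cleanly are all invented for illustration; this is not a model of any particular engine.

```cpp
// Hedged sketch: the same synthetic message-processing workload run on one
// thread and then on every available hardware thread. The per-message work is
// an arbitrary stand-in (decode + check + route), invented for illustration.
#include <atomic>
#include <chrono>
#include <cstdio>
#include <thread>
#include <vector>

// Arbitrary stand-in for per-message work.
static unsigned long process(unsigned long msg) {
    unsigned long h = msg * 2654435761UL;
    return h ^ (h >> 13);
}

// Run `total` messages split across `threads` workers; return messages/second.
static double run(unsigned threads, unsigned long total) {
    std::atomic<unsigned long> sink{0};   // keeps the work from being optimised away
    std::vector<std::thread> pool;
    const auto start = std::chrono::steady_clock::now();
    for (unsigned t = 0; t < threads; ++t) {
        pool.emplace_back([&, threads] {
            unsigned long local = 0;
            for (unsigned long m = 0; m < total / threads; ++m)
                local += process(m);
            sink += local;
        });
    }
    for (auto& th : pool) th.join();
    const double secs =
        std::chrono::duration<double>(std::chrono::steady_clock::now() - start).count();
    return static_cast<double>(total) / secs;
}

int main() {
    const unsigned long messages = 20'000'000;
    unsigned cores = std::thread::hardware_concurrency();
    if (cores == 0) cores = 4;            // fallback if the core count is unknown
    std::printf("1 thread  : %.0f msg/s\n", run(1, messages));
    std::printf("%2u threads: %.0f msg/s\n", cores, run(cores, messages));
    return 0;
}
```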

Rule 2: Use accelerative hardware and software in strategic combination.

Finance’s new real-time requirements are fuelling a cottage industry: boutique technology vendors intensely focused on solving the ultra-low-latency problem through innovation. Not surprisingly, they are doing so precisely by crossing the hardware-software chasm.

Solace Systems and 29 West (recently acquired by Informatica) are among the leaders in this boutique industry. These companies have embedded high-performance, silicon-based optimisation engines in highly specialised hardware appliances designed to let the software operate with maximum speed and reliability. The result is a specialist infrastructure built to meet the specific high-performance requirements of capital markets firms. Despite being relatively new technology, such trading appliances can deliver astonishing raw processing performance, which ultimately extends the boundaries of achievable trading volumes.

These vendors are approaching the challenge much like a Formula One racing team. Naturally, the engine of an elite racing car must be designed for high performance, rapid acceleration and sustained high speeds. But those innovations cannot be fully realised if the chassis creates drag, or if the fuel grade or the tyres are inferior. When the right choices are made around the engine, it is freed from the hurdles that would otherwise constrain its performance.

Similarly, when hardware and software are considered in unison, the sky is the limit.

High performance is its own reward, but companies must be able to achieve their ultra-low-latency goals without causing any disruption to the business (see Rule 1). That’s why valuable market innovations such as trading appliances conform to today’s best practices for distributed platform architecture.

Rule 3: Adopt a next-generation platform architecture to harness complexity and minimise latency.

To achieve ultra-low latency without compromising existing systems, companies need a platform strategy that can deliver massive scale-out for data analytics and massive scale-up for extreme transactional environments. The Next Generation Information Platform (NGIP) is such a strategy. NGIP draws on innovations across the data-centric IT stack: faster processors with more cores, larger memory, faster interconnects and so on all play a role in allowing applications to scale better.

Macro technology trends, such as memory-centric computing and cloud computing, are another important facet of NGIP. By using the cloud, NGIP will satisfy a firm’s growing demands for event-driven processing at lightning speeds. It also enables the flexibility to support whatever application- or data-tier innovations the firm elects to invest in.

Leading trading firms rely heavily on news and event data streams while also tracking trade history and price fluctuations. As a result, the volume of complex event data that automated trading systems must analyse is massive. The key to managing this data volume is to perform resource-intensive computations in firms’ private cloud environments so that transactions and analytics occur quickly and efficiently. NGIP’s cloud-centric approach means firms can obtain the computing power they need quickly and painlessly.

Using the cloud doesn’t mean giving up control: with NGIP architecture, capital markets firms can invest in whichever technologies they choose to make their trading systems more efficient. NGIP fosters a close affinity between hardware and software without sacrificing the tremendous opportunity and cost-savings potential offered by the cloud.

Finally, NGIP adds value without upsetting existing IT functions. It works like a pipe inserted into an IT system, circumventing traditional bottlenecks and allowing high-volume message flows (more than 1 million messages per second is the new benchmark for fund managers) to move at near-zero latency without disrupting downstream legacy systems.
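
As a loose, generic illustration of that ‘pipe’ idea (not NGIP itself, and not any vendor’s API), the sketch below decouples a fast upstream message source from a slower downstream consumer with a small bounded, thread-safe buffer, so high-volume flow is absorbed without the downstream system being redesigned.

```cpp
// Hedged sketch of the "pipe" idea: a bounded, thread-safe buffer that lets a
// fast upstream feed run ahead of a slower downstream (legacy) consumer.
// This is a generic illustration, not NGIP or any vendor's product.
#include <condition_variable>
#include <cstddef>
#include <cstdio>
#include <mutex>
#include <queue>
#include <thread>

template <typename T>
class Pipe {
public:
    explicit Pipe(std::size_t capacity) : capacity_(capacity) {}

    void push(T value) {
        std::unique_lock<std::mutex> lock(mutex_);
        not_full_.wait(lock, [&] { return buffer_.size() < capacity_; });
        buffer_.push(std::move(value));
        not_empty_.notify_one();
    }

    T pop() {
        std::unique_lock<std::mutex> lock(mutex_);
        not_empty_.wait(lock, [&] { return !buffer_.empty(); });
        T value = std::move(buffer_.front());
        buffer_.pop();
        not_full_.notify_one();
        return value;
    }

private:
    std::queue<T>           buffer_;
    std::mutex              mutex_;
    std::condition_variable not_full_;
    std::condition_variable not_empty_;
    std::size_t             capacity_;
};

int main() {
    Pipe<int> pipe(1024);                 // buffer between fast and slow tiers

    std::thread producer([&] {            // fast upstream: market data / order flow
        for (int i = 1; i <= 100000; ++i) pipe.push(i);
        pipe.push(-1);                    // sentinel marking end of stream
    });

    std::thread consumer([&] {            // slower downstream legacy system
        long long checksum = 0;
        for (int v; (v = pipe.pop()) != -1; ) checksum += v;
        std::printf("downstream drained the pipe, checksum %lld\n", checksum);
    });

    producer.join();
    consumer.join();
    return 0;
}
```

In practice this role is played by hardware appliances and messaging middleware rather than an in-process queue, but the decoupling principle is the same.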

Toward Zero Latency

Getting to near-zero latency isn’t trivial, especially in light of today’s aggressive requirements for the collection, storage and analysis of complex data. But it isn’t beyond reach, either. A flexible, next-generation architecture that harnesses hardware and software in combination, and does so without disrupting the business, delivers on all three rules and can push you to the front of the pack in the race to zero.
