About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Is Low Latency the New Disaster Recovery?

“Everyone wants low latency … the trouble is no one wants to pay for it,” were words spoken recently by a senior executive of a major financial IT vendor. It was a private meeting, so I won’t name the individual or the company, but what was said resonated with me, because it echoed a view I have increasingly come to hold myself. For me, investment in low-latency technology has become similar to investment in disaster recovery – essential and important, but not a core business focus, or really very exciting.

There’s no doubt that the extreme focus on latency reduction – the “low latency arms race,” the “race to zero” and so on – is for the most part over, especially in exchange-based markets such as equities, options and futures in established regions. Essentially, trading firms have largely spent as much as they are going to on reducing latency, and any further spend needs a strong justification in terms of ROI.

Of course, there is still plenty of low-latency action to support high frequency trading (HFT) and similar strategies – though fewer firms are engaging in that activity these days. Wireless connectivity, co-location, over-clocked servers and FPGAs continue to be directed at these activities.

There’s also continued spend on latency reduction in other markets, such as foreign exchange, and the introduction of swap execution facilities and similar centralised trading hubs will drive further investment. Emerging markets are investing too, as they seek to become global players. Many IT vendors are shifting their sales focus to address these opportunities.

There is also investment in new technology that will reduce operational costs over time, and shift spend from capital to operating budgets. Managed services for connectivity, SaaS offerings for execution management, and power-efficient infrastructure and data centres are all in vogue as ways to “get costs out of the business.”

So money is being spent on reducing latency. It’s just that it’s money that is now being spent somewhat reluctantly, similar in mindset to spend on disaster recovery, on security or on regulatory compliance. It’s not the best sign for IT innovation, which is generally driven by the promise of new business opportunities, however inflated and tenuous.

Over the next few weeks and months, Low-Latency.com will adapt and transform to cover these new normalities, including the emergence of big data technologies in automated trading. So watch this space.

Comments are welcome. Happy end of summer everyone!
