The knowledge platform for the financial technology industry

A-Team Insight Blogs

Is Low Latency the New Disaster Recovery?


“Everyone wants low latency … the trouble is no one wants to pay for it,” said a senior executive of a major financial IT vendor recently. It was a private meeting, so I won’t name the individual or the company, but what was said resonated with me, because it echoed my own evolving view. For me, investment in low-latency technology has become similar to investment in disaster recovery – essential and important, but not a core business focus, or really very exciting.

There’s no doubt that the extreme focus on latency reduction – call it the “low latency arms race” or “the race to zero” – is for the most part over, especially in exchange-based markets like equities, options and futures in established jurisdictions. Essentially, trading firms have largely spent as much as they are going to on reducing latency, and any further spend needs a strong justification in terms of ROI.

Of course, there is still plenty of low-latency action to support high frequency trading (HFT) and similar strategies – though fewer firms are engaging in that activity these days. Wireless services, co-location, over-clocked servers and FPGAs continue to be directed at these activities.

There’s also continued spend on latency reduction in other markets, such as foreign exchange, and the introduction of swap execution facilities and similar centralised trading hubs will drive further investment. Emerging markets are also investing as they seek to become global players, and many IT vendors are shifting their sales focus to address these opportunities.

There is also investment in new technology that will reduce operational costs over time and move spend from capital to operating budgets. Managed services for connectivity, SaaS offerings for execution management, and power-efficient infrastructure and data centres are all in vogue as ways to “get costs out of the business.”

So money is still being spent on reducing latency. It’s just that it’s now being spent somewhat reluctantly, with a mindset similar to spend on disaster recovery, security or regulatory compliance. That’s not the best sign for IT innovation, which is generally driven by the promise of new business opportunities, however inflated and tenuous.

Over the next few weeks and months, Low-Latency.com will adapt and transform to cover these new normalities, including the emergence of big data technologies in automated trading. So watch this space.

Comments are welcome. Happy end of summer everyone!
