
A-Team Insight Blogs

Q&A: November’s Low Down on Latency with Pete Harris


The Low-Latency Summit in New York City the other week generated a lot of discussion, and a lot of questions directed at me. So here is a sampling from that busy day, along with my thoughts.

Q: One of the conference sessions was focused on latency reduction and ROI. What was your takeaway from that?

A: Fewer trading firms are engaged in the ‘low latency arms race’ – but there are still a good few in the race to zero. There is more understanding of the need for ROI, though measurement of it is patchy and not very scientific. What’s certain is that most firms are putting more thought into their latency reduction projects, and overall they are taking longer to make decisions.

Q: So are there new markets for low latency technology?

A: The FX markets are adopting it, even though the latencies are not as extreme as in the equities market. Fixed income markets look to be the next adopters. Plus there are new geographies to tap, so there’s still much demand.

Q: What are the new technology trends for low latency?

A: Embedding intelligence in the network looks like one. Arista’s 7124FX is one example, placing FPGA technology inside the network switch. Also, Pluribus Networks has married Intel Sandy Bridge chips to its switch … that’s what Tibco Software is using for its FTL Message Switch. I think we’ll see more of this in the future.

Q: So is the future FPGA or Intel?

A: Both. And don’t forget AMD with its Piledriver chips. I think there will continue to be debate and new developments both in the mainstream x86 world and in hardware acceleration. There is clearly momentum behind both approaches. Over time, we might see a natural order develop around the best approach for specific applications or functions.

Q: Big data was discussed a bit in one of the sessions. Is it really applicable to low latency?

A: Yes, but it’s early days. The leveraging of time series data during trade execution is emerging, as is event-driven trading based on news and social media inputs. But as was pointed out, financial services in general is not the leader in big data adoption. It might, though, be the industry that gets the most return from it.

Got a question for me to mull on over the holidays? Drop me a line at pete@low-latency.com.
