The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

Q&A: OneMarketData’s Lovas on the Low Latency Arms Race, TCA and Big Data


From time to time, it’s good to take the pulse of the low-latency world to get updated on the status quo, and find out what’s coming next. IntelligentTradingTechnology.com got such insight from Louis Lovas, director of solutions at OneMarketData.

Q: Compared to a year ago, where are trading firms focusing in their efforts to align latency reduction with their business?

A: A lot of effort continues to be devoted to latency reduction in the network pipes – the plumbing layer, so to speak. But there is an increased emphasis on looking further up the stack towards the application layer, where algorithms execute. The demands of algo trading are driving greater sophistication and complexity, and also revealing areas for improvement such as transaction cost controls. Firms are finding improvements in slippage that are a direct corollary of latency.

Q: Within a trading firm, given the many different projects calling for funding, what are the justifications for investment in reducing latency?

A: Latency reduction and its implied benefits do not always mean replacing or upgrading the network plumbing. Slimmer margins and the ever-increasing difficulty of finding alpha mean firms have to look in every corner for opportunity. Greater opportunity can lie elsewhere and come in many shapes and sizes, from increasingly complex algos to TCA. The ability to find opportunity in real time can come from a hunt through the past – specifically, historical data. It can reveal the biggest bang for your buck in alpha model analysis and slippage analysis.

Q: What is your company doing to help trading firms reduce latency in a way that benefits their business?

A: We are a vendor of trading infrastructure and data analysis solutions – tools focused on the application, or “algo”, layer of the trading “stack”. We offer firms high-performance tools to analyse markets both in real time and historically, in order to build more sophisticated algos and discover cost improvement opportunities.

Q: Do you think the ‘Low-Latency Arms Race’ is over? Or is it entering a new phase?

A: It will never be over; it will just morph into a new phase every few years. As in the past, and as will happen in the future, an external catalyst will create a new tipping point. That could be the invention of a new generation of hardware technology creating an order-of-magnitude performance change, or an external force such as a regulatory change that forces a shift of focus. The Market Access Rule banning naked access is one such action. It has caused brokerage firms to focus on providing the best “microsecond low-latency” access while still conforming to the SEC’s Rule 15c3-5, which mandates pre-trade risk checks. The rule ensures everyone pays a latency tax for checking credit limits and order constraints. Brokers must enforce it, and they have responded by engaging in a latency war – one narrowly defined, of course, but a war nonetheless.
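To make the “latency tax” concrete, a pre-trade risk gate of the kind Rule 15c3-5 mandates boils down to a handful of checks on every order before it reaches the market. The sketch below is purely illustrative – the function name, parameters and thresholds are invented, not any broker’s actual implementation:

```python
def pre_trade_check(order, credit_used, credit_limit, max_order_qty, price_band):
    """Illustrative 15c3-5-style pre-trade gate: credit and order constraints.

    order: dict with 'price' and 'qty'
    price_band: (low, high) tuple of acceptable prices
    Returns (accepted, reason).
    """
    notional = order["price"] * order["qty"]
    # Credit check: would this order push the client past its limit?
    if credit_used + notional > credit_limit:
        return False, "credit limit breached"
    # Order-size constraint: reject fat-finger quantities.
    if order["qty"] > max_order_qty:
        return False, "order size exceeds limit"
    # Price collar: reject prices outside a sane band.
    lo, hi = price_band
    if not (lo <= order["price"] <= hi):
        return False, "price outside band"
    return True, "accepted"

ok, reason = pre_trade_check(
    {"price": 10.0, "qty": 100},
    credit_used=0.0, credit_limit=10_000.0,
    max_order_qty=1_000, price_band=(9.0, 11.0),
)
```

Every one of these comparisons sits on the critical path of every order, which is exactly why brokers compete on how few microseconds the gate costs.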

Q: Where do you expect new business opportunities to come from over the next year?

A: There will be a continued focus on transaction cost analysis as firms increasingly become multi-asset. They will look to architect their own TCA solutions because no single broker will provide a complete view across their tradable markets. TCA may not appear so obviously latency-related, but slippage is all about latency. When you introduce the complexity of trading across multiple ECNs in multiple geographies for equities, futures and FX, price slippage is a constant reminder of the need to focus on latency.
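As a rough illustration of the slippage measurement at the heart of TCA (a generic sketch, not OneMarketData’s implementation), the common approach compares the arrival price against the volume-weighted average fill price, signed by side:

```python
def slippage_bps(arrival_price, fills, side):
    """Implementation shortfall in basis points versus the arrival price.

    fills: list of (price, quantity) executions
    side: 'buy' or 'sell'
    """
    total_qty = sum(q for _, q in fills)
    vwap = sum(p * q for p, q in fills) / total_qty
    # Buys suffer when they pay above arrival; sells when they receive below it.
    signed = (vwap - arrival_price) if side == "buy" else (arrival_price - vwap)
    return 10_000 * signed / arrival_price

# A buy arriving at 100.00 but filled at an average of 100.05 slips 5 bps.
print(slippage_bps(100.00, [(100.04, 300), (100.06, 300)], "buy"))  # 5.0
```

The latency connection is direct: the longer the gap between the arrival quote and the fills, the further the market can drift, and the larger this number tends to be.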

Q: What are some of the technology developments that your company is leveraging in its offerings?

A: The key areas that we focus on for low-latency:

a) Fast access to market data. Immediacy of pricing data is critical to strategy decision time; whether data comes from a single source or many, order books need to be consolidated and possibly currency-converted. Fast access improves the overall decision time for algos such as spread and pairs trading.

b) Analytical function library. Increasing algo sophistication implies more calculations and analysis – in real time. This need not be diametrically opposed to low latency; we continually focus on ensuring optimal performance of our analytics.
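To make point (a) concrete, consolidating quotes from multiple venues into a single best bid/offer, with currency conversion along the way, can be sketched as below. The class, function and FX rates are hypothetical illustrations, not OneTick’s API:

```python
from dataclasses import dataclass

@dataclass
class Quote:
    venue: str
    bid: float
    ask: float
    ccy: str

# Hypothetical FX rates for normalising quotes to USD.
FX_TO_USD = {"USD": 1.0, "EUR": 1.10}

def consolidated_bbo(quotes):
    """Best bid and offer across venues, normalised to USD."""
    best_bid, best_ask = None, None
    for q in quotes:
        rate = FX_TO_USD[q.ccy]
        bid, ask = q.bid * rate, q.ask * rate
        if best_bid is None or bid > best_bid[1]:
            best_bid = (q.venue, bid)   # highest bid wins
        if best_ask is None or ask < best_ask[1]:
            best_ask = (q.venue, ask)   # lowest ask wins
    return best_bid, best_ask

bbo = consolidated_bbo([
    Quote("NYSE", 99.98, 100.02, "USD"),
    Quote("XETRA", 90.95, 90.97, "EUR"),  # ~100.045 / ~100.067 in USD
])
```

A spread or pairs algo deciding against this consolidated view is only as fast as the consolidation step, which is why it sits on the latency-critical path.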

Q: Are there technology developments happening that you’re tracking for possible future use in your offerings?

A: Hadoop and MapReduce form an area we’re looking into and planning an R&D effort around.
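For readers unfamiliar with the model, MapReduce splits a computation into a map phase that emits key/value pairs, a shuffle that groups them by key, and a reduce phase that aggregates each group independently – which is what lets Hadoop parallelise it across a cluster. A toy pure-Python sketch (illustrative tick data, no Hadoop involved) of summing traded volume per symbol:

```python
from collections import defaultdict
from functools import reduce

ticks = [("AAPL", 100), ("MSFT", 50), ("AAPL", 200), ("MSFT", 25)]

# Map: emit a (key, value) pair for each input record.
mapped = [(symbol, volume) for symbol, volume in ticks]

# Shuffle: group emitted values by key.
groups = defaultdict(list)
for key, value in mapped:
    groups[key].append(value)

# Reduce: aggregate each key's values independently (the parallelisable step).
totals = {key: reduce(lambda a, b: a + b, values) for key, values in groups.items()}
print(totals)  # {'AAPL': 300, 'MSFT': 75}
```

The appeal for tick data is that the per-key reduce steps have no dependencies on each other, so per-symbol aggregations scale out naturally.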

Q: What is exciting you about the financial markets these days – and how do you fit in – right now?

A: The buzz around big data is gaining a lot of attention in finance. The OneTick product is well architected for consuming the fire hose of big data. Firms are no longer content to be single-asset, and they’re branching out to other types of data. OneTick provides the flexibility, capacity and analytical capability to take on big data’s challenge.
