The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

Q&A: OneMarketData’s Lovas on the Low Latency Arms Race, TCA and Big Data

From time to time, it's good to take the pulse of the low-latency world to get updated on the status quo, and find out what's coming next. We got such insight from Louis Lovas, director of solutions at OneMarketData.

Q: Compared to a year ago, where are trading firms focusing in their efforts to align latency reduction with their business?

A: A lot of effort continues to be devoted to latency reduction in the network pipes, the plumbing layer so to speak. But there is an increased emphasis on looking further up the stack towards the application layer where algorithms execute. The demands of algo trading are driving greater sophistication and complexity, and also revealing areas for improvement such as transaction cost controls. Firms are finding improvements in slippage that are a direct corollary of latency.

Q: Within a trading firm, given the many different projects calling for funding, what are the justifications for investment in reducing latency?

A: Latency reduction and its implied benefits do not always mean replacing or upgrading the network plumbing. Slimmer margins and the ever-increasing difficulty of finding alpha mean firms have to look in all corners for opportunity. Greater opportunity can lie elsewhere and come in many shapes and sizes, from increasingly complex algos to TCA. The ability to find it in real time can come from a hunt through the past, specifically historical data. That can reveal the biggest bang for your buck in alpha model analysis and slippage analysis.

Q: What is your company doing to help trading firms reduce latency in a way that benefits their business?

A: We are a vendor of trading infrastructure and data analysis solutions, with tools focused on the application "algo" layer of the trading "stack". We offer firms high-performance tools to analyse markets both in real time and historically, in order to build more sophisticated algos and discover cost-improvement opportunities.

Q: Do you think the 'Low-Latency Arms Race' is over? Or is it entering a new phase?

A: It will never be over; it will just morph into a new phase every few years. As has been the case in the past and will be in the future, an external catalyst will create a new tipping point. That could be the invention of a new generation of hardware technology creating an order-of-magnitude performance change, or an external force such as a regulatory change that forces a shift of focus. The Market Access Rule banning naked access is one such action. It has caused brokerage firms to focus on providing the best "microsecond low-latency" access while still conforming to the SEC's rule 15c3-5, which mandates pre-trade risk checks. The rule ensures everyone pays a latency tax for checking credit limits and order constraints. Brokers must enforce it, and they've responded by engaging in a latency war. One narrowly defined, of course, but a war nonetheless.
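The pre-trade checks the rule mandates, credit limits and order constraints, can be sketched in a few lines. This is a minimal illustration with made-up class and field names, not any broker's actual implementation:

```python
from dataclasses import dataclass

@dataclass
class Order:
    symbol: str
    side: str      # "buy" or "sell"
    qty: int
    price: float

class PreTradeRiskGate:
    """Toy pre-trade checks of the kind Rule 15c3-5 mandates:
    a per-order size cap and a running credit (notional) limit."""

    def __init__(self, max_order_qty: int, credit_limit: float):
        self.max_order_qty = max_order_qty
        self.credit_limit = credit_limit
        self.used_credit = 0.0

    def check(self, order: Order) -> bool:
        if order.qty > self.max_order_qty:
            return False                      # order-size constraint breached
        notional = order.qty * order.price
        if self.used_credit + notional > self.credit_limit:
            return False                      # credit limit would be exceeded
        self.used_credit += notional          # reserve credit for this order
        return True
```

Every order pays this check's cost before reaching the market, which is the "latency tax" the answer refers to; the competitive question is how few microseconds the check can take.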

Q: Where do you expect new business opportunities to come from over the next year?

A: There will be a continued focus on transaction cost analysis as firms increasingly become multi-asset; they will look to architect their own TCA solutions because no single broker will provide a complete view across their tradable markets.  TCA may not appear so obviously latency-related but slippage is all about latency.  When you introduce the complexity of trading across multiple ECNs to multiple geographies for equities, futures and FX, price slippage is a constant reminder of the need to focus on latency.
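The kind of slippage TCA measures can be sketched as arrival-price (implementation shortfall) slippage; the function name and basis-point convention here are illustrative, not a OneTick or broker API:

```python
def arrival_price_slippage_bps(side: str, arrival_price: float,
                               fills: list[tuple[float, int]]) -> float:
    """Slippage of the volume-weighted average fill price versus the
    arrival (decision) price, in basis points. fills is a list of
    (price, quantity) pairs."""
    total_qty = sum(q for _, q in fills)
    avg_fill = sum(p * q for p, q in fills) / total_qty
    # Buys slip when they fill above arrival; sells when they fill below.
    signed = avg_fill - arrival_price if side == "buy" else arrival_price - avg_fill
    return signed / arrival_price * 10_000
```

The longer an order is in flight, the further the market can move from the arrival price, which is why slippage measured this way is a direct proxy for latency.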

Q: What are some of the technology developments that your company is leveraging in its offerings?

A: The key areas that we focus on for low-latency:

a) Fast access to market data. Critical to strategy decision time is the immediacy of pricing data, whether from a single source or multiple sources; order books may need to be consolidated and possibly currency-converted. Fast access improves overall decision time for algos such as spread and pairs trading.

b) Analytical function library. Increasing algo sophistication implies more calculation and analysis, in real time. This should not be diametrically opposed to latency. We continually focus on this to ensure the optimal performance of our analytics.
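The consolidation described in point (a) can be sketched roughly as follows, assuming a hypothetical FX rate table and (bid, ask, currency) top-of-book tuples from each venue; real consolidation would of course work over full depth-of-book:

```python
# Assumed conversion rates for the sketch; a real system would use live FX quotes.
FX_TO_USD = {"USD": 1.0, "EUR": 1.10}

def consolidate_best_quote(quotes):
    """Consolidate top-of-book quotes from multiple venues into a single
    USD-denominated best bid/offer (highest bid, lowest ask).
    quotes is an iterable of (bid, ask, currency) tuples."""
    best_bid, best_ask = float("-inf"), float("inf")
    for bid, ask, ccy in quotes:
        rate = FX_TO_USD[ccy]
        best_bid = max(best_bid, bid * rate)
        best_ask = min(best_ask, ask * rate)
    return best_bid, best_ask
```

A spread or pairs strategy deciding against this consolidated view is only as fast as the slowest step in building it, which is why access speed here feeds directly into decision time.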

Q: Are there technology developments happening that you’re tracking for possible future use in your offerings?

A: Hadoop and MapReduce form an area we're looking into and planning an R&D effort around.
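The MapReduce model can be illustrated with a toy in-memory version, here counting ticks per symbol. This is a conceptual sketch of the map/shuffle/reduce phases, not Hadoop code:

```python
from itertools import groupby
from operator import itemgetter

def map_reduce(records, mapper, reducer):
    """Toy in-memory MapReduce: map each record to (key, value) pairs,
    shuffle (sort/group) by key, then reduce each key's values."""
    mapped = [kv for rec in records for kv in mapper(rec)]
    mapped.sort(key=itemgetter(0))                      # shuffle/sort phase
    return {key: reducer(key, [v for _, v in group])
            for key, group in groupby(mapped, key=itemgetter(0))}

# Example: count ticks per symbol in a stream of (symbol, price) records.
ticks = [("IBM", 100.0), ("MSFT", 30.0), ("IBM", 100.1)]
counts = map_reduce(ticks,
                    mapper=lambda rec: [(rec[0], 1)],
                    reducer=lambda key, values: sum(values))
```

In Hadoop the same three phases run distributed across a cluster, which is what makes the model attractive for the volumes of tick data discussed below.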

Q: What is exciting you about the financial markets these days, and how do you fit in right now?

A: Big data is generating a lot of buzz in finance. The OneTick product is well architected for consuming the fire hose of big data in finance. Firms are no longer content to be single-asset, and they're branching out to other types of data. OneTick provides the flexibility, capacity and analytical capability to take on big data's challenge.
