The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

Q&A: OneMarketData’s Louis Lovas on TCA, Latency and Big Data

Transaction Cost Analysis (TCA) is of increasing importance to trading success, and while it is very much a business imperative, implementing it and putting it to effective use relies on a combination of low-latency technologies and big data approaches. A-Team Insight got the TCA lowdown from Louis Lovas, director of solutions at OneMarketData.

Q: To start, can you briefly explain what Transaction Cost Analysis is, and why it’s increasingly of interest to trading firms?

A: Transaction Cost Analysis is the practice of monitoring and reporting on trade performance.  It is certainly not a new concept; buy-side firms have long leveraged broker TCA services to analyse their executions and to optimise equity order flow.

Yet with tighter spreads, thinner margins and a lower risk appetite, trading firms are seeking ways to find and preserve alpha.  Trading opportunities in new asset classes such as FX and futures, along with cross-asset models, are being designed and deployed.  As a result, managing and analysing trade costs is of paramount importance.

A goal of TCA is to achieve a seamless analysis of executions across all tradable markets and to provide recommendations that can be acted on in strategy behavior.  This requires profiling order flow in the traditional (historical) sense, measuring the performance of individual brokers and venues, and also monitoring strategies in real time.  Execution analysis is also a tool for liquidity analysis, highlighting value and toxicity across fragmented markets.

Q: What does a trading firm require in order to implement TCA?

A: TCA aims to accurately measure costs so that orders can be executed in a more efficient way.  It also profiles trader and broker behavior, looking at intra-day execution efficiencies and highlighting the impact of delays on strategy performance.  That means capturing implicit costs, or slippage, measured as implementation shortfall; opportunity costs such as crossing the spread; and, of course, explicit costs, commissions and transaction fees.
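The implementation-shortfall arithmetic described above can be sketched in a few lines. The figures, function name and per-share commission below are illustrative assumptions, not taken from any OneMarketData product:

```python
# Minimal sketch of implementation-shortfall arithmetic for a buy order.
# All prices, quantities and fee levels are invented for illustration.

def implementation_shortfall(decision_price, fills, commission_per_share):
    """Cost of an executed buy order versus its decision price.

    decision_price: mid price when the trading decision was made
    fills: list of (price, quantity) executions
    Returns total shortfall in currency units (positive = cost).
    """
    filled_qty = sum(q for _, q in fills)
    paid = sum(p * q for p, q in fills)
    slippage = paid - decision_price * filled_qty   # implicit cost
    explicit = commission_per_share * filled_qty    # explicit cost
    return slippage + explicit

# A buy decision at 100.00, filled in two slices at slightly worse prices:
cost = implementation_shortfall(100.00, [(100.02, 600), (100.05, 400)], 0.005)
print(round(cost, 2))  # 0.02*600 + 0.05*400 + 0.005*1000 = 37.0
```

The same decomposition extends naturally to opportunity cost on any unfilled remainder of the order.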

TCA requires a number of key technology components: an analytical framework that can provide value across time continuums (historic, intra-day and real-time) to profile market conditions and participant behavior.  That framework requires a broad array of analytical functions for analysing order flow, executions and market structure.

To support the analytical framework and seamlessly thread historic and real-time data for TCA analysis requires a high-performance processing engine such as Complex Event Processing (CEP).  CEP can semantically combine numerous data sources, markets and execution venues alike, as the bedrock for complex TCA computations, and can provide integration to trading systems for feedback into strategy logic.
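The core CEP pattern, merging several time-stamped streams into one ordered event flow and computing on it, can be shown with a toy example. The stream layouts and field names here are invented for illustration and are not any vendor's schema:

```python
# Toy sketch of the CEP idea: merge a quote stream and an execution
# stream into one time-ordered flow, and tag each fill with its
# slippage against the prevailing mid price at that moment.

import heapq

quotes = [  # (timestamp, "Q", symbol, bid, ask)
    (1, "Q", "ABC", 99.98, 100.02),
    (3, "Q", "ABC", 100.00, 100.04),
]
fills = [   # (timestamp, "F", symbol, price, qty)
    (2, "F", "ABC", 100.03, 500),
    (4, "F", "ABC", 100.05, 300),
]

last_mid = {}
tagged = []
# heapq.merge yields a single time-ordered stream from pre-sorted inputs
for ev in heapq.merge(quotes, fills):
    ts, kind, sym = ev[0], ev[1], ev[2]
    if kind == "Q":
        bid, ask = ev[3], ev[4]
        last_mid[sym] = (bid + ask) / 2
    else:
        price, qty = ev[3], ev[4]
        slip = price - last_mid[sym]  # cost versus prevailing mid
        tagged.append((ts, sym, qty, round(slip, 4)))

print(tagged)  # each fill tagged with its slippage vs the prevailing mid
```

A production engine adds windowing, persistence and recovery on top of this pattern, but the semantic join of market data and executions is the essence.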

For the capture and storage of the vast volume of data, a time series database is the final major component needed for a capable TCA infrastructure.

Lastly, there is the face of TCA: visual design tooling to create graphical representations of execution patterns, fills and liquidity, making the data intuitively obvious.

Q: And how does TCA impact – or have relevance to – low-latency technologies?

A: Achieving alpha in today’s competitive trading landscape can be a complex recipe.  It includes low-latency data delivery, fast execution technology and analysis tools applied against a fragmented market structure.  And determining exactly where to execute at any given moment has become a key driver for best execution.  This includes the generation of real-time alerts on market conditions, fed back into strategy logic so that strategies can be modified mid-day.

TCA can and should monitor order executions in real-time to provide advice to traders and strategy logic.  The continuous analysis of executing orders, their realised and unrealised P&L, can indicate in real-time the implicit opportunity costs.
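The kind of running number such a real-time monitor would emit can be sketched as below. The prices, quantities and the split into realised P&L and opportunity cost are illustrative assumptions:

```python
# Hedged sketch: realised P&L on the filled portion of a buy order,
# and implicit opportunity cost on the unfilled remainder as the
# market drifts away. All figures are invented for illustration.

def order_pnl(fills, order_qty, arrival_price, current_mid):
    """fills: list of (price, qty) buy executions so far."""
    filled = sum(q for _, q in fills)
    avg_px = sum(p * q for p, q in fills) / filled
    realised = (current_mid - avg_px) * filled            # filled shares
    unfilled = order_qty - filled
    opportunity = (current_mid - arrival_price) * unfilled  # unfilled shares
    return realised, opportunity

# 2,000 of 5,000 shares filled at 50.10; market has moved to 50.25:
realised, opportunity = order_pnl(
    fills=[(50.10, 2000)], order_qty=5000,
    arrival_price=50.00, current_mid=50.25)
print(realised, opportunity)
```

Recomputed on every tick and fill, these two numbers give a trader or strategy an immediate signal to speed up or slow down execution.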

Q: How do big data approaches support TCA?

A: Fire-hose volumes exist across all markets, a situation only exacerbated by the hunt for alpha stretching across multiple asset classes and geographies.  In finance there is a paramount need for reliability and accuracy; that is what distinguishes financial big data from all others.

Price accuracy for the individual constituents, and transparency across a multiplicity of markets, is a mandate, be it for a stock symbol, currency pair or futures contract, captured across the broad expanse of time from the past and present and projected into the future.

TCA as a post-trade exercise compares actual execution performance against a variety of benchmarks, such as VWAP.  The analysis can show historic exposure, prices and volatility against participation rates and other metrics.
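A minimal version of that benchmark comparison looks like this; the market prints and fills are made-up figures, and the basis-point convention is one common way to express the result:

```python
# Illustrative post-trade check of an order's average fill price
# against interval VWAP computed from market trades. Data is invented.

def vwap(trades):
    """trades: list of (price, volume) prints over the order's interval."""
    vol = sum(v for _, v in trades)
    return sum(p * v for p, v in trades) / vol

market = [(20.00, 10_000), (20.10, 30_000), (20.05, 10_000)]
bench = vwap(market)                               # interval VWAP: 20.07
avg_fill = vwap([(20.09, 4_000), (20.11, 1_000)])  # our buy fills: 20.094
slippage_bps = (avg_fill - bench) / bench * 1e4    # cost in basis points
print(round(bench, 4), round(slippage_bps, 1))
```

The same structure works for any interval benchmark (arrival price, close, participation-weighted price); only the reference calculation changes.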

The data management and storage technology is a key component enabling this analysis. It has to handle both the high volume of data across markets and its variety, including price content, orders and executions.

Q: How is TCA evolving as trading firms move into executing on multiple exchanges, and asset classes?

A: The elusive search for alpha is a hunt across markets and geographies, and a matter of controlling costs.  Understanding the implicit transaction costs of implementing an investment idea is essential, even for the simple act of buying or selling a security.  Increasingly sophisticated cross-border equity trading is no longer satisfied with leaving FX execution to custodians, so next-generation TCA must apply equally to asset classes beyond equities. Incorporating currency rates in real time will indicate whether the gain on an equity trade is offset by the cost of the currency leg.

Q: How does OneMarketData fit into the TCA challenge?

A: Transaction cost analysis, leveraged by buy-side trading desks, can improve their ability to capture alpha.  It is an effective competitive weapon for outpacing the market and improving profit margins.

OneMarketData provides an analysis and data management platform, OneTick, designed to equip the buy side and asset managers to achieve their profit goals.  OneTick combines an analytics engine, complex event processing and a time series (tick) database in a single technology platform.  It is capable of coalescing the differences between exchanges (symbologies, data formats), converting currencies, managing continuous futures contracts, injecting corporate actions and performing numerous other data management functions.

OneTick has over 100 built-in functions for a wide variety of analytics such as determining accurate price depth, market transparency and liquidity analysis within and across markets with nanosecond precision.  OneTick can capture and track orders and fills both in real-time and historically for profiling trader and broker behavior.  Intra-day execution patterns and market impact can be assessed for executing orders in a more efficient way.

The increased sophistication of trading algorithms crossing into multiple asset classes and spanning the globe has led the way for refinements in transaction cost analysis.  The buy side, asset managers and quantitative shops are turning to the same technology they use for trading to custom-build TCA solutions.  Those solutions give them a broad view into algo behavior and its costs, and the ability to measure and control them.
