About A-Team Marketing Services
The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

Q&A: OneMarketData’s Louis Lovas on TCA, Latency and Big Data


Transaction Cost Analysis (TCA) is of increasing importance to trading success and while it is very much a business imperative, implementing it and putting it to effective use relies on a combination of low-latency technologies and big data approaches. IntelligentTradingTechnology.com got the TCA low down from Louis Lovas, director of solutions at OneMarketData.

Q: To start, can you briefly explain what Transaction Cost Analysis is, and why it’s increasingly of interest to trading firms?

A: Transaction Cost Analysis is the practice of monitoring and reporting on trade performance. It is certainly not a new concept: buy-side firms have long leveraged broker TCA services to analyse their executions and optimise equity order flow.

Yet with tighter spreads, thinner margins and a lower risk appetite, trading firms are seeking ways to find and preserve alpha. Trading opportunities in new asset classes such as FX and futures, along with cross-asset models, are being designed and deployed. As such, managing and analysing trade costs is of paramount importance.

A goal of TCA is to achieve a seamless analysis of executions across all tradable markets and to provide recommendations that translate into actionable changes to strategy behaviour. This requires profiling order flow in the traditional (historical) sense, measuring the performance of individual brokers and venues, and also monitoring strategies in real time. Execution analysis is also a tool for liquidity analysis, highlighting value and toxicity across fragmented markets.

Q: What does a trading firm require in order to implement TCA?

A: TCA aims to measure costs accurately so that orders can be executed more efficiently. It also profiles trader and broker behaviour, looking at intra-day execution efficiencies and highlighting the impact of delays on strategy performance. That means capturing implicit costs, or slippage, measured as implementation shortfall; opportunity costs such as crossing the spread; and, of course, explicit costs: commissions and transaction fees.
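To make the cost components concrete, here is a minimal sketch of a per-order cost decomposition. All prices, fills and fee figures are hypothetical, and the function is illustrative only (it is not any vendor's API):

```python
# Minimal sketch of per-order cost decomposition (hypothetical data).
# Implementation shortfall is measured against the decision price;
# explicit costs are commissions and fees.

def transaction_costs(decision_price, fills, commission_per_share, fees):
    """fills: list of (price, quantity) for a buy order."""
    filled_qty = sum(q for _, q in fills)
    notional = sum(p * q for p, q in fills)
    avg_fill = notional / filled_qty
    # Implicit cost (slippage): average fill price vs. decision price.
    shortfall_per_share = avg_fill - decision_price
    explicit = commission_per_share * filled_qty + fees
    return {
        "avg_fill": avg_fill,
        "shortfall_per_share": shortfall_per_share,
        "implicit_cost": shortfall_per_share * filled_qty,
        "explicit_cost": explicit,
        "total_cost": shortfall_per_share * filled_qty + explicit,
    }

# A buy decided at 100.00, filled in two slices slightly above it.
costs = transaction_costs(
    decision_price=100.00,
    fills=[(100.02, 300), (100.05, 200)],
    commission_per_share=0.001,
    fees=0.50,
)
```

In a real system the decision price, fills and fee schedule would come from order-management and market data feeds; the decomposition itself stays this simple.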

TCA requires a number of key technology components: an analytical framework that can provide value across time continuums (historic, intra-day and real-time) to profile market conditions and participant behaviour. That framework requires a broad array of analytical functions for analysing order flow, executions and market structure.

Supporting the analytical framework, and seamlessly threading historic and real-time data for TCA, requires a high-performance processing engine such as Complex Event Processing (CEP). CEP can semantically combine numerous data sources, markets and execution venues, as the bedrock for complex TCA computations, and provides integration to trading systems for feedback into strategy logic.

For the capture and storage of the vast volume of data, a time series database is the final major component needed for a capable TCA infrastructure.

Lastly, there is the face of TCA: visual design tooling that creates graphical representations of execution patterns, fills and liquidity, making the data intuitively obvious.

Q: And how does TCA impact – or have relevance to – low-latency technologies?

A: Achieving alpha in today’s competitive trading landscape can be a complex recipe. It includes low-latency data delivery, fast execution technology and analysis tools against a fragmented market structure. And determining exactly where to execute at any given moment has become a key driver for best execution. This includes the generation of real-time alerts on market conditions, fed back into strategy logic so that strategies can be modified mid-day.

TCA can and should monitor order executions in real-time to provide advice to traders and strategy logic.  The continuous analysis of executing orders, their realised and unrealised P&L, can indicate in real-time the implicit opportunity costs.
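The continuous-monitoring idea can be sketched as follows. This is a hedged illustration with hypothetical prices and a made-up `OrderMonitor` class, not any production system: as fills stream in, it tracks slippage on the executed portion and the opportunity cost building up on the unfilled remainder as the market moves.

```python
# Illustrative real-time execution monitor (hypothetical, simplified):
# tracks realised slippage on fills and the unrealised opportunity
# cost of the unfilled remainder against the moving market price.

class OrderMonitor:
    def __init__(self, side, target_qty, arrival_price):
        self.side = side              # "buy" or "sell"
        self.target_qty = target_qty
        self.arrival = arrival_price  # price when the order was placed
        self.filled_qty = 0
        self.fill_notional = 0.0

    def on_fill(self, price, qty):
        self.filled_qty += qty
        self.fill_notional += price * qty

    def snapshot(self, market_price):
        sign = 1 if self.side == "buy" else -1
        avg = self.fill_notional / self.filled_qty if self.filled_qty else None
        # Slippage on what has executed so far, vs. the arrival price.
        realised = sign * (avg - self.arrival) * self.filled_qty if avg else 0.0
        # Opportunity cost if the remainder completed at the current price.
        residual = self.target_qty - self.filled_qty
        unrealised = sign * (market_price - self.arrival) * residual
        return {"realised_slippage": realised, "unrealised_cost": unrealised}

mon = OrderMonitor("buy", target_qty=1_000, arrival_price=50.00)
mon.on_fill(50.01, 400)
alert = mon.snapshot(market_price=50.05)  # market moving away from us
```

When `unrealised_cost` grows faster than the strategy's tolerance, that snapshot is exactly the kind of real-time advice the answer above describes feeding back to traders and strategy logic.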

Q: How do big data approaches support TCA?

A: Fire-hose volumes exist across all markets, and this is only exacerbated by the hunt for alpha stretching across multiple asset classes and geographies. In finance, there is a paramount need for reliability and accuracy; that is what distinguishes financial big data from all others.

Price accuracy of the individual constituents, and transparency across a multiplicity of markets, is a mandate, whether for a stock symbol, currency pair or futures contract, captured across the broad expanse of time: the past, the present and projections into the future.

TCA as a post-trade exercise compares actual execution performance against a variety of benchmarks, such as VWAP (volume-weighted average price). The analysis can show historic exposure, prices and volatility against participation rates and other metrics.
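The VWAP benchmark comparison is straightforward to express. The tick prints and fills below are hypothetical; the sketch only shows the mechanics of measuring an order's average execution price against the market's interval VWAP:

```python
# Illustrative post-trade benchmark check (hypothetical tick data):
# compare an order's average execution price to the interval VWAP.

def vwap(trades):
    """trades: list of (price, size) prints over the interval."""
    notional = sum(p * s for p, s in trades)
    volume = sum(s for _, s in trades)
    return notional / volume

market_trades = [(20.00, 1_000), (20.02, 500), (19.98, 1_500)]
benchmark = vwap(market_trades)

order_fills = [(20.01, 200), (20.00, 300)]
avg_exec = vwap(order_fills)      # same formula works for own fills

# Positive slippage on a buy means we paid above the market's VWAP.
slippage_bps = (avg_exec - benchmark) / benchmark * 10_000
```

A production version would pull the interval's prints from the tick store and condition on participation rate, but the benchmark arithmetic is the same.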

The data management and storage technology is a key component enabling this analysis. It has to handle both the high volume of data across markets and its variety, including price content, orders and executions.

Q: How is TCA evolving as trading firms move into executing on multiple exchanges, and asset classes?

A: The elusive search for alpha is a hunt across markets and geographies, while keeping costs under control. Understanding the implicit transaction costs of implementing an investment idea is essential, even for the simple act of buying or selling a security. Increasingly sophisticated cross-border equity trading is no longer satisfied with leaving FX execution to custodians. Next-generation TCA must apply equally to asset classes beyond equities. Incorporating currency rates in real time will indicate whether the gain on the equity trade is offset by the cost of the currency leg.
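The equity-versus-FX offset can be illustrated with a small calculation. All prices and rates here are hypothetical, and the function is a sketch of the idea rather than any vendor's methodology: express a foreign equity trade's cost in the home currency, netting equity slippage against the cost of the FX leg.

```python
# Hedged sketch (hypothetical values): home-currency cost of a
# cross-border buy, combining equity slippage with FX slippage.

def cross_border_cost(eq_decision_px, eq_avg_fill, qty,
                      fx_decision_rate, fx_executed_rate):
    """Buy of a foreign stock; rates are home currency per foreign unit."""
    # Equity slippage in the local (foreign) currency.
    eq_slippage_local = (eq_avg_fill - eq_decision_px) * qty
    notional_local = eq_avg_fill * qty
    # FX slippage: converting at a worse rate than at decision time.
    fx_slippage_home = (fx_executed_rate - fx_decision_rate) * notional_local
    # Total cost in the home currency.
    return eq_slippage_local * fx_executed_rate + fx_slippage_home

cost = cross_border_cost(
    eq_decision_px=100.0, eq_avg_fill=99.95, qty=1_000,  # favourable fill
    fx_decision_rate=1.1000, fx_executed_rate=1.1020,    # adverse FX move
)
# Here the favourable equity fill is more than offset by the FX leg,
# so the net cost is positive: exactly the effect real-time currency
# rates in TCA are meant to surface.
```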

Q: How does OneMarketData fit into the TCA challenge?

A: Transaction cost decision analysis, leveraged by buy-side trading desks, can improve their ability to capture alpha. It is an effective competitive weapon for outpacing the market and improving profit margins.

OneMarketData provides an analysis and data management platform, OneTick, designed to equip the buy side and asset managers to achieve their profit goals. OneTick combines an analytics engine, complex event processing and a time series (tick) database in a single technology platform. It is capable of coalescing the differences between exchanges (symbologies, data formats), converting currencies, managing continuous futures contracts, injecting corporate actions and performing numerous other data management functions.

OneTick has over 100 built-in functions for a wide variety of analytics, such as determining accurate price depth, market transparency and liquidity within and across markets with nanosecond precision. OneTick can capture and track orders and fills both in real time and historically to profile trader and broker behaviour. Intra-day execution patterns and market impact can be assessed to execute orders more efficiently.

The increasing sophistication of trading algorithms, crossing into multiple asset classes and spanning the globe, has led the way for refinements in transaction cost analysis. The buy side, asset managers and quantitative shops are turning to the same technology they use for trading to custom-build TCA solutions. Those solutions give them a broad view into algo behaviour and its costs, and the ability to measure and control them.

