About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Q&A: OneMarketData’s Louis Lovas on TCA, Latency and Big Data


Transaction Cost Analysis (TCA) is of increasing importance to trading success and while it is very much a business imperative, implementing it and putting it to effective use relies on a combination of low-latency technologies and big data approaches. IntelligentTradingTechnology.com got the TCA low down from Louis Lovas, director of solutions at OneMarketData.

Q: To start, can you briefly explain what Transaction Cost Analysis is, and why it’s increasingly of interest to trading firms?

A: Transaction Cost Analysis is the idea of monitoring and reporting on trade performance.  It is certainly not a new concept; buy-side firms have long leveraged broker TCA services to analyse their executions and to optimise equity order flow.

Yet with tighter spreads, thinner margins and a lower risk appetite, trading firms are seeking ways to find and preserve alpha.  Trading opportunities in new asset classes such as FX and futures, along with cross-asset models, are being designed and deployed.  As such, managing and analysing trade costs is of paramount importance.

A goal of TCA is to achieve a seamless analysis of executions across all tradable markets and to provide recommendations that strategies can act on.  This requires profiling order flow in the traditional (historical) sense, measuring the performance of individual brokers and venues, and also monitoring strategies in real time.  Execution analysis is also a tool for liquidity analysis, highlighting value and toxicity across fragmented markets.

Q: What does a trading firm require in order to implement TCA?

A: TCA aims to measure costs accurately so that orders can be executed more efficiently.  It also profiles trader and broker behavior, looking at intra-day execution efficiencies and highlighting the impact of delays on strategy performance. That means capturing implicit costs, or slippage, measured as implementation shortfall; opportunity costs, such as crossing the spread; and, of course, explicit costs, such as commissions and transaction fees.
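As a back-of-the-envelope illustration of the implicit-cost piece (a generic sketch, not OneMarketData's methodology; all names and figures here are hypothetical), implementation shortfall for a filled order can be computed as:

```python
# Hypothetical sketch: implementation shortfall for one parent order.
# Figures and function names are illustrative, not from any broker TCA service.

def implementation_shortfall(decision_price, fills, side):
    """Shortfall in basis points versus the price at the trading decision.

    fills: list of (price, quantity) tuples for the executed child orders.
    side: +1 for a buy (paying up is a cost), -1 for a sell.
    """
    filled_qty = sum(qty for _, qty in fills)
    avg_fill = sum(price * qty for price, qty in fills) / filled_qty
    # Positive result is a cost: bought above / sold below the decision price.
    return side * (avg_fill - decision_price) / decision_price * 10_000

# A buy decided at 100.00, filled in two clips at 100.05 and 100.10:
print(round(implementation_shortfall(100.00, [(100.05, 600), (100.10, 400)], +1), 2))
```

Explicit costs (commissions, fees) would simply be added on top in the same basis-point terms.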

TCA requires a number of key technology components.  The first is an analytical framework that can provide value across time horizons (historic, intra-day and real-time) to profile market conditions and participant behavior.  That framework requires a broad array of analytical functions for analysing order flow, executions and market structure.

Supporting the analytical framework and seamlessly threading historic and real-time data for TCA requires a high-performance processing engine such as Complex Event Processing (CEP).  CEP can semantically combine numerous data sources, markets and execution venues, as the bedrock for complex TCA computations, and provide integration to trading systems for feedback into strategy logic.

For the capture and storage of the vast volume of data, a time series database is the final major component needed for a capable TCA infrastructure.

Lastly, there is the face of TCA: visual design tooling that creates graphical representations of execution patterns, fills and liquidity, making the data intuitively obvious.

Q: And how does TCA impact – or have relevance to – low-latency technologies?

A: Achieving alpha in today’s competitive trading landscape can be a complex recipe.  It includes low-latency data delivery, fast execution technology and analysis tools against a fragmented market structure.  And determining exactly where to execute at any given moment has become a key driver for best execution.  This includes the generation of real-time alerts on market conditions fed back into strategy logic, so strategies can be modified intra-day.

TCA can and should monitor order executions in real-time to provide advice to traders and strategy logic.  The continuous analysis of executing orders, their realised and unrealised P&L, can indicate in real-time the implicit opportunity costs.
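A minimal sketch of that continuous mark, assuming simple (price, quantity) fill records (function and variable names are assumptions; a real system would consume live fills and quotes), might look like:

```python
# Illustrative real-time mark of a working order: realised P&L on the filled
# quantity plus the opportunity cost accruing on the unfilled balance.

def mark_working_order(arrival_price, fills, remaining_qty, mid_price, side):
    """Return (realised_pnl, opportunity_cost) for a working order.

    fills: (price, qty) executions so far; mid_price: current market mid.
    side: +1 buy, -1 sell. A positive opportunity cost means the market
    has moved away from the unfilled quantity since arrival.
    """
    filled_qty = sum(q for _, q in fills)
    avg_fill = sum(p * q for p, q in fills) / filled_qty if filled_qty else mid_price
    realised = side * (mid_price - avg_fill) * filled_qty
    opportunity = side * (mid_price - arrival_price) * remaining_qty
    return realised, opportunity

# Buy order arrived at 50.00; 300 shares filled at 50.02; mid is now 50.10.
pnl, opp_cost = mark_working_order(50.00, [(50.02, 300)], 700, 50.10, +1)
print(pnl, opp_cost)
```

Fed back into strategy logic, a growing opportunity cost on the unfilled balance is exactly the kind of real-time signal that can prompt a change in execution tactics.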

Q: How do big data approaches support TCA?

A: Fire-hose volumes exist across all markets.  This is only exacerbated by the hunt for alpha stretching across multiple asset classes and geographies.  In finance, there is a paramount need for reliability and accuracy; that is what distinguishes financial big data from all others.

Price accuracy of the individual constituents and transparency across a multiplicity of markets is a mandate, whether for a stock symbol, currency pair or futures contract, captured across the broad expanse of time: the past, the present and projections into the future.

TCA as a post-trade exercise compares actual execution performance against a variety of benchmarks, such as VWAP.  The analysis can show historic exposure, prices and volatility against participation rates and other metrics.
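As a toy example of a VWAP benchmark comparison (a sketch assuming simple (price, volume) tick data; real TCA would work from the full captured tape):

```python
# Toy VWAP benchmark: compare an average execution price against the
# volume-weighted average price of the market over the same interval.

def vwap(trades):
    """Volume-weighted average price over (price, volume) market trades."""
    total_volume = sum(v for _, v in trades)
    return sum(p * v for p, v in trades) / total_volume

def vwap_slippage_bps(exec_price, trades, side):
    """Execution versus interval VWAP in basis points; positive = underperformed."""
    benchmark = vwap(trades)
    return side * (exec_price - benchmark) / benchmark * 10_000

market_tape = [(20.00, 1_000), (20.10, 3_000), (20.05, 1_000)]
print(round(vwap(market_tape), 4))
print(round(vwap_slippage_bps(20.12, market_tape, +1), 2))
```

The same comparison can be run per broker or per venue, which is how the historical profiling described earlier turns into actionable rankings.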

The data management and storage technology is a key component that enables this analysis. The technology has to handle consuming the high volume of data across markets and the variety including price content, orders and executions.

Q: How is TCA evolving as trading firms move into executing on multiple exchanges, and asset classes?

A: The elusive search for alpha is a hunt across markets and geographies, and a matter of controlling costs.  Understanding the implicit transaction costs of implementing an investment idea is essential even for the simple act of buying or selling a security.  Increasingly sophisticated cross-border equity trading is no longer satisfied with leaving FX execution to custodians. Next-generation TCA must apply equally to other asset classes beyond equities. Incorporating currency rates in real time will indicate whether the gain on an equity trade is offset by the cost of currency.

Q: How does OneMarketData fit into the TCA challenge?

A: Transaction cost decision analysis leveraged by buy-side trading desks can improve their ability to capture alpha.  It is an effective competitive weapon for outpacing the market and improving profit margins.

OneMarketData provides an analysis and data management platform, OneTick, designed to equip the buy side and asset managers to achieve their profit goals.  OneTick combines an analytics engine, complex event processing and a time-series (tick) database in a single technology platform.  It is capable of coalescing the differences between exchanges (symbologies, data formats), converting currencies, managing continuous futures contracts, injecting corporate actions and performing numerous other data management functions.

OneTick has over 100 built-in functions for a wide variety of analytics such as determining accurate price depth, market transparency and liquidity analysis within and across markets with nanosecond precision.  OneTick can capture and track orders and fills both in real-time and historically for profiling trader and broker behavior.  Intra-day execution patterns and market impact can be assessed for executing orders in a more efficient way.

The increased sophistication of trading algorithms crossing into multiple asset classes and spanning the globe has led the way for refinements in transaction cost analysis.  The buy side, asset managers and quantitative shops are turning to the same technology they use for trading to custom-build TCA solutions.  Those solutions give them a broad view into algo behavior and its costs, and the ability to measure and control them.
