The knowledge platform for the financial technology industry

A-Team Insight Blogs

Time Stamps Key To Market Data Performance, TCA Provider Tells Webinar


A deterministic approach may not be the only one needed to get the highest possible performance from market data operations, especially connectivity, observed Louis Lovas, director of solutions at OneMarketData, speaking in a December 8 webinar sponsored by the transaction cost analysis provider and hosted by Intelligent Trading Technology and A-Team Group.

“If you consider from the consuming applications for data that data quality not only includes the idea of determinate behavior, but also elements that are more natural to the applications for firms that are trading across markets, that’s where you want the consistency in symbols and symbol continuity across market centers,” Lovas said.
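Lovas's point about symbol consistency across market centres can be made concrete with a small sketch. The venue codes and tickers below are invented for illustration, and the mapping table is a stand-in for whatever reference-data source a firm actually maintains; the idea is simply that every (venue, local symbol) pair resolves to one canonical identifier so a cross-market application sees continuity.

```python
# Hypothetical symbol-continuity sketch: map venue-specific local symbols
# to one canonical identifier. The mappings below are illustrative only.
CANONICAL = {
    ("XNAS", "AAPL"): "AAPL.US",
    ("XLON", "0R2V"): "AAPL.US",  # same instrument, different venue code
    ("XETR", "APC"):  "AAPL.US",
}

def canonical_symbol(venue: str, local_symbol: str) -> str:
    """Resolve a (venue, local symbol) pair to a canonical identifier."""
    try:
        return CANONICAL[(venue, local_symbol)]
    except KeyError:
        # Unknown mapping: surface the raw pair rather than guess silently.
        return f"{venue}:{local_symbol}"

print(canonical_symbol("XLON", "0R2V"))  # -> AAPL.US
```

In practice the mapping table would also have to be versioned over time, since ticker changes and corporate actions break symbol continuity in historical data as well as live feeds.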

Separating the parts of a transaction can affect how data consumers get information about that transaction, observed Mark Reece, director of professional services at MCO Europe, a financial data processing technology provider. “Depending on what you’re trying to do, you may impact the latency of your whole market data system,” he said. “Trades or orders might be impacted by bursts of arrivals of market data. Similarly, if you’re a high-frequency market maker who is quoting in futures and options, a single tick in the underlying may result in a firestorm of outgoing quote updates from you. So you need to be very careful about separating those parts.”
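Reece's "firestorm" point can be sketched in a few lines: when a market maker quotes a full options chain, one tick in the underlying fans out into one outgoing quote per strike. The pricing rule below is a toy placeholder, not a real options model, and the strikes are invented; the fan-out ratio is the only thing the sketch is meant to show.

```python
# Illustrative fan-out: one underlying tick triggers a requote of every
# strike in the chain. The "pricing" here is a toy rule, not a real model.
def requote_chain(underlying_px: float, strikes: list[float]) -> list[dict]:
    """Recompute a two-sided quote for each strike on an underlying tick."""
    quotes = []
    for k in strikes:
        mid = max(underlying_px - k, 0.0) + 1.0  # placeholder option value
        quotes.append({"strike": k, "bid": mid - 0.05, "ask": mid + 0.05})
    return quotes

chain = [90.0, 95.0, 100.0, 105.0, 110.0]
updates = requote_chain(100.10, chain)
print(len(updates))  # one inbound tick -> 5 outbound quote updates
```

Because inbound market data and outbound quoting compete for the same infrastructure, keeping those flows on separate paths is one way to stop a data burst from delaying the firm's own quote updates.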

Latency goals in relation to market data operations performance can vary, according to Ted Hruzd, senior infrastructure architect at RBC. “The goals of prop traders, market makers, high-frequency traders and arbitrageurs are much more latency sensitive,” he said. “Equities and futures traders are more apt to opt for ultra-low latency. FX is lagging behind bond and commodity trading, but there have been some recent advances in electronic trading for corporate bonds.”

To best support transaction cost analysis or market impact analysis, time-stamp precision and synchronization is “vital,” said Lovas, “particularly for cross-market price discovery, or consolidation or aggregation across markets.”
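Why synchronised, high-precision time stamps matter for cross-market consolidation can be shown with a minimal sketch: quote events from two venues are interleaved strictly by time stamp to track a consolidated best bid. The feeds, nanosecond stamps and prices below are invented; the point is that if the two venues' clocks were skewed, the merge order would be wrong and the consolidated tape would misstate which venue was best at each moment.

```python
# Minimal sketch: merge two time-sorted venue feeds by nanosecond time
# stamp to maintain a consolidated best bid. Feeds and prices are invented.
import heapq

feed_a = [(1_000_001_000, "A", 99.98), (1_000_003_500, "A", 100.01)]
feed_b = [(1_000_002_250, "B", 100.00), (1_000_004_000, "B", 99.99)]

best_bid = {}
consolidated = []
# heapq.merge interleaves the feeds by time stamp; clock skew between
# venues would reorder these events and corrupt the consolidated view.
for ts, venue, bid in heapq.merge(feed_a, feed_b):
    best_bid[venue] = bid
    consolidated.append((ts, max(best_bid.values())))

print(consolidated[-1])  # -> (1000004000, 100.01)
```

The same time-ordered merge underpins transaction cost analysis: a fill can only be benchmarked against the consolidated quote prevailing at its time stamp if both were captured against synchronised clocks.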

