About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Time Stamps Key To Market Data Performance, TCA Provider Tells Webinar


To get the highest performance from market data operations, especially connectivity, a deterministic approach may not be the only one firms should take, observed Louis Lovas, director of solutions at OneMarketData, speaking in a December 8 webinar sponsored by the transaction cost analysis provider and hosted by Intelligent Trading Technology and A-Team Group.

“If you consider from the consuming applications for data that data quality not only includes the idea of determinate behavior, but also elements that are more natural to the applications for firms that are trading across markets, that’s where you want the consistency in symbols and symbol continuity across market centers,” Lovas said.
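The symbol continuity Lovas describes can be sketched in code. The snippet below is a minimal, hypothetical illustration (the venue codes and mappings are invented, not taken from the webinar) of normalising venue-local tickers to one canonical symbol so a consolidated view stays consistent across market centers.

```python
# Hypothetical sketch: map venue-local tickers to a canonical symbol
# so consuming applications see one consistent identifier across
# market centers. Venue codes and mappings are illustrative only.

SYMBOL_MAP = {
    ("XLON", "VOD"): "VOD",    # London listing
    ("XETR", "VODI"): "VOD",   # same issuer, venue-local ticker
    ("XNAS", "AAPL"): "AAPL",
}

def canonical_symbol(venue: str, local_symbol: str) -> str:
    """Return the canonical symbol for a venue-local ticker, falling
    back to a venue-qualified name when no mapping is known."""
    return SYMBOL_MAP.get((venue, local_symbol), f"{venue}:{local_symbol}")
```

A consuming application that keys order books and trade records on the canonical symbol, rather than the venue-local one, can then aggregate quotes and trades across market centers without fragmenting the same instrument into several identities.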

Separating the parts of a transaction can affect how data consumers get information about that transaction, observed Mark Reece, director of professional services at MCO Europe, a financial data processing technology provider. “Depending on what you’re trying to do, you may impact the latency of your whole market data system,” he said. “Trades or orders might be impacted by bursts of arrivals of market data. Similarly, if you’re a high-frequency market maker in futures and options, a single tick in the underlying may result in a firestorm of outgoing quote updates from you. So you need to be very careful about separating those parts.”

Latency goals in relation to market data operations performance can vary, according to Ted Hruzd, senior infrastructure architect at RBC. “The goals of prop traders, market makers, high-frequency traders and arbitragers are much more latency sensitive,” he said. “Equities and futures traders are more apt to opt for ultra-low-latency. FX is lagging behind bond and commodity trading, but there have been some recent advances in electronic trading for corporate bonds.”

To best support transaction cost analysis or market impact analysis, time-stamp precision and synchronization is “vital,” said Lovas, “particularly for cross-market price discovery, or consolidation or aggregation across markets.”
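Lovas’s point about synchronization can be made concrete with a small sketch. The example below is hypothetical (the offsets and prices are invented): if two venues’ clocks are skewed by even a few hundred microseconds, naively ordering their ticks by raw time stamps can reverse which print actually occurred first, corrupting cross-market price discovery and any TCA built on it.

```python
# Illustrative sketch (not from the webinar): correcting a measured
# clock skew between two venues before ordering their ticks.
# Each tick is (timestamp_ns, venue, price).

def order_ticks(ticks, venue_b_offset_ns):
    """Sort ticks after subtracting venue B's assumed clock skew
    (e.g. as measured via PTP), expressed in nanoseconds."""
    corrected = [
        (t - (venue_b_offset_ns if venue == "B" else 0), venue, px)
        for (t, venue, px) in ticks
    ]
    return sorted(corrected)

ticks = [
    (1_000_000_000, "A", 100.01),  # venue A print
    (1_000_150_000, "B", 100.02),  # venue B print; B's clock runs fast
]
# Raw time stamps say A printed first; once B's 300,000 ns skew is
# removed, B's print turns out to have occurred earlier.
```

The design point is that the correction must happen before consolidation: once ticks from multiple venues are merged on uncorrected time stamps, the true sequence of prints is no longer recoverable downstream.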

