The knowledge platform for the financial technology industry

A-Team Insight Blogs

Time-stamping Needs To Be Better Than Regulators Require, Providers Say


Although Europe’s MiFID II regulation and the US Consolidated Audit Trail (CAT) require time-stamps accurate to within 100 microseconds of the recognised UTC standard, in practice firms and exchanges will need their reporting to be accurate to nanoseconds, or at most a few microseconds, according to time-stamping service providers.

“High-frequency traders and high-performance traders are transacting in substantially less than 1 microsecond. 100 microseconds is three orders of magnitude out on the level of accuracy of time-stamping,” says David Snowdon, chief technology officer of Metamako. “In a market like Nasdaq, if the response time is 80 microseconds, a trader could place an order, receive a response back, place another order and get another response back, and place a third order. On 100 microsecond timestamp accuracy all three of those could have the same time-stamp, totally legally. The 100 microsecond limit is nowhere near good enough to provide an idea of what order the events happened in the market.”
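Snowdon’s point can be illustrated with a minimal sketch (not from the article; the offsets and event times are invented). If two clocks each diverge from UTC by up to the permitted 100 microseconds, events separated by as much as 200 microseconds can legally carry identical or even reversed timestamps:

```python
# Hypothetical illustration: two clocks, both within the +/-100 us MiFID II
# bound, stamping events that are 80 us apart in true (UTC) time.

def stamp(true_time_us, clock_offset_us):
    """Timestamp an event with a clock that diverges from UTC by an offset."""
    return true_time_us + clock_offset_us

# Order placed, response received, next order placed: 80 us apart in UTC.
events = [0, 80, 160]

# Alternate events stamped by different clocks: trader's clock runs 90 us
# fast, the venue's clock 90 us slow -- both legally compliant.
offsets = [+90, -90, +90]

stamps = [stamp(t, o) for t, o in zip(events, offsets)]
print(stamps)  # [90, -10, 250]

# The response (-10) appears to precede the order that caused it (90),
# even though every clock stays inside the regulatory tolerance.
assert stamps[1] < stamps[0]
```

The sketch shows why 100 microsecond accuracy cannot reconstruct event ordering in a market where round trips complete in tens of microseconds.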

The time synchronisation protocols used in the financial industry, such as Precision Time Protocol (PTP) and Network Time Protocol (NTP), can achieve accuracy, and therefore time-stamping, at the nanosecond level, according to Snowdon. This is achievable even on transmissions between points as far apart as New York and London, he adds.

Accurate time-stamping is important to operations as well as compliance, says Heiko Gerstung, managing director at Meinberg, a German company that makes electronic clocks capable of nanosecond-level accuracy for use across industries. “Customers want to be able to correlate stamps from different systems with each other to get information,” he says. “The easiest is measuring the time it takes from one transaction being generated to being received by another system, and then to be forwarded to the exchange.”
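The correlation Gerstung describes can be sketched as follows. Assuming all systems share a common, accurate clock, subtracting the timestamps recorded at each hop gives the per-hop latency (the hop names and nanosecond values below are invented for illustration):

```python
# Hypothetical pipeline of time-stamped events, all in nanoseconds on a
# shared clock: order generation through to exchange acknowledgement.
hops = [
    ("order_generated", 1_000_000_000),
    ("risk_check_done", 1_000_004_200),
    ("gateway_out",     1_000_006_500),
    ("exchange_ack",    1_000_091_000),
]

# Per-hop latency is simply the difference between consecutive timestamps;
# this only works if every system's clock is tightly synchronised.
for (a, t_a), (b, t_b) in zip(hops, hops[1:]):
    print(f"{a} -> {b}: {t_b - t_a} ns")
```

The same subtraction applied across machines is only meaningful when the clocks involved are synchronised to far better precision than the latencies being measured.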

Meinberg’s capabilities include detecting which parts of a network are slower than others, allowing users to replace or improve those parts and so reduce the overall latency of their solution. As Snowdon notes, delays introduced in fibre cables can make time synchronisation challenging. Regarding MiFID II’s 100 microsecond standard, Gerstung adds, “You have to make sure the clocks are well below 100 microseconds of divergence from UTC. If this is not the case, you are not complying with the regulation, in our opinion.”
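Gerstung’s “well below” point implies monitoring clock offset against an internal threshold tighter than the regulatory bound. A minimal sketch, with an assumed 20 microsecond internal alert margin (the margin is an invented example, not a Meinberg or ESMA figure):

```python
# Hypothetical compliance monitor: the measured offset from UTC must stay
# under the 100 us MiFID II bound, and an internal alert fires well before
# that bound is reached.

MIFID_BOUND_NS = 100_000  # 100 microseconds, in nanoseconds
ALERT_NS = 20_000         # assumed internal margin: alert at 20 us

def compliant(offset_ns: int) -> bool:
    """True while the absolute offset from UTC is inside the MiFID II bound."""
    return abs(offset_ns) < MIFID_BOUND_NS

def needs_attention(offset_ns: int) -> bool:
    """True once the offset crosses the internal alert margin."""
    return abs(offset_ns) >= ALERT_NS

print(compliant(15_000), needs_attention(15_000))   # True False
print(compliant(95_000), needs_attention(95_000))   # True True
```

A firm that only alerts at the regulatory limit itself has no headroom to correct drift before falling out of compliance, which is why the alert threshold sits well inside the bound.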

