The knowledge platform for the financial technology industry

A-Team Insight Blogs

Time-stamping Needs To Be Better Than Regulators Require, Providers Say


Although the time-stamping requirement set by Europe’s MiFID II regulation and by the US Consolidated Audit Trail (CAT) is accuracy to within 100 microseconds of the recognised UTC time standard, in practice firms and exchanges will need their reporting to be accurate to nanoseconds or just a few microseconds, according to time-stamping service providers.

“High-frequency traders and high-performance traders are transacting in substantially less than 1 microsecond. 100 microseconds is three orders of magnitude out on the level of accuracy of time-stamping,” says David Snowdon, chief technology officer of Metamako. “In a market like Nasdaq, if the response time is 80 microseconds, a trader could place an order, receive a response back, place another order and get another response back, and place a third order. On 100 microsecond timestamp accuracy all three of those could have the same time-stamp, totally legally. The 100 microsecond limit is nowhere near good enough to provide an idea of what order the events happened in the market.”
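Snowdon’s ordering point can be made concrete. If each clock is permitted to diverge from UTC by up to 100 microseconds, two compliant systems can disagree by up to 200 microseconds, so events roughly 80 microseconds apart can legally receive identical or even inverted timestamps. A sketch of the worst case, with all event times invented for illustration:

```python
# Sketch of Snowdon's ordering problem. If each clock may diverge from UTC
# by up to 100 microseconds, two compliant clocks can disagree by up to
# 200 microseconds -- enough to reorder events only ~80 microseconds apart.
# All times below are hypothetical, in nanoseconds.

TOLERANCE_NS = 100_000  # permitted divergence from UTC (100 microseconds)

# True UTC times of three orders placed in an 80-microsecond cycle.
true_times_ns = [0, 80_000, 160_000]

# Worst case: the first clock runs fast by the full tolerance, the third
# runs slow by the full tolerance. Both clocks remain compliant.
clock_errors_ns = [+TOLERANCE_NS, 0, -TOLERANCE_NS]

stamps = [t + e for t, e in zip(true_times_ns, clock_errors_ns)]
print(stamps)                    # [100000, 80000, 60000]
print(stamps == sorted(stamps))  # False: timestamps invert the true order
```

The timestamps no longer reflect the order in which the events actually occurred, which is exactly the reconstruction problem Snowdon describes.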

The time synchronisation protocols underpinning financial industry trade reporting, such as Precision Time Protocol (PTP) and Network Time Protocol (NTP), can achieve nanosecond-level accuracy, and therefore nanosecond time-stamping, according to Snowdon. This is achievable even on transmissions between points as far apart as New York and London, he adds.
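PTP reaches this accuracy by exchanging timestamps in both directions and assuming the network path is symmetric. A simplified sketch of the standard offset-and-delay calculation, with hypothetical timestamp values:

```python
# Simplified IEEE 1588 (PTP) offset estimation from the four classic
# timestamps. Values are hypothetical, in nanoseconds.
# t1: master sends Sync       t2: slave receives Sync
# t3: slave sends Delay_Req   t4: master receives Delay_Req
t1, t2, t3, t4 = 1_000, 1_450, 2_000, 2_350

# Assuming a symmetric network path:
mean_path_delay = ((t2 - t1) + (t4 - t3)) / 2  # (450 + 350) / 2 = 400 ns
slave_offset = ((t2 - t1) - (t4 - t3)) / 2     # (450 - 350) / 2 = 50 ns

print(f"path delay ~{mean_path_delay:.0f} ns, offset ~{slave_offset:.0f} ns")
```

Asymmetry between the two directions of the path is the main error source in this estimate, which is why long links need careful engineering.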

Accurate time-stamping is important to operations as well as compliance, says Heiko Gerstung, managing director at Meinberg, a German company that makes electronic clocks capable of nanosecond-level accuracy for industrial use. “Customers want to be able to correlate stamps from different systems with each other to get information,” he says. “The easiest is measuring the time it takes from one transaction being generated to being received by another system, and then to be forwarded to the exchange.”
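In the simplest case, the correlation Gerstung describes reduces to subtracting timestamps recorded at each hop, which only yields meaningful latencies if every clock is synchronised to the same reference. The hop names and times below are hypothetical:

```python
# Per-hop latency from timestamps recorded on different systems, in the
# spirit of Gerstung's example. Meaningful only if all clocks share one
# time reference. Hop names and times (nanoseconds) are hypothetical.
hops = [
    ("order generated", 0),
    ("received by risk system", 12_000),
    ("forwarded to exchange", 35_000),
]

for (name_a, t_a), (name_b, t_b) in zip(hops, hops[1:]):
    print(f"{name_a} -> {name_b}: {(t_b - t_a) / 1_000:.1f} us")
```

With poorly synchronised clocks, the per-hop differences would absorb each clock’s offset error, making the measured latencies meaningless.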

Meinberg’s capabilities include detecting which parts of a network are slower than others, allowing users to replace or improve those parts and so reduce the overall latency of their solution. As Snowdon notes, delays in fiber cables can make time synchronisation challenging. Regarding MiFID II’s 100 microsecond standard, Gerstung adds: “You have to make sure the clocks are well below 100 microseconds of divergence from UTC. If this is not the case, you are not complying with the regulation, in our opinion.”
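Gerstung’s compliance test is a direct bound check: every clock’s measured offset from UTC must stay inside the 100-microsecond limit, ideally with margin. A sketch using hypothetical monitoring data:

```python
# Check measured clock offsets against the MiFID II 100-microsecond bound.
# Clock names and offsets (nanoseconds) are hypothetical monitoring data.
LIMIT_NS = 100_000

offsets_ns = {
    "gateway-1": 3_200,
    "matching-engine": -850,
    "gateway-2": 140_000,  # drifted well past the limit
}

for clock, offset in offsets_ns.items():
    status = "OK" if abs(offset) < LIMIT_NS else "NON-COMPLIANT"
    print(f"{clock}: |offset| = {abs(offset)} ns -> {status}")
```

In practice a firm would alert well before an offset approaches the limit, in line with Gerstung’s “well below 100 microseconds” advice.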

