About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Time-stamping Needs To Be Better Than Regulators Require, Providers Say


Although the requirement for time-stamping set by Europe’s MiFID II regulation and for the US Consolidated Audit Trail (CAT) is within 100 microseconds of the recognised standard UTC time, in practice firms and exchanges will need their reporting to be accurate down to nanoseconds or just a few microseconds, according to time-stamping services providers.

“High-frequency traders and high-performance traders are transacting in substantially less than 1 microsecond. 100 microseconds is three orders of magnitude out on the level of accuracy of time-stamping,” says David Snowdon, chief technology officer of Metamako. “In a market like Nasdaq, if the response time is 80 microseconds, a trader could place an order, receive a response back, place another order and get another response back, and place a third order. On 100 microsecond timestamp accuracy all three of those could have the same time-stamp, totally legally. The 100 microsecond limit is nowhere near good enough to provide an idea of what order the events happened in the market.”
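Snowdon's point can be made concrete with a small sketch. The rule requires each clock to sit within 100 microseconds of UTC, so two compliant venues can diverge from one another by far more than the gap between real events; the hypothetical offsets and event times below are illustrative, not drawn from the article:

```python
# Hypothetical sketch: two venues whose clocks are each within the
# 100-microsecond MiFID II / CAT tolerance of UTC can still time-stamp
# events in the wrong order. All times are in nanoseconds.

TOLERANCE_NS = 100_000  # 100 microseconds, the regulatory limit

def stamp(true_time_ns: int, clock_offset_ns: int) -> int:
    """Time-stamp an event on a clock that runs offset from true UTC."""
    assert abs(clock_offset_ns) <= TOLERANCE_NS  # the clock is compliant
    return true_time_ns + clock_offset_ns

# Venue A's clock runs 90 us fast; venue B's runs 90 us slow -- both legal.
event_a = stamp(true_time_ns=0,       clock_offset_ns=+90_000)
event_b = stamp(true_time_ns=150_000, clock_offset_ns=-90_000)

# Event B truly happened 150 us after event A (almost two Nasdaq round
# trips at ~80 us each), yet its recorded stamp is earlier.
print(event_a, event_b)  # 90000 60000 -- the recorded order is reversed
```

This is why a 100 microsecond tolerance cannot reconstruct the sequence of events in a market where a full order/response round trip completes in around 80 microseconds.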

The communications protocols used for financial industry trade reporting, such as precision time protocol (PTP) and network time protocol (NTP), can achieve synchronisation accuracy, and therefore time-stamping, at the nanosecond level, according to Snowdon. This is achievable even on links between points as far apart as New York and London, he adds.

Accurate time-stamping is important to operations as well as compliance, states Heiko Gerstung, managing director at Meinberg, a German company that makes electronic clocks capable of nanosecond-level accuracy for use across a range of industries. “Customers want to be able to correlate stamps from different systems with each other to get information,” he says. “The easiest is measuring the time it takes from one transaction being generated to being received by another system, and then to be forwarded to the exchange.”
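Once every system is disciplined to the same clock, the correlation Gerstung describes reduces to subtracting consecutive stamps. A hypothetical sketch, with invented stamp values for one transaction moving from generator to gateway to exchange:

```python
# Hypothetical stamps (ns) collected for one transaction as it moves
# through a trading stack; names and values are illustrative only.
stamps_ns = {
    "generated":    1_000_000_000,
    "gateway_recv": 1_000_004_500,
    "exchange_ack": 1_000_062_000,
}

def hop_latencies(stamps: dict[str, int]) -> dict[str, int]:
    """Latency of each hop, from differences of consecutive time-stamps."""
    names = list(stamps)
    return {
        f"{a} -> {b}": stamps[b] - stamps[a]
        for a, b in zip(names, names[1:])
    }

for hop, ns in hop_latencies(stamps_ns).items():
    print(f"{hop}: {ns / 1000:.1f} us")
```

The arithmetic is only meaningful if the stamping clocks agree to well under the latencies being measured, which is why operational users want far better than the regulatory 100 microseconds.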

Meinberg’s capabilities include detecting which parts of a network are slower than others, allowing users to replace or improve those parts and so reduce the overall latency of their solution. As Snowdon notes, delays in fiber cables can make time synchronisation challenging. Regarding MiFID II’s 100 microsecond standard, Gerstung adds, “You have to make sure the clocks are well below 100 microseconds of divergence from UTC. If this is not the case, you are not complying with the regulation, in our opinion.”
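Gerstung's "well below" amounts to monitoring measured clock offset against the limit with a safety margin. A minimal sketch of such a check, where the margin and the sample offsets are assumptions for illustration:

```python
# Hypothetical compliance check: alarm well before the MiFID II limit
# rather than at it. Margin and sample data are illustrative choices.

LIMIT_NS = 100_000  # 100 us maximum divergence from UTC
MARGIN = 0.5        # alarm at 50% of the limit -- an assumed safety margin

def compliant(observed_offsets_ns: list[int]) -> bool:
    """True if every measured |offset| stays under the margin-adjusted limit."""
    worst = max(abs(o) for o in observed_offsets_ns)
    return worst < LIMIT_NS * MARGIN

print(compliant([1_200, -3_400, 7_900]))    # worst case 7.9 us
print(compliant([1_200, -61_000, 7_900]))   # 61 us is too close to the limit
```

Holding the worst case, not the average, under the threshold is the conservative reading of "well below 100 microseconds of divergence".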
