About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

MiFID II: Putting the Clock Back


The type of trader you are will determine how you set your watch. That is the upshot of ‘RTS 25: Draft regulatory technical standards on clock synchronisation’, published by the European Securities and Markets Authority (ESMA) in late September under the revised Markets in Financial Instruments Directive (MiFID II).

“MiFID II moved away from defining the level of traceability required for market participants by the gateway-to-gateway latency, which they had in an earlier consultation paper, to [basing it] purely on the type of trading,” says Dr Leon Lobo, strategic business development manager for Time & Frequency at the National Physical Laboratory.

The directive is intended to standardise trading behaviour across Europe, working within a framework that emphasises transparency and risk mitigation. Ultimately a consistent set of rules will allow Europe to become a single marketplace with all of the advantages in efficiency and cost reduction seen across other large markets such as the US and China. These rules will be transposed by each national competent authority, with the level of variation depending upon the leeway in the text. In the case of clock synchronisation there is little room for manoeuvre.

Synchronisation is needed to ensure that timestamped records of market activity are consistent and can be used when retrospectively examining an event or period to determine whether any market participant has acted abusively or erroneously. The requirements apply both to market operators and to the trading firms themselves.

What and when

Synchronisation is not required under existing European rules, whereas in the US, under rules enforced by the Financial Industry Regulatory Authority (FINRA), market operators have to check against an atomic clock at the National Institute of Standards and Technology (NIST).

MiFID II says that both operators of trading venues and market participants have to establish clock traceability to Coordinated Universal Time (UTC), the international time standard. Following comprehensive documentation of their systems and processes, they have to identify the “exact point at which a timestamp is applied and demonstrate that the point within the system where the timestamp is applied remains consistent”, with annual compliance checks.

“Regulators are really blindfolded up to the implementation of these rules,” says Jock Percy, CEO of Perseus, a provider of low latency trading technology. “That is because – and I am not assuming for a second that market participants would hide anything – if you or I had to produce a set of records at a certain frequency we are not going to go to any extra time and effort when compliance costs are already very high, to voluntarily put extra detail in when we do not need to.”

The timestamp granularity required is set out in the RTS. For market operators it depends on the gateway-to-gateway latency of their trading system: anything over one millisecond requires granularity of one millisecond or better with a maximum divergence from UTC of one millisecond, while anything at or below one millisecond requires granularity of one microsecond or better, with divergence from UTC of less than 100 microseconds.
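As a rough illustration, the operator-side rule can be expressed as a simple lookup. This is a sketch based only on the figures quoted above, not an implementation of any venue's actual compliance logic:

```python
def venue_requirements(gateway_to_gateway_latency_s: float) -> dict:
    """Timestamp requirements for a trading venue operator, using the RTS 25
    figures quoted in this article. All values are in seconds."""
    if gateway_to_gateway_latency_s > 1e-3:
        # Slower venues: 1 ms granularity, 1 ms maximum divergence from UTC
        return {"granularity_s": 1e-3, "max_divergence_s": 1e-3}
    # Sub-millisecond venues: 1 us granularity, < 100 us divergence from UTC
    return {"granularity_s": 1e-6, "max_divergence_s": 100e-6}

print(venue_requirements(0.5e-3))  # a sub-millisecond venue
```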

For market participants using a “high frequency algorithmic trading technique”, which the Level 1 text defines as:

a) infrastructure intended to minimise network and other types of latencies, including at least one of the following facilities for algorithmic order entry: co-location, proximity hosting or high-speed direct electronic access

b) system-determination of order initiation, generation, routing or execution without human intervention for individual trades or orders, and

c) high message intraday rates which constitute orders, quotes or cancellations

Firms will be required to be within 100 microseconds of UTC and to have granularity of 1 microsecond or better. This increases to one second for voice, auction or request for quote (RFQ) systems and sits at one millisecond for everything else – capturing other electronic execution for example.
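The participant-side requirements can likewise be summarised in a small table. The category names below are shorthand invented for this sketch, not ESMA terminology, and the figures are those quoted in this article:

```python
# Maximum divergence from UTC and required granularity, in seconds,
# per the RTS 25 figures quoted above.
PARTICIPANT_REQUIREMENTS = {
    "hft_algorithmic":   (100e-6, 1e-6),  # high-frequency algorithmic techniques
    "voice_auction_rfq": (1.0,    1.0),   # voice, auction or RFQ systems
    "other_electronic":  (1e-3,   1e-3),  # all other electronic activity
}

def is_compliant(activity: str, divergence_s: float, granularity_s: float) -> bool:
    """True if a measured UTC offset and timestamp resolution meet the limits."""
    max_div, max_gran = PARTICIPANT_REQUIREMENTS[activity]
    return divergence_s <= max_div and granularity_s <= max_gran

print(is_compliant("hft_algorithmic", 50e-6, 1e-6))   # True
print(is_compliant("hft_algorithmic", 250e-6, 1e-6))  # False: outside 100 us
```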

This is far from the highest level of granularity, says David Snowdon, founder and co-chief technology officer of low latency hardware provider Metamako. “We were quite prepared to work at a stricter tolerance than one microsecond because one microsecond isn’t really enough to disambiguate ordering,” he says. “High-frequency traders and people that use technology like ours are looking for tens of nanoseconds of improvement in their latency. They think that makes a difference to them. The fact that we are not able to measure that in a timestamp is a bit problematic.”

Technical challenge

Achieving the level of detail requested by ESMA will not be easy for every firm, says Lobo. “Both the 100 microsecond- and the millisecond-level firms will have to upgrade infrastructure if they are not ready for that level,” he warns. “The reason for that is at a millisecond you cannot achieve that with Network Time Protocol (NTP), which the majority of firms use at the moment.”

The operating system being used will also affect capability. For example, timestamping in Windows could make even the one-second level a challenge, Lobo observes. Moving toward Linux systems offers some improvement, while timestamping in hardware will be the more reliable option. To evaluate its capability, a firm can assess the degradation in synchronisation at each step in its infrastructure, provided it knows the capability and stability of the time source feeding the system.
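Lobo's point about the operating system can be probed directly: Python exposes the resolution the platform reports for its wall clock. Note that this is the granularity of the reading, not its accuracy against UTC, which depends on the discipline (NTP, PTP, hardware) behind the clock:

```python
import time

# Reported resolution of the OS wall clock. Windows has historically reported
# around 15.6 ms here; Linux typically reports far finer resolution. This is
# the granularity of the value, not how close the clock actually is to UTC.
info = time.get_clock_info("time")
print(f"reported wall-clock resolution: {info.resolution} s")

# Nanosecond-resolution reading of the same clock.
print(f"time.time_ns() sample: {time.time_ns()}")
```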

“You need to know what is coming in and the stability of what’s coming in to understand what you are getting further down the line, because otherwise you are just comparing against effectively an unknown,” says Lobo.
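Lobo's “know what is coming in” advice amounts to an error budget: each element between the UTC source and the timestamping point contributes some uncertainty, and the worst-case sum must stay inside the divergence limit. A minimal sketch, with entirely hypothetical per-hop figures:

```python
# Hypothetical worst-case uncertainty contributed by each element in a
# timing chain, in seconds. These figures are illustrative only, not
# measurements of any real system.
hops = {
    "UTC reference (e.g. GNSS feed)":    50e-9,
    "grandmaster clock":                 50e-9,
    "distribution switch":               500e-9,
    "server NIC hardware timestamp":     50e-9,
    "application-level timestamp point": 10e-6,
}

total = sum(hops.values())      # worst-case accumulation down the chain
budget = 100e-6                 # 100 us divergence limit for HFT firms

print(f"worst-case offset: {total * 1e6:.2f} us")
print("within budget" if total <= budget else "over budget")
```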

Although the requirements will certainly provide a clearer picture of the market than has existed before, details will still need to be ironed out to ensure that regulators can genuinely track the fastest potential perpetrators of unwanted activity.

Per Loven, head of international corporate strategy at block-trading market operator Liquidnet Europe, says: “One challenge is that a high-frequency trading firm will use the infrastructure of other firms to trade into the market, and so they would need to be identified within that other firm’s trading pattern.”

There is also a need for more international coordination in highly electronic markets that are completely international, such as the listed derivatives markets. FINRA is proposing to reduce the permitted divergence of US markets from the NIST clock from one second down to 50 milliseconds.

“I believe the whole world would be better off if we all had the same standard, such as UTC,” says Percy.
