A-Team Insight Blogs

The type of trader you are will determine how you set your watch. That is the upshot of ‘RTS 25: Draft regulatory technical standards on clock synchronization’, published by the European Securities and Markets Authority (ESMA) in late September as part of the revision of the Markets in Financial Instruments Directive (MiFID II).

“MiFID II moved away from defining the level of traceability required for market participants by the gateway-to-gateway latency, which they had in an earlier consultation paper, to [basing it] purely on the type of trading,” says Dr Leon Lobo, strategic business development manager for Time & Frequency at the National Physical Laboratory.

The directive is intended to standardise trading behaviour across Europe, working within a framework that emphasises transparency and risk mitigation. Ultimately a consistent set of rules will allow Europe to become a single marketplace with all of the advantages in efficiency and cost reduction seen across other large markets such as the US and China. These rules will be transposed by a competent national authority with the level of variation depending upon the leeway in the text. In the case of clock synchronisation there is little room for manoeuvre.

Synchronisation is needed to ensure that timestamped records of market activity are consistent and can be used when retrospectively examining an event or period to determine whether any market participant has acted abusively or erroneously. The requirements apply both to market operators and to the trading firms themselves.

What and when

Synchronisation is not required under existing European rules, whereas in the US, under rules enforced by the Financial Industry Regulatory Authority (FINRA), market operators have to check against an atomic clock at the National Institute of Standards and Technology (NIST).

MiFID II says that both operators of trading venues and market participants have to establish clock traceability to Coordinated Universal Time (UTC), the international time standard. Following comprehensive documentation of their systems and processes, they have to identify the “exact point at which a timestamp is applied and demonstrate that the point within the system where the timestamp is applied remains consistent”, with annual compliance checks.

“Regulators are really blindfolded up to the implementation of these rules,” says Jock Percy, CEO of Perseus, a provider of low latency trading technology. “That is because – and I am not assuming for a second that market participants would hide anything – if you or I had to produce a set of records at a certain frequency we are not going to go to any extra time and effort when compliance costs are already very high, to voluntarily put extra detail in when we do not need to.”

The granularity of timestamps required is set out by the RTS. For market operators it depends on the gateway-to-gateway latency of their trading system: anything over one millisecond requires granularity of one millisecond or better with a maximum divergence from UTC of one millisecond, while anything under one millisecond requires granularity of one microsecond or better, with divergence of less than 100 microseconds.

For market participants, the requirement depends on whether they use a “high frequency algorithmic trading technique”, which the Level 1 text defines as:

a) infrastructure intended to minimise network and other types of latencies, including at least one of the following facilities for algorithmic order entry: co-location, proximity hosting or high-speed direct electronic access

b) system-determination of order initiation, generation, routing or execution without human intervention for individual trades or orders, and

c) high intraday message rates which constitute orders, quotes or cancellations

Such firms will be required to be within 100 microseconds of UTC and to have granularity of one microsecond or better. The tolerance relaxes to one second for voice, auction or request for quote (RFQ) systems, and sits at one millisecond for everything else – capturing other electronic execution, for example.
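The participant-side tolerances described above can be summarised as a simple lookup. The following is a hypothetical sketch to make the numbers concrete – the names, structure and function are my own illustration, not taken from the RTS text:

```python
# Illustrative summary of the participant-side RTS 25 tolerances described
# in the article. Values are (required timestamp granularity, maximum
# divergence from UTC), both in seconds. Names are hypothetical.
RTS25_PARTICIPANT_LIMITS = {
    "hft": (1e-6, 100e-6),            # high-frequency algorithmic trading
    "voice_auction_rfq": (1.0, 1.0),  # voice, auction or RFQ systems
    "other_electronic": (1e-3, 1e-3), # all other electronic activity
}

def meets_rts25(activity: str, clock_granularity_s: float,
                observed_divergence_s: float) -> bool:
    """Check a clock setup against the tolerance for an activity type."""
    granularity, max_divergence = RTS25_PARTICIPANT_LIMITS[activity]
    return (clock_granularity_s <= granularity
            and observed_divergence_s <= max_divergence)
```

So a high-frequency firm with a microsecond-granularity clock diverging by 50 microseconds would pass, while the same clock diverging by 200 microseconds would not.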

This is far from the highest level of granularity, says David Snowdon, founder and co-chief technology officer of low latency hardware provider Metamako. “We would be quite prepared to work at a stricter tolerance than one microsecond, because one microsecond isn’t really enough to disambiguate ordering,” he says. “High-frequency traders and people that use technology like ours are looking for tens of nanoseconds of improvement in their latency. They think that makes a difference to them. The fact that we are not able to measure that in a timestamp is a bit problematic.”

Technical challenge

Achieving the level of detail requested by ESMA will not be easy for every firm, says Lobo. “Both the 100 microsecond- and the millisecond-level firms will have to upgrade infrastructure if they are not ready for that level,” he warns. “The reason for that is at a millisecond you cannot achieve that with the Network Time Protocol (NTP), which the majority of firms use at the moment.”

The operating system being used will also affect capability. For example, timestamping in Windows could be a challenge even at the one-second level, Lobo observes. Moving to Linux systems offers some improvement, while timestamping in hardware will be the more reliable option. To evaluate its capability, a firm can assess any degradation in synchronicity at each step in its infrastructure, provided it knows the capability of the time source being fed into the system and that source is very stable.
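One way to see why the operating system matters is to measure the effective granularity of the software clock empirically. A minimal sketch (the function name is my own) records the smallest nonzero step between successive clock reads:

```python
import time

def estimate_clock_granularity_ns(samples: int = 100_000) -> int:
    # Read the system clock repeatedly and record the smallest nonzero
    # step between successive reads: a rough proxy for the effective
    # timestamp granularity the OS delivers to software.
    smallest = None
    prev = time.time_ns()
    for _ in range(samples):
        now = time.time_ns()
        step = now - prev
        if step > 0 and (smallest is None or step < smallest):
            smallest = step
        prev = now
    return smallest
```

On a typical Linux box this reports sub-microsecond steps; an older Windows timer API may report steps of several milliseconds, which is the gap Lobo alludes to.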

“You need to know what is coming in and the stability of what’s coming in to understand what you are getting further down the line, because otherwise you are just comparing against effectively an unknown,” says Lobo.
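Lobo’s point – that a firm must know both the offset and the stability of the incoming time feed before it can reason about anything downstream – can be sketched as a simple summary over measured offset samples. The names here are illustrative:

```python
import statistics

def characterise_feed(offset_samples_us: list) -> tuple:
    # Summarise measured offsets (in microseconds) between a local clock
    # and a trusted UTC reference: the worst-case divergence and the
    # jitter (spread). Knowing both for the incoming feed lets a firm
    # attribute any additional degradation to each downstream hop.
    worst = max(abs(o) for o in offset_samples_us)
    jitter = statistics.pstdev(offset_samples_us)
    return worst, jitter
```

Comparing these two numbers at each stage of the infrastructure shows where synchronicity degrades, rather than comparing everything against “effectively an unknown”.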

Although the requirements will certainly provide a clearer picture of the market than has existed before, details will still need to be ironed out in order to ensure that regulators can genuinely track the fastest potential perpetrators of unwanted activity.

Per Loven, head of International Corporate Strategy at block-trading market operator Liquidnet Europe, says: “One challenge is that a high-frequency trading firm will use the infrastructure of other firms to trade into the market, and so they would need to be identified within that other firm’s trading pattern.”

There is also a need for more international coordination in highly electronic markets that are completely international, such as the listed derivatives markets. FINRA is proposing to reduce the divergence of US markets from the NIST clock from one second down to 50 milliseconds.

“I believe the whole world would be better off if we all had the same standard such as UTC,” said Percy.
