The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

MiFID II: Putting the Clock Back

The type of trader you are will determine how you set your watch. That is the upshot of ‘RTS 25: Draft regulatory technical standards on clock synchronization’, published by the European Securities and Markets Authority (ESMA) in late September as part of the revision of the Markets in Financial Instruments Directive (MiFID II).

“MiFID II moved away from defining the level of traceability required for market participants by the gateway-to-gateway latency, which they had in an earlier consultation paper, to [basing it] purely on the type of trading,” says Dr Leon Lobo, strategic business development manager for Time & Frequency at the National Physical Laboratory.

The directive is intended to standardise trading behaviour across Europe, working within a framework that emphasises transparency and risk mitigation. Ultimately a consistent set of rules will allow Europe to become a single marketplace with all of the advantages in efficiency and cost reduction seen across other large markets such as the US and China. These rules will be transposed by a competent national authority with the level of variation depending upon the leeway in the text. In the case of clock synchronisation there is little room for manoeuvre.

Synchronisation is needed to ensure that timestamped records of market activity are consistent and can be used when retrospectively examining an event or period to determine whether there has been abusive or erroneous action by any market participant. The requirements apply both to market operators and to the trading firms themselves.

What and when

Synchronisation is not required under existing European rules, whereas in the US, under rules enforced by the Financial Industry Regulatory Authority (FINRA), market operators have to check against an atomic clock at the National Institute of Standards and Technology (NIST).

MiFID II says that both operators of trading venues and market participants have to establish clock traceability to Coordinated Universal Time (UTC), the international time standard. Having comprehensively documented their systems and processes, they have to identify the “exact point at which a timestamp is applied and demonstrate that the point within the system where the timestamp is applied remains consistent”, with annual compliance checks.

“Regulators are really blindfolded up to the implementation of these rules,” says Jock Percy, CEO of Perseus, a provider of low latency trading technology. “That is because – and I am not assuming for a second that market participants would hide anything – if you or I had to produce a set of records at a certain frequency we are not going to go to any extra time and effort when compliance costs are already very high, to voluntarily put extra detail in when we do not need to.”

The required timestamp granularity is set out by the RTS. For market operators it depends on the gateway-to-gateway latency of their trading system: anything over one millisecond requires granularity of one millisecond or better with a maximum divergence from UTC of one millisecond, while anything at one millisecond or less requires granularity of one microsecond or better with a divergence of less than 100 microseconds.
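The venue-side rule above can be sketched as a simple lookup keyed to gateway-to-gateway latency. The function name and the choice of seconds as the unit are this sketch's own, not terms from the RTS:

```python
def venue_clock_requirements(gateway_latency_s: float):
    """Illustrative mapping of the RTS 25 venue-operator thresholds
    described above. Takes gateway-to-gateway latency in seconds and
    returns (required granularity, max divergence from UTC), in seconds."""
    if gateway_latency_s > 0.001:
        # Slower than 1 ms: 1 ms granularity, 1 ms divergence from UTC
        return (0.001, 0.001)
    # 1 ms or faster: 1 µs granularity, less than 100 µs divergence
    return (0.000001, 0.0001)
```

A venue with a 5 ms gateway-to-gateway latency would therefore fall into the coarser bucket, while a sub-millisecond venue faces the microsecond-level requirement.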

For market participants using a “high frequency algorithmic trading technique”, which the Level 1 text defines as:

a) infrastructure intended to minimise network and other types of latencies, including at least one of the following facilities for algorithmic order entry: co-location, proximity hosting or high-speed direct electronic access

b) system-determination of order initiation, generation, routing or execution without human intervention for individual trades or orders, and

c) high message intraday rates which constitute orders, quotes or cancellations

Firms will be required to be within 100 microseconds of UTC and to timestamp with granularity of one microsecond or better. The requirement loosens to one second for voice, auction or request for quote (RFQ) systems and sits at one millisecond for everything else, capturing other electronic execution, for example.
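The three participant categories described above can likewise be sketched as a lookup. The category labels here are this sketch's own shorthand, not regulatory terms:

```python
def participant_clock_requirements(trading_type: str):
    """Illustrative mapping of the RTS 25 participant-side requirements
    described above. Returns (required granularity, max divergence from
    UTC), both in seconds."""
    rules = {
        "hft": (1e-6, 100e-6),              # high-frequency algorithmic trading
        "voice_rfq": (1.0, 1.0),            # voice, auction or RFQ systems
        "other_electronic": (1e-3, 1e-3),   # all other electronic activity
    }
    return rules[trading_type]
```

The asymmetry is deliberate: the faster the trading style, the tighter both the granularity and the permitted divergence from UTC.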

This is far from the highest level of granularity, says David Snowdon, founder and co-chief technology officer of low latency hardware provider Metamako. “We were quite prepared to work at a stricter tolerance than one microsecond because one microsecond isn’t really enough to disambiguate ordering,” he says. “High-frequency traders and people that use technology like ours are looking for tens of nanoseconds of improvement in their latency. They think that makes a difference to them. The fact that we are not able to measure that in a timestamp is a bit problematic.”

Technical challenge

Achieving the level of detail requested by ESMA will not be easy for every firm, says Lobo. “Both the 100 microsecond- and the millisecond-level firms will have to upgrade infrastructure if they are not ready for that level,” he warns. “The reason is that at the millisecond level you cannot achieve that with Network Time Protocol (NTP), which the majority of firms use at the moment.”

The operating system in use will also affect capability. For example, timestamping in Windows could be a challenge even at the one-second level, Lobo observes. Moving to Linux systems offers some improvement, while timestamping in hardware is the more reliable option. To evaluate its capability, a firm can assess the degradation in synchronicity at each step in its infrastructure, provided it knows the capability and stability of the time source being fed into the system.
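The point about operating systems can be checked directly: most platforms expose the resolution of their wall-clock time source, and a nanosecond-wide timestamp field does not imply nanosecond precision. A minimal sketch using the Python standard library:

```python
import time

# Query the resolution the OS reports for the wall-clock time source.
# Whether this is fine enough for microsecond-level timestamping depends
# on the platform, as noted above (Windows vs Linux vs hardware).
info = time.get_clock_info("time")
print(f"clock resolution: {info.resolution} s")
print(f"adjustable (e.g. by NTP): {info.adjustable}")

# A nanosecond-field timestamp; its real precision is bounded by the
# resolution reported above, not by the width of the integer.
ts = time.time_ns()
print(f"timestamp: {ts} ns since the Unix epoch")
```

On a typical Linux system the reported resolution is far finer than on older Windows builds, which is the gap Lobo is pointing at.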

“You need to know what is coming in and the stability of what’s coming in to understand what you are getting further down the line, because otherwise you are just comparing against effectively an unknown,” says Lobo.
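Lobo's chain-of-traceability argument amounts to a time uncertainty budget: sum the worst-case contribution of each step between the UTC source and the point where the timestamp is applied, and compare the total against the permitted divergence. The steps and figures below are hypothetical, chosen only to illustrate the arithmetic:

```python
# Hypothetical worst-case time-transfer uncertainties (in seconds) for
# each step between the UTC source and the timestamping point.
chain = [
    ("GPS receiver vs UTC",   1e-7),
    ("grandmaster to switch", 5e-7),
    ("switch to server NIC",  5e-7),
    ("NIC to application",    2e-5),
]

budget = 100e-6  # 100 µs maximum divergence for HFT participants

total = sum(uncertainty for _, uncertainty in chain)
for step, uncertainty in chain:
    print(f"{step:24s} ±{uncertainty:.1e} s")
verdict = "within" if total <= budget else "exceeds"
print(f"worst-case total: ±{total:.1e} s -> {verdict} the {budget:.0e} s budget")
```

If any single link is unknown or unstable, the total is unknowable, which is Lobo's point: you are "comparing against effectively an unknown".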

Although the requirements will certainly provide a clearer picture of the market than has existed before, details will still need to be ironed out to ensure that regulators can genuinely track the fastest potential perpetrators of unwanted activity.

Per Loven, head of international corporate strategy at block-trading market operator Liquidnet Europe, says: “One challenge is that a high-frequency trading firm will use the infrastructure of other firms to trade into the market, and so it would need to be identified within that other firm’s trading pattern.”

There is also a need for more international coordination in highly electronic markets that span borders, such as the listed derivatives markets. FINRA is proposing to reduce the permitted divergence of US markets from the NIST clock from one second to 50 milliseconds.

“I believe the whole world would be better off if we all had the same standard, such as UTC,” says Percy.
