

Why the Finance Industry Needs Traceable Timing Across All its Operations

By Richard Hoptroff, Founder and CTO of Hoptroff London.

As the virtual world expands, so do the applications for highly accurate, traceable time. The most established of these is finance. If the clocks on servers drift and cannot be traced back to a verified source of Coordinated Universal Time (UTC), conflicts can arise: orders can appear to have arrived before they were sent, and there is no way to determine whose time was correct. In light of these problems, regulators around the world have become acutely aware of the importance of accurate, synchronised time. MiFID II and CAT stipulate that the clocks on trading servers must not diverge from UTC by more than a millisecond, and in some cases by more than 100 microseconds. This regulatory requirement goes a long way towards preventing the disputes described above.
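To make these tolerances concrete, the compliance check itself is simple once an offset measurement is available; the hard part is producing a trustworthy, traceable measurement in the first place. The sketch below is illustrative only: the offset value is invented, and the constants reflect the millisecond and 100-microsecond figures mentioned above.

```python
# Illustrative sketch only: comparing a measured clock offset against the
# divergence tolerances mentioned above (1 ms in general, 100 us in some cases).
# The offset value is invented; in practice it would come from an NTP/PTP
# measurement against a traceable UTC reference.

DEFAULT_TOLERANCE_US = 1_000   # one millisecond, expressed in microseconds
STRICT_TOLERANCE_US = 100      # 100 microseconds for the stricter cases

def within_tolerance(offset_us: float, tolerance_us: float) -> bool:
    """Return True if the absolute offset from UTC is within the given tolerance."""
    return abs(offset_us) <= tolerance_us

measured_offset_us = 250.0     # hypothetical measured offset of a trading server's clock

print(within_tolerance(measured_offset_us, DEFAULT_TOLERANCE_US))  # True: within 1 ms
print(within_tolerance(measured_offset_us, STRICT_TOLERANCE_US))   # False: outside 100 us
```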

It is vital that banks and traders also employ accurate timing when communicating their news. When not just seconds but milliseconds count, announcements must be synchronised with one another to a high degree of accuracy. Applying the same level of accuracy to communications as to the trades themselves ensures a level playing field on the stock market: if all media feeds are synchronised, no one party can gain an advantage during a sensitive announcement.

Compliance and transparency

Compliance with MiFID II and CAT requires all financial market participants to maintain accurately synchronised time on their trading servers. The time feed must be of sufficient accuracy and granularity that every event can be uniquely and traceably identified within a causal sequence, so that the chain of events can be reconstructed accurately after the fact. If the clocks on different systems drift by different amounts, the sequence of events recorded in the logs becomes unreliable: timestamps from drifting clocks make events appear earlier or later than they actually occurred, latencies between events are measured inaccurately, and distinct events can be assigned the same timestamp. When a single delivery chain may involve many servers and be repeated thousands of times, it is clear how a lack of synchronisation compromises the usefulness of timestamps when reconstructing a chain of automated activity after an event.
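A small, self-contained illustration (not taken from the article) of how this plays out: if one server in the chain logs with a clock that is only a few milliseconds out, a downstream event can acquire an earlier timestamp than the upstream event that caused it, and the reconstructed sequence is wrong. The servers, events, and offsets below are invented.

```python
# Illustrative sketch: two servers in one delivery chain. Server B's clock runs
# 3 ms behind UTC, so the downstream acknowledgement is logged with an earlier
# timestamp than the upstream order it responded to, and the apparent latency
# between the two events becomes negative.

events = [
    # (event name, true UTC time in ms, logging server)
    ("order_received_on_A", 1000.0, "A"),
    ("ack_sent_by_B",       1001.5, "B"),
]
clock_offset_ms = {"A": 0.0, "B": -3.0}  # B's local clock reads UTC minus 3 ms

# Each server stamps the event with its own (drifting) local clock.
logged = [(name, true_ms + clock_offset_ms[server]) for name, true_ms, server in events]
logged.sort(key=lambda e: e[1])  # reconstruct the sequence from the logged timestamps

for name, ts in logged:
    print(f"{ts:8.1f} ms  {name}")
# Output:
#    998.5 ms  ack_sent_by_B        <- appears to precede the order that caused it
#   1000.0 ms  order_received_on_A
```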

Traceability is the process by which a timestamp can be proven correct by showing an unbroken chain of comparisons from that timestamp, through the synchronisation reference points, back to the primary time source. If two PTP (Precision Time Protocol) installations have different timing records, the only way to resolve the variance is to check the traceability of each system. Synchronised timing requires traceability so that the timestamps it produces are authoritative and can be used to resolve timing disputes with other parties. More broadly, knowing exactly when events happened allows us to identify their likely causes by examining the events immediately prior: in financial markets, for instance, the causes of flash crashes can be pinpointed. Synchronised timing is therefore a vital component in maximising the efficiency of automated systems and in monitoring them so they can transparently account for their actions.
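One way to picture traceability is as a documented chain of clock comparisons, each with a measured offset and an uncertainty, running from the timestamping server back to the primary UTC source. The sketch below is a hedged illustration of that idea, not a description of any particular product or installation: the clock names, offsets, and uncertainties are invented, and the worst-case summing of uncertainties is a deliberately conservative choice.

```python
# Hedged sketch: modelling traceability as an unbroken chain of clock comparisons
# from the timestamping server back to a primary UTC source. If every link is
# documented, the end-to-end offset and uncertainty of a timestamp can be
# reconstructed and defended after the fact; a missing link breaks the chain.

from dataclasses import dataclass

@dataclass
class ComparisonLink:
    upstream: str          # the reference clock compared against
    downstream: str        # the clock being disciplined
    offset_us: float       # measured offset of downstream relative to upstream
    uncertainty_us: float  # uncertainty of that comparison

chain = [
    ComparisonLink("national UTC laboratory", "grandmaster clock", 0.2, 0.1),
    ComparisonLink("grandmaster clock", "PTP boundary clock", 1.5, 0.5),
    ComparisonLink("PTP boundary clock", "trading server clock", 4.0, 2.0),
]

total_offset = sum(link.offset_us for link in chain)
total_uncertainty = sum(link.uncertainty_us for link in chain)  # conservative worst-case sum

print(f"Server clock traceable to UTC: offset ~ {total_offset} us +/- {total_uncertainty} us")
```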

Unreliable timing = low trust

Our world is increasingly governed by events that happen on computers, and the only record of what happened is the data they create. Yet that data cannot automatically be trusted: it may be incorrect, or it may have been manipulated.

Regulations such as MiFID II and CAT recognise the need for trustworthy data. In order to establish causality in cascading financial events, they require that computers’ clocks are synchronised to UTC, in some cases to within 100 microseconds. At least where causality applies, it quickly becomes clear when data is wrong, because it ceases to make sense.

When an event on a computer will have an effect in the real world, it should be recorded in such a way that it can be audited to confirm when and where the digital event happened. This needs to go right down to the edge, which might be a consumer’s web browser or an IoT device whose data is used as evidence in court. Without such a system, the records created carry no evidential weight. It must be possible to audit the digital business world in as much detail as accountants audit the physical business world today.
