The knowledge platform for the financial technology industry

A-Team Insight Blogs

Why the Finance Industry Needs Traceable Timing Across All its Operations


By Richard Hoptroff, Founder and CTO of Hoptroff London.

As the virtual world expands, so do the applications for highly accurate, traceable time. Its most established industry is finance. If the clocks on servers drift and cannot be traced back to a verified source of Coordinated Universal Time (UTC), conflicts can arise. Orders can look as if they arrived before they were sent, and there is no way to determine whose time was correct. In light of these problems, regulators globally have become acutely aware of the importance of accurate, synchronised time. MiFID II and CAT regulations stipulate that the clocks on servers must not vary from UTC by more than a millisecond, and in some cases by more than 100 microseconds. This regulatory requirement goes a long way towards preventing the disputes mentioned above.
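The effect is easy to reproduce. A minimal sketch, assuming two hypothetical servers whose clocks drift a few milliseconds in opposite directions, shows an order whose receive timestamp precedes its send timestamp:

```python
from datetime import datetime, timedelta

# Hypothetical clock offsets: server A runs 3 ms fast, server B runs 2 ms slow.
OFFSET_A = timedelta(milliseconds=3)
OFFSET_B = timedelta(milliseconds=-2)

true_send = datetime(2024, 1, 15, 9, 30, 0, 1000)   # order sent (true UTC)
true_recv = true_send + timedelta(milliseconds=2)   # received 2 ms later

stamped_send = true_send + OFFSET_A   # timestamp written by the sender's clock
stamped_recv = true_recv + OFFSET_B   # timestamp written by the receiver's clock

# The recorded receive time now precedes the recorded send time:
# the order appears to arrive before it was sent.
print(stamped_recv < stamped_send)
```

With drift of only a few milliseconds, the logs alone cannot say whose clock was right; that is the dispute traceable UTC is meant to prevent.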

It is vital that accurate timing is also employed by banks and traders when communicating their news. When not just seconds, but milliseconds count, it is important that announcements are synchronised with one another to a high degree of accuracy. Applying the same level of accuracy to communications as is applied to the trades themselves ensures a level playing field on the stock market – if all media feeds are synchronised, no one party can have an advantage during a sensitive announcement.

Compliance and transparency

Compliance with MiFID II and CAT requires that all financial market participants have accurately synchronized timing on their trading servers. This time feed must be of sufficient accuracy and granularity for every event to be uniquely and traceably identified in a causal sequence, enabling accurate reconstruction of events after the fact. If the clocks on different systems drift to different degrees, the sequences of events recorded in the logs will appear in the wrong order. Timestamps from drifting clocks place events earlier or later in the sequence than was actually the case, latencies between events cannot be measured accurately, and distinct events may be assigned the same timestamp. When a single delivery chain can involve many servers, and that action is repeated thousands of times, it is clear how a lack of synchronization compromises the usefulness of timestamps when reconstructing a chain of automated activities after an event.
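Both failure modes fall out of a small sketch. Assuming hypothetical hops of a delivery chain stamped by millisecond-granularity clocks, one of which runs 700 microseconds slow, the later event records the earlier timestamp and the measured latency is wrong:

```python
# A drifting, coarse clock: true time plus drift, truncated to whole milliseconds.
def stamp_ms(true_us: int, drift_us: int) -> int:
    """Timestamp in milliseconds as a drifting clock of 1 ms granularity records it."""
    return (true_us + drift_us) // 1000

# Two hops of a delivery chain, 400 microseconds apart in true time.
e1 = stamp_ms(true_us=1_000_200, drift_us=0)      # gateway, accurate clock
e2 = stamp_ms(true_us=1_000_600, drift_us=-700)   # matching engine, 700 us slow

# The second event stamps earlier than the first, and the apparent latency
# is negative instead of the true +400 microseconds.
print(e1, e2, e2 - e1)
```

With both clocks held within 100 microseconds of UTC and finer-grained timestamps, the recorded order matches the causal order.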

Traceability is the process through which a timestamp can prove it is correct by showing an unbroken chain of comparisons from the timestamp, through the synchronization reference points, back to the primary time source. If two PTP (Precision Time Protocol) installations each have different timing records, the only way to resolve the variances is by checking the traceability of each system. Synchronized timing requires traceability so that the timestamps it produces are authoritative and can be used to resolve timing disputes with other parties. More broadly, knowing exactly when events happened allows us to detect their likely causes by examining the events immediately prior. In financial markets, for instance, the causes of flash crashes can be identified. Synchronized timing is a vital component in maximizing the efficiency of automated systems and monitoring them so they can transparently account for their actions.
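An unbroken chain of comparisons can be checked mechanically. The sketch below is illustrative only: the clock names and offsets are hypothetical, and a real record would carry measurement times and uncertainties as well.

```python
# Hypothetical traceability record: each hop compares one clock against its
# reference, ending at the primary UTC source.
chain = [
    {"clock": "trading-server",  "ref": "ptp-grandmaster", "offset_us": 12},
    {"clock": "ptp-grandmaster", "ref": "gnss-receiver",   "offset_us": 3},
    {"clock": "gnss-receiver",   "ref": "UTC",             "offset_us": 2},
]

def verify_chain(chain, primary="UTC"):
    """Return (is the chain unbroken?, total offset in microseconds from the primary source)."""
    linked = all(chain[i]["ref"] == chain[i + 1]["clock"]
                 for i in range(len(chain) - 1))
    anchored = chain[-1]["ref"] == primary
    return linked and anchored, sum(hop["offset_us"] for hop in chain)

print(verify_chain(chain))   # an unbroken chain, 17 us total offset from UTC
```

If any hop's reference does not match the next hop's clock, the chain is broken and the timestamp cannot prove its own correctness, which is exactly the situation when two PTP installations disagree.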

Unreliable timing = low trust

Our world is increasingly governed by events that happen on computers. The only record of what happened is the data they create. Yet that data cannot be trusted because it may be incorrect or may have been manipulated.

Regulations such as MiFID II and CAT recognise the need for trustworthy data. To establish causality in cascading financial events, they require that computers’ clocks be synchronised to UTC, in some cases to within 100 microseconds. At least where causality applies, it quickly becomes clear when data is wrong, because it ceases to make sense.
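That "ceases to make sense" test can itself be automated. A minimal sketch, assuming both clocks are certified to within a 100-microsecond tolerance of UTC, flags timestamp pairs that violate causality by more than the combined clock error:

```python
# Assumed certified bound on each clock's deviation from UTC, in microseconds.
TOLERANCE_US = 100

def causally_plausible(send_us: int, recv_us: int) -> bool:
    """An effect may not precede its cause by more than the combined clock error."""
    return recv_us >= send_us - 2 * TOLERANCE_US

print(causally_plausible(1_000_000, 1_000_050))   # plausible: within tolerance
print(causally_plausible(1_000_000, 999_700))     # implausible: data ceases to make sense
```

A receive stamp 300 microseconds before the send stamp cannot be explained by two clocks each within 100 microseconds of UTC, so at least one record must be wrong.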

When an event happens on a computer that will have an effect in the real world, it should be recorded in such a way that it can be audited to confirm when and where the digital event happened. This needs to go right down to the edge, which might be a consumer’s web browser or an IoT device whose data is used as evidence in court. Without such a system, the records created have no supporting evidence. It must be possible to audit the digital business world in as much detail as accountants audit the physical business world today.
