The knowledge platform for the financial technology industry

A-Team Insight Blogs

The Growing Adoption of Time-as-a-Service in Electronic Financial Markets


In electronic financial markets, time plays a critical role. The speed at which market data is processed, orders are generated and routed, and trades are executed is today measured in milliseconds, microseconds and – in some cases – nanoseconds.

This has led to the need for more granular measurement of time, more precision in the time stamping of events, and more accuracy in the synchronisation of clocks across servers used by firms and venues when trading in multiple global locations, in order to determine the exact sequence of what happened where and when.

From a regulatory perspective, the accurate measurement of time has become increasingly important since MiFID II was introduced in Europe in 2018, which set out specific standards for time stamping accuracy: within 100 microseconds for high-frequency trading (HFT), within one millisecond for standard electronic trades, and within one second for voice trades. In addition, the directive states that every time stamp must be traceable back to Coordinated Universal Time (UTC).
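These divergence limits can be expressed as a simple compliance check. The sketch below is illustrative only – the names and structure are assumptions, not any regulator's or vendor's API – but the thresholds are the ones described above.

```python
from datetime import timedelta

# MiFID II maximum divergence from UTC per trade type, as described above.
# Dictionary keys and function name are illustrative, not from a real API.
MAX_DIVERGENCE = {
    "hft": timedelta(microseconds=100),
    "electronic": timedelta(milliseconds=1),
    "voice": timedelta(seconds=1),
}

def is_compliant(trade_type: str, measured_offset: timedelta) -> bool:
    """True if the clock's measured offset from UTC is within the
    maximum divergence allowed for this trade type."""
    return abs(measured_offset) <= MAX_DIVERGENCE[trade_type]

print(is_compliant("hft", timedelta(microseconds=80)))        # within 100 us
print(is_compliant("electronic", timedelta(milliseconds=2)))  # exceeds 1 ms
```

Note that MiFID II also mandates minimum timestamp granularity per trade type; this sketch covers only the divergence-from-UTC requirement.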

However, complying with regulations is not the only reason why accurate time stamping has become increasingly important in today’s fast-moving, data-driven markets. For HFT firms, a lot can happen in 100 microseconds, and even for ‘standard electronic trading’, a millisecond can be a long time. Many firms are therefore constantly striving to improve the accuracy, precision and traceability of the time sources they use, and are looking at how this might be achieved in the simplest, most cost-effective way.

Rather than trying to do everything themselves, can ‘Time as a Service’ (TaaS) offerings help firms achieve these goals, particularly in cloud-based and distributed environments? If so, what are some of the use cases for TaaS? From a practical perspective, how do firms deploy and integrate such services within their trading infrastructure? And how might TaaS evolve as banks, brokers, trading firms, exchanges and infrastructure providers move towards the cloud?

What is driving the demand for TaaS?

“There’s been a steady increase in interest in different things that you can do with time over the past five or six years, particularly since MiFID II was introduced,” says Micah Kroeze, Chief Product Officer at Options Technology, a provider of cloud-enabled managed services to the global capital markets. “At the time, it had the most stringent requirements, as US-based regulations for time compliance weren’t yet to that same level of granularity and accuracy. Once MiFID II came out, all of a sudden people began paying more attention.”

One area of particular concern to both regulators and market participants is the sector’s reliance on GPS as the source for time.

“Traditionally, people would have an antenna on the roof of the data centre connected to one of the satellite systems, usually GPS,” says Tim Richards, CEO of Hoptroff, a technology company that delivers software-based timing solutions. “And there’s a concern that GPS and GNSS satellite groups are becoming more and more vulnerable, through jamming, spoofing, hacking, and increasingly space weather. Different governments and organisations are starting to get more and more strident about their concerns around GNSS satellite vulnerabilities. In the US, executive order 13905 – issued in 2020 – states that if you operate critical infrastructure and rely on GPS/GNSS for your timing signal, you must find a GNSS-independent alternative, so this is an area where risk officers are starting to get involved because of the obvious concerns.”

Richards explains that Hoptroff’s Traceable Time as a Service (TTaaS) offering addresses these concerns by delivering a SaaS-based solution that uses multiple timing sources. “We send a time feed to a customer’s preferred location (data centre, cloud, or on-prem; the service is agnostic), which consists of at least four different sources, which may include three GNSS timing sources, as well as a terrestrial time source, which might be something like NIST in the US, or RISE in Europe. The fact that we have that number of time sources builds resilience into the time feed and makes it much more robust than relying on an antenna on the roof pointing at one satellite. And it’s not localised, it’s available anywhere that is connected to our time feed.”
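The resilience argument can be illustrated with a minimal sketch. The figures below are hypothetical, and taking a simple median is only one way a multi-source service might combine feeds, but it shows why four independent sources tolerate a fault that a single rooftop antenna cannot.

```python
import statistics

# Hypothetical clock offsets (in microseconds) measured against four
# independent sources: three GNSS feeds plus a terrestrial source such
# as NIST. One GNSS feed is badly off, as a spoofed signal might be.
offsets_us = {"gnss_a": 3.1, "gnss_b": 2.8, "gnss_c": 2500.0, "nist": 3.4}

# The median tolerates a single faulty source; a lone antenna tracking
# one satellite system has no such cross-check.
consensus = statistics.median(offsets_us.values())
print(f"consensus offset: {consensus} us")  # 3.25 us, outlier ignored
```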

There are other benefits that a services-based solution offers besides resilience, says Leon Lobo, Head of the National Timing Centre at the UK’s National Physical Laboratory (NPL).

“One thing we’ve learned is that there is a significant sparsity of skills in this area, and in the run up to MiFID II we saw a whole range of operations and skill sets being pushed towards services,” he says. “From a timing perspective, rather than manage and maintain capability in house, it is much simpler to access a service provided into the infrastructure, and effectively hand off that responsibility to the service provider. One key requirement for GNSS is around ease of access and addressing challenges around roof access for GPS for example, and as such there’s a drive towards accessing time over the network. Those services need to deliver time as a signal into the trading infrastructure with traceability to UTC and with resilience embedded for the organisation to have confidence that they will always have the level that they require. That’s been a key driver and as a result, there’s been more and more services developed, with lots of added value being provided, in addition to just the time provision itself.”

TaaS use cases

In the wake of regulations such as MiFID II, TaaS offerings have become more widely adopted across the market. But besides regulatory compliance, what are some of the other use cases?

“Any system in the trading lifecycle needs time services,” says Kroeze. “Whether you’re an exchange running a matching engine, whether you are a broker validating trades coming in from your underlying clients and sending those out to the exchanges, or whether you’re running a trading platform yourself, every single step of the way you need to be compliant to time, both from a regulatory perspective and internally. When you are executing a trade, you want to know for sure that you are executing on the best possible market information, so you need to be in sync with the exchange or with other third parties whose information you’re receiving as a trigger to place a trade. You need to know that the time they see is the same as the time you see, so you know exactly how much time has elapsed between receiving the data and placing the trade, and to know that you’re in sync. At any point of that trading lifecycle, as with any specialist technology, there are varying levels of expertise and operational rigour and costs that come into play when you deploy it. Frequently, it makes a lot more sense to go to a specialist third party that provides time services, who you know is going to be configuring that time network correctly and is going to be able to provide a very high-quality time service and time source that is ultimately cheaper and more stable and simpler than doing it yourself.”
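Kroeze's point about knowing "exactly how much time has elapsed between receiving the data and placing the trade" only holds if both timestamps come from clocks synchronised to the same reference. A minimal sketch, with entirely hypothetical timestamps:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical UTC timestamps: one stamped by the exchange on a market
# data message, one stamped locally when the order is sent. Subtracting
# them is only meaningful because both clocks are synchronised to UTC.
data_received = datetime(2024, 3, 1, 9, 30, 0, 150, tzinfo=timezone.utc)
order_sent = datetime(2024, 3, 1, 9, 30, 0, 900, tzinfo=timezone.utc)

elapsed = order_sent - data_received
print(f"decision latency: {elapsed.total_seconds() * 1e6:.0f} us")
```

If the two clocks disagreed by even a millisecond – well within the MiFID II tolerance for standard electronic trading – a measurement at this scale would be meaningless.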

Other use cases revolve around low-latency networks, says Lobo, particularly in a distributed environment. “In order to provide the required service level around low latency, the underpinning capabilities are all around the measurement of time, in order to manage and maintain that. Timing and synchronisation have always been essential, but even more so now for the digital infrastructure that supports all these platforms. Because if those are not synchronised at a level that makes sense for the applications, the digital infrastructure falls apart. All of that is underpinned by timing and synchronisation.

“One area that is developing rapidly, and not just in the finance sector, is around distributed ledgers and blockchains,” he continues. “Having an assured capability around timestamps adds an entirely new layer of trust. A lot of work has been done around the security, the authentication and the immutability of distributed ledgers. But the timestamp and the time applied provides an entirely new layer that not many organisations have yet considered.”

Integrating TaaS into the trading infrastructure

For firms looking to utilise Time as a Service, what is involved from an implementation standpoint?

“It’s relatively straightforward to do. It depends on the time service that you’re taking,” says Kroeze. “With NTP (Network Time Protocol), everything tends to be configured to allow NTP flow as a baseline. PTP (Precision Time Protocol) is where you have a more interesting integration point. There are only a couple of things that you really need to keep in mind when you are connecting to Time as a Service. One is that you want your own network to be PTP-aware. You’re connecting to a TaaS time source once, and distributing that time from there downstream to multiple devices. You want to be running that through switches that are built specifically to measure and record the transit time of PTP packets to ensure your downstream clocks adjust for transit delay correctly. As soon as you have PTP packets in a queue with other traffic, you lose control of the accuracy, so you need the right equipment. Obviously if your equipment is being hosted by someone like Options, you don’t need to worry about it, because we take care of that piece for you. But when you take TaaS into your own network, you need the right equipment. Every single hop from the point that you take handoff to the server needs to be some form of boundary clock or transparent clock for PTP. If you have more than one switch, each switch needs to be configured as a transparent clock at minimum. Ideally, you want to have your perimeter configured as a boundary clock and then internally transparent clocks from there down to the underlying server. On the servers themselves, you ideally want a network card that’s capable of processing PTP packets, which would typically have its own onboard clock to synchronise to the PTP source and to the CPU clock. It sounds complex, but it’s not that many config points. You do need to make sure you get it right, however.”
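The arithmetic behind Kroeze's warning about queued PTP packets can be sketched as follows. In IEEE 1588, the slave computes its offset from the master using four timestamps and assumes the path delay is symmetric; a PTP-aware (transparent) switch stamps its queueing residence time into the message's correctionField so the slave can subtract it. The numbers below are invented, but they show how an unmeasured queueing delay corrupts the result.

```python
def ptp_offset_ns(t1, t2, t3, t4, corr_ms=0, corr_sm=0):
    """Slave offset from master (ns) via the standard PTP exchange:
    t1 = Sync sent (master), t2 = Sync received (slave),
    t3 = Delay_Req sent (slave), t4 = Delay_Req received (master).
    corr_ms / corr_sm are correctionField totals accumulated by
    transparent clocks (switch residence time) in each direction."""
    return ((t2 - t1 - corr_ms) - (t4 - t3 - corr_sm)) / 2

# True offset is 0, symmetric wire delay is 500 ns, but the Sync message
# sat 2000 ns in a switch queue on its way to the slave.
t1, t2 = 0, 0 + 500 + 2000       # master -> slave, queued
t3, t4 = 10_000, 10_000 + 500    # slave -> master, no queueing

print(ptp_offset_ns(t1, t2, t3, t4))                # 1000.0 ns apparent error
print(ptp_offset_ns(t1, t2, t3, t4, corr_ms=2000))  # 0.0, queueing corrected
```

With a non-PTP-aware switch there is no correctionField to apply, so the 2,000 ns of queueing shows up as a spurious 1,000 ns clock offset – exactly the loss of accuracy Kroeze describes.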

What does the future hold for TaaS?

As banks, brokers, trading firms, exchanges and infrastructure providers move more of their operations and infrastructure towards the cloud, how might Time as a Service evolve?

“What financial services businesses don’t want to do is have isolated islands of activity; they want to make sure that everything is synchronised as they move away from owning tin in data centres and move more into the virtual world, so I expect to see a growth in the scale of the estates that people want to synchronise,” says Richards. “Being able to take snapshot views of your estate, having a global footprint or a regional footprint with data centres in many different locations, all fully synchronised and all very clearly visible in terms of their performance is really attractive, so I expect things are going to continue to grow in that way. And as their estates expand and spread around the world, it’s very important for firms to be able to have the capability to synchronise those.”

However, the cloud does raise some – as yet – unanswered questions, points out Kroeze. “As financial services in general is beginning to expand more and more to public clouds, there’s a desire to have a high level of time accuracy within public clouds as well,” he says. “But there are significant challenges around running PTP effectively in public clouds. It’s something that each major cloud provider is working on addressing, so I expect it’s something they will ultimately solve for themselves directly because it’s such a fundamental part of not just financial services but any sort of precision-based industry.”

Lobo concludes by setting out NPL’s vision. “Our vision for the UK moving forward is that time is delivered as a utility, as widely available, assured services, with all the trust and confidence at the point of provision and with all the metrics behind it, so that the user can just use it wherever they need to, at the level they need, everything being taken care of in the background. NPL will be managing the measurement standard, the national time scale, and we are going to be enabling industry to deliver new services. A lot of the discussions we’re having now with service providers are essentially to help them develop and deliver new forms of dissemination service, whether it’s fibre in the ground, over the internet, new RF broadcast systems, or mobile platforms, even including low earth orbit (LEO) constellations. We’re providing the source into all these different dissemination methods not only to embed traceability, but also because we see the route to resiliency as being through diversity of solutions with different failure modes. From that perspective, what we see coming is firms being able to access time in multiple locations, across different services, and by virtue of the embedded traceability, fully synchronised at that level, regardless of the access mechanism, such as RF broadcast in one location and fibre in another.”

