
Cloud-Delivered Market Data for Institutional Users – A Reality Check


By Mike O’Hara, Special Correspondent.

Cloud-delivered market data was once ‘over my dead body’ territory for institutional market data managers, who understandably fretted aloud about performance, security and licence compliance. But Covid-19 has forced those same managers to confront the fact that many of their professional market data users can work from home (WFH). That realisation, in turn, is driving financial firms to ask whether the pandemic could be the catalyst for a rethink of their expensive-to-maintain market data infrastructures, with cloud part of the data delivery solution.

For many financial firms, today’s cloud delivery and hosting capabilities offer a viable way to support trading and investment teams and their support staff, accelerating demand for cloud-based market data delivery infrastructures. The thinking is that cloud may help firms with their broader aim of reducing their on-premises technology and equipment footprint, a trend that was emerging even before the coronavirus struck.

But embracing cloud delivery introduces new challenges for market data and trading technology professionals. While WFH will doubtless continue in some form, it’s far from clear that all market data delivery can be migrated to the cloud. Essential market data functions will remain on-premises. High-performance trading applications and low-latency market data connectivity, for example, will continue to rely on state-of-the-art colocation and proximity hosting data centres.

For many financial institutions, the challenge will be how to manage these multiple tiers of market data delivery and consumption. Going forward, practitioners will face a three-way hybrid of on-premises, cloud-based (private/public) and colocated market data services in order to support a range of users: from work-from-home traders and support staff, to trading-room-based traders, analysts and quants, to colocated electronic applications like algorithms, smart order routers and FIX engines.

Indeed, A-Team will be discussing the infrastructure, connectivity and market data delivery challenges associated with cloud adoption in a webinar panel session on November 3. The webinar will offer a ‘reality check’ that discusses best practices for embracing cloud, colo and on-prem to support this new mix of user types, with emphasis on capacity, orchestration, licensing, entitlements, and system and usage monitoring.

With firms’ appetite for exploring the potential of the cloud piqued, data managers are now looking at whether they can hope to take advantage of some of the more widely recognised benefits of the cloud – flexibility, agility, speed-to-market, scalability, elasticity, interoperability and so on – as they grapple with the future market data delivery landscape.

“Market data infrastructure, in terms of data vendor contracts, servers, and data centre space, typically represents a large, lumpy capital expenditure”, says independent consultant Nick Morrison. “And so having the ability to transition that to something with more elastic costs is highly attractive”.

Of course, every firm has its own unique requirements and nuances in this regard. Proprietary trading firms, asset managers, hedge funds, brokers and investment banks are all heavy consumers of market data. But the volume, breadth, depth and speed of the data they need in order to operate is highly diverse, which means there is no ‘one size fits all’ when it comes to sourcing and distribution mechanisms (including the cloud).

Market data and the cloud – what’s applicable?

As they consider their options for including cloud in their overall data delivery plans, data managers need to assess whether and how specific data types could be migrated to a hybrid environment: Level 1 (best bid/offer), Level 2 (order book with aggregated depth at each price level) or Level 3 (full order book)? Historic, end-of-day, delayed or real-time? Streaming or on-demand? This all has a bearing on the feasibility of cloud as a delivery mechanism.
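To make that triage concrete, here is a minimal sketch, in Python, of how these dimensions might be captured when assessing a feed’s suitability for cloud delivery. The classes and the feasibility rule are illustrative assumptions, not a vendor or industry standard.

```python
from dataclasses import dataclass
from enum import Enum

class Depth(Enum):
    LEVEL_1 = 1  # best bid/offer
    LEVEL_2 = 2  # order book, aggregated depth per price level
    LEVEL_3 = 3  # full order book

class Timeliness(Enum):
    HISTORIC = "historic"
    END_OF_DAY = "end_of_day"
    DELAYED = "delayed"
    REAL_TIME = "real_time"

@dataclass
class FeedRequirement:
    depth: Depth
    timeliness: Timeliness
    streaming: bool        # streaming push vs on-demand request/response
    max_latency_us: int    # latency tolerance of the consuming application

def cloud_candidate(req: FeedRequirement) -> bool:
    """Illustrative triage rule: anything that is not real-time is a
    plausible cloud candidate; real-time feeds qualify only if the
    consumer tolerates millisecond-level (not microsecond) delivery."""
    if req.timeliness is not Timeliness.REAL_TIME:
        return True
    return req.max_latency_us >= 1_000  # >= 1 ms tolerance (assumed threshold)

# Example: delayed Level 1 quotes for a risk dashboard -> True
print(cloud_candidate(FeedRequirement(Depth.LEVEL_1, Timeliness.DELAYED,
                                      streaming=False, max_latency_us=500_000)))
```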

Firms also need to consider what mix of public, private or hybrid cloud best fits their needs. What about virtualisation? Or internal use of cloud architecture, such as building a market data infrastructure around microservices and containers?

The marketplace has already identified at least one workable use-case: the use of historical, tick or time-series market data, usually to drive some form of analytics. A growing number of trading venues (such as ICE and CME) and service providers (Refinitiv, BMLL and others) now offer full Level 3 tick data on a T+1 basis, delivered via the cloud. Plenty more providers can offer historic Level 1 and Level 2 data.

This kind of capability can be used for critical use-cases, such as back-testing trading models for signal generation and alpha capture, performing transaction cost analysis (TCA), developing and testing smart order routers (SORs), or fine-tuning trading algos to better source liquidity. In all of these cases, cloud-hosted historical tick databases can reduce on-premises footprint and cost while offering flexible access to vast computing resources on demand, a combination many firms find compelling. “When churning through such vast quantities of data, having access to a cloud environment enables you to scale up horizontally to process that data”, says Elliot Banks, Chief Product Officer at BMLL.
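By way of example, here is a hedged sketch of one such analysis: computing simple arrival-price slippage for TCA against a cloud-hosted T+1 tick file. The storage path and column layout are hypothetical; actual vendor delivery formats (ICE, CME, Refinitiv, BMLL) differ.

```python
import pandas as pd

# Hypothetical object-store layout; with s3fs installed, pandas can read
# such paths directly. Real vendor file layouts will differ.
TICKS_URI = "s3://example-tick-store/2020-10-30/VOD.L.parquet"

def arrival_slippage_bps(fills: pd.DataFrame, ticks: pd.DataFrame) -> pd.Series:
    """Slippage of each fill versus the prevailing mid at order arrival,
    in basis points. Expects fills[arrival_ts, price, side(+1/-1)] and
    ticks[ts, bid, ask]."""
    ticks = ticks.sort_values("ts")
    ticks["mid"] = (ticks["bid"] + ticks["ask"]) / 2
    # As-of join: the last quote at or before each order's arrival time.
    merged = pd.merge_asof(fills.sort_values("arrival_ts"),
                           ticks[["ts", "mid"]],
                           left_on="arrival_ts", right_on="ts")
    return (merged["price"] - merged["mid"]) / merged["mid"] * 1e4 * merged["side"]

# Horizontal scaling is then a matter of fanning symbols/days out across
# cloud workers, each running this same function on its own slice:
# ticks = pd.read_parquet(TICKS_URI)
```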

Where things start to get more complicated, though, is with real-time market data, where two of the biggest hurdles from a cloud delivery perspective are speed and complexity.

Deterministic speed

From a trading standpoint, speed is always going to be a significant factor. Nobody, regardless of whether they’re an ultra-low latency high-frequency trading firm or a human trader dealing from a vendor or broker screen, wants to trade on stale prices. The tolerances may be different but the principle applies across the board.

It’s a safe bet that any firm currently receiving market data directly from a trading venue into a trading server (colocated at the venue’s data centre or hosted at a specialist proximity hosting centre operated by the likes of Interxion) relies on deterministic low latency, and is therefore unlikely to consider cloud as an alternative delivery mechanism.

Clearly, HFT firms with trading platforms that require microsecond-level data delivery won’t be replacing their direct exchange feeds and often hardware-accelerated infrastructure with the cloud, as the performance just isn’t there, for now at least. This, of course, could change if and when the trading venues themselves migrate to cloud platforms, creating a new kind of colocation environment, but that’s likely some way off. “But these guys only have a few applications that really need ultra-low latency data”, says Bill Fenick, VP Enterprise at Interxion. “Most of their applications, be they middle office, settlements or risk, they’re perfectly happy to take low-millisecond latency”.

And what about other market participants? Particularly those that currently make use of consolidated feeds from market data vendors, where speed is perhaps a secondary consideration? This is where cloud delivery may have some real potential. But it’s also where the issue of complexity rears its head.

Navigating the complexity

To deal with the myriad sources, delivery frequencies, formats and vendor connections used to feed real-time market data into their trading, risk, pricing and analytics systems, many financial firms have built up a complex mesh of infrastructure that ensures the right data gets delivered to the right place at the right time. The integration layer required to handle these data inputs may be delivered as part of the data service or may stand alone as a discrete entity. In either case, it’s unrealistic to expect that all of this infrastructure can simply be stripped out and replicated in a cloud environment.

To address this challenge, some service providers are starting to offer solutions where the source of the data is decoupled from the distribution mechanism, aiming for the holy grail where either, or both, can be cloud-based.

By building individual cloud-hosted microservices for sourcing market data, processing that data in a variety of ways, and delivering it into end-user applications, such solutions can help firms migrate their market data infrastructure incrementally from legacy to cloud-based platforms. Refinitiv is starting to shift much of its infrastructure onto AWS, while specialist cloud-centric vendors such as Xignite and BCC Group enable internal systems to be decoupled from data sources, facilitating a shift towards cloud-based infrastructure. “We believe the customer should be able to easily move from source to source and get as many sources as they want. The cloud enables this kind of flexibility”, says Bill Bierds, President & Chief Business Development Officer at BCC Group.
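The decoupling pattern itself can be sketched simply. The interface below is an illustrative assumption rather than any vendor’s actual API: each source is wrapped behind a common adapter, and consumers bind only to an internal distribution layer, which could itself run on-prem or as a cloud microservice.

```python
from abc import ABC, abstractmethod
from typing import Callable, Dict, List

Tick = Dict[str, float]  # simplified internal model, e.g. {"bid": ..., "ask": ...}

class FeedAdapter(ABC):
    """Vendor-neutral source interface; consumers never see vendor formats."""
    @abstractmethod
    def subscribe(self, symbol: str, on_tick: Callable[[Tick], None]) -> None: ...

class VendorAFeed(FeedAdapter):
    # In practice: a wrapper translating a TREP/B-Pipe-style wire format
    # into the internal Tick model. Stubbed here with a single tick.
    def subscribe(self, symbol: str, on_tick: Callable[[Tick], None]) -> None:
        on_tick({"bid": 100.00, "ask": 100.02})

class Distributor:
    """Internal fan-out layer; a candidate cloud-hosted microservice."""
    def __init__(self, source: FeedAdapter) -> None:
        self.source = source  # swapping vendors is a one-line change here
    def route(self, symbol: str, consumers: List[Callable[[Tick], None]]) -> None:
        self.source.subscribe(symbol, lambda t: [c(t) for c in consumers])

Distributor(VendorAFeed()).route("VOD.L", [print])
```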

Firms have long wanted to become more vendor-agnostic by decoupling their data integration capability from the primary data source. One investment bank in London, for example, was able to decouple Refinitiv’s TREP platform from its Elektron data feed and switch to Bloomberg’s B-Pipe for its data, delivered via the TREP framework. From a market data perspective, this has given the bank more negotiating power and less vendor lock-in, opening up greater opportunities to utilise cloud-based market data sources in the future.

Permissioning and entitlements

Perhaps one of the toughest challenges that firms face around real-time market data on the cloud is that of entitlements and usage authorisation. Firms sourcing data from the two main data vendors, Refinitiv and Bloomberg, will generally be tied into their respective DACS and EMRS entitlements systems, often augmented by data inventory and contract management platforms like MDSL’s MDM or TRG Screen’s FITS and InfoMatch.

Entitlements can be a thorny subject when it comes to cloud-based distribution of market data. Firms are wary of falling foul of their licence agreements with their various data vendors, all of whom have different commercial considerations and penalties for non-compliance. This is why accurate tracking and reporting of market data access and usage is crucial.

The cloud can be a double-edged sword in this regard. On the one hand, transitioning from a dedicated infrastructure to the cloud might trigger extra licensing costs for what is effectively an additional data centre, meaning firms may go through a period of paying twice for the same data. Indeed, firms may already be facing this situation as they entitle staff to operate from home while holding enterprise licences covering only their headquarters and regional offices.

On the other hand, cloud-based services such as those offered by Xignite and others can make it easier for firms to manage entitlements across multiple data vendors from a central source via a UI. “Our entitlements microservice is integrated with our real time microservice, to make sure that any distribution and any consumption of data is authenticated and entitled properly, so that only the right users have access to the data,” says Stephane Dubois, CEO of Xignite, whose microservices suite is supporting NICE Actimize’s cloud-based market data delivery infrastructure.
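As a rough illustration of the pattern Dubois describes, the gate below sits between distribution and consumption, giving a single choke point for both enforcement and usage reporting. The permission store and service names are hypothetical; this is not DACS, EMRS or Xignite’s actual API.

```python
# Hypothetical permission store: user -> set of entitled services.
ENTITLEMENTS = {
    "trader_42": {"LSE.L1.REALTIME", "LSE.L2.REALTIME"},
    "quant_07": {"LSE.L1.DELAYED"},
}

class NotEntitled(Exception):
    pass

def log_usage(user: str, service: str) -> None:
    # Stand-in for an audit store feeding licence/usage reporting.
    print(f"usage: {user} accessed {service}")

def serve_tick(user: str, service: str, tick: dict) -> dict:
    """Every distribution path passes through this one check, so access
    is both enforced and logged before data reaches the consumer."""
    if service not in ENTITLEMENTS.get(user, set()):
        raise NotEntitled(f"{user} is not entitled to {service}")
    log_usage(user, service)
    return tick

serve_tick("trader_42", "LSE.L1.REALTIME", {"bid": 100.00, "ask": 100.02})
```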

Where next?

With new products, services and technologies emerging all the time, firms can be optimistic about the growing opportunities that the cloud can offer for managing market data. One particularly interesting development worth watching is the rise of Low Code Application Platforms (LCAPs), such as that offered by Genesis, which provides a cloud-based microservices framework that can be used for rapidly developing and delivering applications around real-time market data. One example is on-demand margining. “A prime broker can link to all of its customers and know exactly what their risk positions are based on real-time market data, so within minutes, they can be sending out margin calls”, says Felipe Oliviera, Head of Sales and Marketing at Genesis.
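A toy version of that margining logic, to illustrate the idea only; the function, rates and numbers are invented for the example and this is not Genesis’s actual platform.

```python
def margin_shortfall(positions: dict, prices: dict,
                     margin_rate: float, collateral: float) -> float:
    """Gross exposure at live prices times a flat margin rate, less posted
    collateral; a positive result is the margin call amount."""
    exposure = sum(abs(qty) * prices[sym] for sym, qty in positions.items())
    return max(0.0, exposure * margin_rate - collateral)

# Re-run on every market data update, per client:
call = margin_shortfall(positions={"VOD.L": 10_000, "BP.L": -5_000},
                        prices={"VOD.L": 1.05, "BP.L": 2.10},
                        margin_rate=0.15, collateral=1_000.0)
if call > 0:
    print(f"Margin call: {call:,.2f}")  # in practice: notify the client
```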

Industry behemoths such as Refinitiv, SIX and FactSet are also embracing the cloud. Refinitiv has launched delivery of market data via AWS, is making its tick history data available on Google Cloud, and has recently announced a partnership with Microsoft Azure. FactSet has launched a cloud-based ticker plant on Amazon EC2. SIX is partnering with Xignite for real-time market data delivery via the cloud, and Bloomberg is partnering with AWS to make its B-Pipe data feed available through the cloud. And the main cloud vendors themselves – Amazon, Google and Microsoft – have established dedicated teams to develop these markets.

In conclusion, it’s clear that firms still face a number of challenges when transitioning any part of their market data infrastructure to the cloud. And in many cases, particularly where ultra-low latency is required, cloud is not the answer. But equally, by migrating certain elements of their market data infrastructure to the cloud, firms can cut costs, gain efficiencies and potentially do more with less.
