The Current and Future Landscape of Real-Time Enterprise Market Data Distribution

Distributing real-time market data across the enterprise is a complex proposition, with many moving parts. Banks and other financial institutions spend vast sums to ensure that the correct market data, the lifeblood of their trading activities, is sourced, normalised, controlled and distributed out to end users and applications in the timeliest manner, with minimal downtime.

Today, there are few vendors that offer end-to-end platforms for full enterprise market data distribution. The segment has long been dominated by what’s now known as the Real-Time Distribution System (RTDS) from Refinitiv, the former Thomson Reuters and now part of London Stock Exchange Group. RTDS was formerly – and perhaps more famously – known as the Thomson Reuters Enterprise Platform, or TREP, having evolved over the years from its previous incarnation, the Reuters Market Data System (RMDS), which itself grew out of the company’s Triarch architecture, originally launched in the ’80s and later augmented by Tibco Software’s Rendezvous (RV) messaging middleware.

Given the heritage – and age – of such platforms, how deeply embedded have they become within financial institutions today? How much of a priority is it for firms to modernise their legacy market data infrastructures, and what is involved in that? How have incumbent vendors themselves been updating their platforms to stay competitive in today’s markets, and how do they stack up against newer offerings from challenger vendors? And what can the cloud offer?

Embedded platforms

These are all pertinent questions when discussing enterprise market data platforms, but it’s important to consider the whole picture, says Mike Powell, CEO of trading technology vendor Rapid Addition, and formerly head of enterprise at Thomson Reuters/Refinitiv.

“What people don’t talk about so much is the content creation and publishing capability of platforms such as TREP,” he says. “When we look at market data, we tend to think about data being consumed from an exchange or venue, or maybe contributed data from banks or other sources. But there’s a significant volume of value-added pricing data, yield curves, risk numbers, analytics, and so on that are created internally by, for instance, a bank. So you need a mechanism to publish that data internally, control which applications consume it across the organisation, and publish externally from the platform to the bank’s clients, counterparties, third party liquidity platforms, and so on. That internal creation and publication of value-added data is just as important a feature of such platforms as the ingestion and normalisation of external market data, if not more so.”
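
As a rough illustration of the pattern Powell describes – publishing internally created, value-added data and controlling which applications may consume it – the sketch below models the idea in Python. The topic names, entitlement data and helper functions are all hypothetical; no particular vendor API is implied.

```python
import json
import time

# Hypothetical in-memory entitlement store: which internal applications
# may consume each internally published topic. A real platform backs
# this with a dedicated entitlements engine.
ENTITLEMENTS = {
    "internal.rates.yield_curve.USD": {"risk-engine", "swap-pricer"},
    "internal.analytics.var": {"risk-engine"},
}

def can_consume(app_id: str, topic: str) -> bool:
    """Check run by the platform before delivering a message to an app."""
    return app_id in ENTITLEMENTS.get(topic, set())

def publish(topic: str, payload: dict) -> str:
    """Serialise an internally created record for distribution on a topic."""
    return json.dumps({"topic": topic, "ts": time.time(), "payload": payload})

# A desk publishes a value-added yield curve it has built internally...
message = publish(
    "internal.rates.yield_curve.USD",
    {"tenors": ["1Y", "5Y", "10Y"], "rates": [0.051, 0.047, 0.045]},
)

# ...and the platform delivers it only to entitled consumers.
for app in ("risk-engine", "client-portal"):
    status = "delivered" if can_consume(app, "internal.rates.yield_curve.USD") else "blocked"
    print(app, "->", status)
```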

Another senior alumnus of Refinitiv’s enterprise business, Terry Roche, who is Co-Founder and CEO at Pegasus Enterprise Solutions, a challenger platform vendor that aims to transform market data services, gives his own take on the current landscape. “The industry has been controlled by both platform and data lock-in. We are assisting firms in unlocking their dependence on proprietary data and technology to create flexibility and independence that enables data sourcing in a manner beneficial to and determined by the customer.”

He continues, “Enterprise market data platforms are deeply embedded. Historically, it’s been exceptionally difficult to displace them. The cause was more to do with the lack of efficient APIs to enable migration. Pegasus’ MarketsIO API Suite, which is a multi-language, multi-platform API, gives front office development teams a single, efficient way to integrate with all their platforms. Once a client converts an application to our APIs, they can change platforms with a simple configuration change, realising significant efficiencies and flexibility.”

Mark Heckert, Chief Product Officer, Fixed Income and Data Services at Intercontinental Exchange (ICE), agrees that when incumbent platforms have been around for many years, they become deeply embedded as the backbone for distribution within the firm. “Particularly for data around real time price discovery,” he says. “Market data platforms have various characteristics that need to be considered if they’re to be replaced. The platform might come laden with a given symbology set, for example, which could be populated throughout the organisation. If you were to take out that platform, you’d need to be aware of all the downstream applications expecting that symbology and make the necessary changes. Also, there may be an entitlements engine that needs to know which downstream users are entitled to the various datasets.”
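
Heckert’s point about symbology can be made concrete with a small sketch. The mapping layer below is hypothetical – the internal identifiers are invented – but it shows the kind of translation shim downstream applications would need if the platform supplying their native symbology were taken out.

```python
# Hypothetical symbology map: platform-native codes on the left, the
# identifiers that downstream applications will need after a migration
# on the right (the internal IDs here are invented for illustration).
SYMBOLOGY_MAP = {
    "VOD.L": {"isin": "GB00BH4HKS39", "internal": "EQ-000123"},
    "IBM.N": {"isin": "US4592001014", "internal": "EQ-000456"},
}

def translate(native_code: str, target_scheme: str) -> str:
    """Map a platform-native symbol to the scheme a downstream app expects.

    Failing loudly on unknown symbols is deliberate: silent fallbacks are
    how mismatched symbology spreads through a migration unnoticed.
    """
    try:
        return SYMBOLOGY_MAP[native_code][target_scheme]
    except KeyError as exc:
        raise LookupError(f"no {target_scheme} mapping for {native_code}") from exc

print(translate("VOD.L", "isin"))  # GB00BH4HKS39
```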

Legacy architecture

With ageing technology underpinning some of these platforms, how much priority do firms place on replacing – or at least modernising – them, given how dependent they have become on these systems?

“Replacing an enterprise market data platform is not a priority until you hit capacity or you’re changing your architecture,” says the CEO of a leading enterprise data platform. “If you start seeing that you’re reaching any type of capacity, if you can’t move things in real time because the protocols are too old, or the system can’t cope with the volumes that you have, then that’s when you need to look at changing systems.”

Craig Schachter, Chief Revenue Officer at market data and trading technology specialist Exegy Inc., which last year acquired platform provider Vela, believes that firms are absolutely looking to remove legacy infrastructure across the board. “Not just to have a modern technology stack for front, middle and back office, but to allow them to evolve and continue to keep up with the market, and stay competitive,” he says.

“To rid yourself of legacy technology,” Schachter says, “you have to have integrations or adapters, so you can keep that infrastructure in place while migrating to newer technologies. Once you build that bridge and give clients access to the data that they need, either for the low latency stuff or for a shared infrastructure that is not as latency sensitive, you provide a path forward. And eventually, you can shut off some of those older pieces of infrastructure in more of a managed process.”
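
A minimal sketch of the bridge Schachter describes might look like the following, assuming an invented pipe-delimited legacy wire format and a simple normalised schema on the new side; both are illustrative only.

```python
from dataclasses import dataclass

# A tick as a hypothetical legacy feed might emit it: pipe-delimited
# text with vendor-specific field codes (format invented for illustration).
LEGACY_RECORD = "VOD.L|BID=98.52|ASK=98.54|TIME=1718000000"

@dataclass
class Tick:
    """Normalised schema assumed for the newer platform."""
    symbol: str
    bid: float
    ask: float
    ts: int

def adapt(legacy: str) -> Tick:
    """Translate a legacy wire record into the new schema.

    Keeping the translation in one adapter lets the old feed stay live
    while consumers migrate; the adapter is retired along with the feed.
    """
    symbol, *fields = legacy.split("|")
    kv = dict(field.split("=", 1) for field in fields)
    return Tick(symbol=symbol, bid=float(kv["BID"]),
                ask=float(kv["ASK"]), ts=int(kv["TIME"]))

print(adapt(LEGACY_RECORD))
```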

The costs – and associated risks – of maintaining these legacy systems are also major factors, adds Schachter. “Having to maintain something that is 30 or 40 years old is incredibly costly, not just in terms of the infrastructure in place, but also the people needed to support it, because the new engineers coming out of school don’t necessarily have those skills. Then there’s the inherent risk of retaining older infrastructure in the organisation. If newer technology goes down today, it’s pretty quick to have redundancy in place, so if there’s a blip somewhere, that’s okay. It’s not as painful as being out of the market for minutes, hours or longer, which is the risk with older infrastructure.”

Medan Gabbay, Chief Revenue Officer at multi-asset OMS/EMS trading technology provider Quod Financial, counters that there are also costs and risks involved when replacing embedded platforms. “If you’re going to change your market data platform, you then have the knock-on impact of having to ensure all your other systems, your OMS, your EMS, your back office, and so on, are all supported,” he says. “The impact of the change can be potentially quite damaging if all your trading systems break because you’re not supporting the correct codes, for example. You can’t make a change like that without huge internal project costs and it’s often not worth the risk. Very few are willing to pull the trigger, which is why these platforms have been in place for decades.”

Incumbent vendors are not resting on their laurels, however, and are certainly aware that customer needs are changing. “The appetite for the traditional model of having a consolidated feed plus managing multiple feed handlers, everything on-prem, supported by large teams requiring niche subject matter expertise, and relying on a large hardware footprint to cascade data across your ecosystem seems to be fading, versus the choice of running the application wherever and connecting to the data directly,” says Cory Albert, Global Head of Cloud Strategy, Enterprise Data at Bloomberg. “People want to run things in the cloud. It’s just easier to get the data directly to the application with that model, especially when you can do so without compromising speed, depth/breadth, or reliability.”

Modernisation

As new challenger platforms appear and start to take hold, how are the established vendors modernising their existing platforms to stay competitive and address the needs of their customers for the next 10, 20 or 30 years?

“With so many firms relying on our technology, we need to stay abreast of what they are experiencing, and what the industry themes and trends are that impact them,” says Matt Eddy, Head of Enterprise Integration Proposition at Refinitiv. “What are the latest toolkits that they use or the latest authentication platforms? How can we make sure that we’re adopting the standards that they need us to, so that we can address their needs? It’s a great responsibility to be powering all these different firms, but it’s a very transformational – and really exciting – time in the industry. We’re seeing so much more WebSocket connectivity coming in because it’s easy and lightweight, and it gets up and running really quickly. So supporting the new APIs and the new ways of coding is a necessity.”
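
A bare-bones example of the kind of WebSocket connectivity Eddy mentions is sketched below, using the open-source Python websockets library. The endpoint URI and message shapes are invented for illustration; each vendor’s WebSocket API defines its own login and subscription protocol.

```python
import asyncio
import json

import websockets  # pip install websockets

async def stream_quotes() -> None:
    """Subscribe to a hypothetical WebSocket market data endpoint.

    The URI and message shapes are invented for illustration; every
    vendor's WebSocket API defines its own login and subscription
    protocol, so consult the relevant documentation.
    """
    async with websockets.connect("wss://example.com/marketdata") as ws:
        await ws.send(json.dumps({"action": "subscribe",
                                  "symbols": ["VOD.L", "IBM.N"]}))
        async for raw in ws:  # the connection yields messages as they arrive
            quote = json.loads(raw)
            print(quote.get("symbol"), quote.get("bid"), quote.get("ask"))

if __name__ == "__main__":
    asyncio.run(stream_quotes())
```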

Jason West, Director, Real-Time Managed Distribution Services Business at Refinitiv, points out some of the benefits that moving towards a more modern architecture offers. “When CIOs and CTOs understand that the componentry on premise today has been containerised and is available in a cloud environment, they start pushing their developers to go that way. And the developers are happy because they’d rather code to something that’s either Azure, AWS or Google, because they understand the products associated with those CSPs. They don’t have to go through the internal coding process anymore and the sprints are a lot quicker. When the business asks for something, instead of it involving capex and taking nine months, now it can be developed in a sandbox in AWS for example, and spun up as an MVP or POC within days.”

Exegy’s Schachter stresses that it is important for incumbent vendors to provide their customers with a clear migration path. “They need to do that with as little friction as possible if they want to keep their products sticky for their customers,” he says. “Stickiness is all about the value that the customers are getting, regardless of whether the infrastructure is new or old. But in reality, most of it will need to be newer, because the newer technology enables customers to evolve and provide more value to their own end users and end customers.”

“If we look at the challenger firms that are rising, they often start from some kind of fintech niche,” says ICE’s Heckert. “Those fintechs might have an interesting data set, but often it’s not particularly well integrated into the consumer’s workflow. Also, it can be hard for consumers to link that to their broader data set. So if there are new data needs in the marketplace, vendors like ourselves have to be observant of those and make sure they’re integrated well with our data model, so the clients can consume it within the context of their broader data needs. We also have to make sure that we’re investing in newer data access methodologies, whether that be cloud hosted databases or modern APIs, to ensure that we’re at least as nimble as these fintechs.”

Flexibility and transparency are also key, adds Bloomberg’s Albert. “If you have a lot of integration options, if you’re flexible with your distribution technology and provide licensing models that are both transparent and aligned with distribution flexibility, that’s important for customers, especially the big banks. It’s something that they should be demanding of their market data providers, to be transparent and offer technology options that allow for integration with their preferred technologies. We try to be as open as possible with our technology solutions, by offering APIs and open symbology while integrating with many different vendors across capital markets. And openness is only going to continue to gain importance and traction, because as customers look towards the cloud and other technologies to refresh existing platforms or provide best of breed application environments, decisions are going to be made around the flexibility and transparency of the vendor.”

Roche stresses however that vendors’ APIs have to be up to the task. “The legacy APIs provided in the platform space have been very difficult to use, requiring significant work for a client to be able to deploy a new service or make a change,” he says. “We’ve built our API suites with the front office developer in mind, so that front office development teams can have one very easy to use and powerful interface to all the services that they’re delivering. That gives them more time to develop things for the front office, as opposed to worrying about managing and integrating across multiple platforms. We give clients the source code to our API suite, which they can now use to gain significant developer efficiencies and switch platforms with a simple configuration change.”
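
The general idea of switching platforms through configuration rather than code can be sketched as follows. This is not Pegasus’s MarketsIO implementation, just a minimal illustration of the pattern: application code targets one abstract interface, and a config entry selects the backend.

```python
from abc import ABC, abstractmethod

class MarketDataClient(ABC):
    """The single interface that front office code develops against."""

    @abstractmethod
    def subscribe(self, symbol: str) -> None: ...

class PlatformAClient(MarketDataClient):
    def subscribe(self, symbol: str) -> None:
        print(f"[platform-a] subscribing to {symbol}")

class PlatformBClient(MarketDataClient):
    def subscribe(self, symbol: str) -> None:
        print(f"[platform-b] subscribing to {symbol}")

BACKENDS = {"platform-a": PlatformAClient, "platform-b": PlatformBClient}

def client_from_config(config: dict) -> MarketDataClient:
    """Application code never names a platform; only the config does."""
    return BACKENDS[config["platform"]]()

# Switching platforms becomes a configuration change, not a code change.
client = client_from_config({"platform": "platform-a"})
client.subscribe("VOD.L")
```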

For established vendors, there are additional factors to consider when taking a more open approach, says the CEO of an enterprise data platform. “If you built your own protocol 20 or 30 years ago and it all works, but now you need to modernise because the customer wants to receive the data directly to the cloud, or you need to be able to scale, or you want cost efficiencies and the certainty that whatever happens your system will be able to grow, then you have to go to more open technologies,” he says. “As a vendor, if you do have to modernise your technology, the biggest challenge – because it’s not 1990 anymore – is you have to look at your solutions and make them more open. But as soon as you do that, you have the potential to lose the technical hold that you have over your customers.”

For challenger firms, the issue of control and entitlements also raises its head, points out Rapid Addition’s Powell. “Vendors might come up with a good API that is faster, more efficient, and simpler to use, but if they don’t offer all of the necessary control capabilities or certified reporting mechanisms to report usage, the third-party data source may make the commercial assumption that all staff within a firm touch the data and charge accordingly. That element is essential for controlling data access and managing market data costs.”
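
A toy version of the control-and-reporting layer Powell has in mind might look like this, with hypothetical users, datasets and a stubbed payload; the point is simply that every access is both entitlement-checked and counted, so actual usage can be reported to the data source.

```python
import time
from collections import Counter

# Hypothetical per-user entitlements to third-party datasets.
ENTITLED = {"alice": {"exchange-x.level1"}, "bob": set()}

# (user, dataset) -> access count: the raw material for usage reporting.
USAGE = Counter()

def get_quote(user: str, dataset: str, symbol: str) -> dict:
    """Serve data only to entitled users, recording every access.

    The usage log is what lets a firm report actual consumption back to
    the data source, rather than being charged on the assumption that
    all staff touch the data.
    """
    if dataset not in ENTITLED.get(user, set()):
        raise PermissionError(f"{user} is not entitled to {dataset}")
    USAGE[(user, dataset)] += 1
    return {"symbol": symbol, "ts": time.time(), "price": 98.53}  # stub payload

get_quote("alice", "exchange-x.level1", "VOD.L")
print(dict(USAGE))  # {('alice', 'exchange-x.level1'): 1}
```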

Controlling costs

This area of control and entitlements – i.e. who or what is using the data, and how it is being used – is a key aspect of an enterprise market data platform, and can have a significant impact on data costs, says Roel Mels, Global Head of Marketing at enterprise subscription management specialist TRG Screen. “Market data optimisation starts with knowing the full picture of your firm’s data landscape and ensuring the right data gets to the people who really need it,” he says. “A lack of transparency around data usage can also lead to firms inadvertently breaching compliance and copyright/distribution agreements, which naturally has significant financial and reputational implications.”

Heckert agrees. “You really need to understand not just where, but how the data is being used. The how is important because, depending on how you use data, sometimes it’s a very static and predictable process, where you have the same portfolio day in and day out for example, but there are other use cases where the use of data is much more ad-hoc and driven by a particular situation. In those instances, firms can run into uncontrolled costs because they didn’t realise they had that type of use case and the commercial model wasn’t well structured for it. Once you know all this, you can think about how to effectively structure your relationships with your technology platform providers and your data vendors.”

“One of the risks of having multiple third parties is the cost of data misuse, which is potentially huge, if you don’t know exactly what you’ve got and where it’s being used,” adds Gabbay. “So there’s almost an incentive that even if the monthly price is slightly higher, if you keep your access controls with your data providers, and you’re not using any third party sourcing, then at least you know you’re not in breach.”

With market data costs often running into eye-watering sums at large financial institutions, it is essential for firms to get a handle on this whole area, points out Mels. “What does spend tell you if you don’t have the actual usage stats behind it? Licensed products and services can be unused or underutilised. One needs to track and act upon that – they may qualify for removal, sharing or moving to a more cost-effective alternative. If you can’t see what you’re spending and the usage patterns behind it, you’re driving blind. If you don’t know how much you’re spending on market data, with whom you are spending it, who is consuming it and the usage patterns around it, how on earth can you manage it? Indeed, different areas of the business sometimes even buy or subscribe to the same or similar market data and feeds, oblivious to whatever else is going on in other parts of the business – scenarios that can be remedied through greater levels of transparency across the enterprise.”

Leveraging the cloud

Where does the cloud fit into all of this? As firms move more and more workloads to the cloud, is enterprise market data distribution now following a similar path? There are certainly benefits to be gained, but also challenges that need to be overcome, as Exegy’s Schachter highlights. “From a cloud perspective, being able to scale up and down as needed, at a fraction of the cost of the past, is a game changer,” he says. “Having something that can either reside natively in the cloud or that can leverage the cloud is really going to drive down costs. The real challenge with the cloud is moving large quantities of latency sensitive data around. That’s a key issue that still needs to be figured out.”

“Streaming data such as prices with low latency is important, but so is fast API access to all data regardless of update frequency,” contends Bloomberg’s Albert. “That’s becoming more and more important in the customers’ decision-making processes. That’s where things like Google BigQuery, Azure Synapse, AWS Redshift, Snowflake and all these other tools come into their own, to play off those data sets.”

Heckert suggests that one of the real benefits of the cloud is its elasticity for activities and jobs that are more ad-hoc in nature. “You can spin up cloud-enabled compute platforms to consume that data as and when needed,” he says, adding, “another interesting aspect of cloud, particularly around the area of cloud-hosted databases, is the concept that not only can you access a provider’s data, you can access your own data or even another provider’s data, join those into one table, and make determinations in that way. That’s really intriguing because in the past it would have involved data warehousing and integrating the disparate datasets into some kind of framework. Now you can access the data and build applications around that data in a way that wasn’t possible before.”
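
The kind of cross-dataset join Heckert describes can be sketched in a few lines. In the example below, sqlite3 stands in for a cloud-hosted warehouse so the code runs anywhere; the table and column names are invented for illustration.

```python
import sqlite3

# sqlite3 plays the role of a cloud-hosted warehouse here, holding both
# a provider's shared dataset and the firm's own table side by side.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE provider_prices (symbol TEXT, close_price REAL);
    CREATE TABLE own_positions   (symbol TEXT, quantity INTEGER);
    INSERT INTO provider_prices VALUES ('VOD.L', 98.53), ('IBM.N', 171.20);
    INSERT INTO own_positions   VALUES ('VOD.L', 1000);
""")

# One query joins the provider's data with the firm's own positions,
# with no separate data warehousing or ETL step to integrate them first.
rows = conn.execute("""
    SELECT p.symbol, p.close_price * o.quantity AS market_value
    FROM provider_prices AS p
    JOIN own_positions   AS o ON p.symbol = o.symbol
""").fetchall()
print(rows)  # [('VOD.L', 98530.0)]
```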

“With the cloud, you don’t want to just lift and shift antiquated Jurassic applications,” adds Refinitiv’s West. “If you’re leaving your database on premise for example, don’t move the application, because the ingress and egress costs will be too big. We try to educate our clients on how to migrate to the cloud, and help them all the way through the steps of those migrations, because we’ve already experienced the gotchas that some people might face when migrating on their own. Everyone’s journey is different, and everyone’s end state is different. So the challenge is trying to cater for all of those.”

“Cloud and zero footprint solutions have been a major focus for us,” says Bloomberg’s Albert. “Our customers’ use of cloud is constantly changing. They’re looking at hybrid models and multi-cloud models much more now. That technology evolution is going to continue to need investment, and it’s definitely an area where we are heavily investing, both through partnerships and with our own technology teams.”

One industry expert adds a final warning note, however. “Cloud gives a lot of flexibility. But if you start having agents connected left and right, and you don’t have a total view of the data, there’s always going to be a risk of unexpected usage, meaning that your cost savings are going to go down the drain. That’s the downside of the cloud, it’s more difficult to budget. As opposed to having large fixed costs for infrastructure, it’s a smaller fixed cost, but usage costs could unexpectedly go up.”
