For decades, many capital markets firms have relied on legacy market data platforms that were built before the internet and have since become increasingly vendor-specific and costly to maintain. Some of these platforms, while functional, are no longer actively developed in meaningful ways, creating significant risks as expertise fades and costs spiral out of control. Incumbent vendors have maintained control through complex commercial policies, making it difficult for firms to switch providers without substantial operational disruption. At the same time, the industry’s growing need for automation, scalability, and compliance—particularly around entitlements, authentication, and data lineage—is exposing the limitations of some of these outdated systems.
The need for a more modern, independent, and flexible approach to market data infrastructure – one that allows financial institutions to consume and distribute both external and internal pricing data efficiently, without being locked into proprietary data vendors – is clear. With new solutions emerging that provide seamless backward compatibility and a more agile approach to data distribution, has the financial markets sector now reached a turning point – one where firms can finally take back control of their technology and reduce dependency on legacy providers?
In this Q&A with TradingTech Insight, Terry Roche, CEO of next-generation data technology provider MarketsIO, explores the deep-rooted challenges financial institutions face with legacy market data platforms, rising operational costs, and the urgent need for more efficient, independent, and scalable solutions to support the evolving demands of capital markets.
TTI: Welcome, Terry. What factors have kept the industry so heavily reliant on these outdated systems? And how can firms be convinced that transitioning away from them is both manageable and worthwhile?
TR: There are a couple of key points to consider when we talk about platforms. First, it’s important to recognise that much of the platform technology in use today was established in the early to mid-1990s. The most widely used platforms in the industry were conceptualised around 1994 or 1995. These platforms have been iterated on over the years, with millions of lines of code added, and they built what was once great technology. However, their ability to adapt and move forward is now severely limited by the complexity of their underlying codebase.
Historically, firms haven’t been overly concerned about this lock-in for several reasons. First, these platforms have, by and large, performed the functions they were designed to do. Second, the commercial arrangements with platform providers have typically been stable. But in recent years, we’ve seen a marked shift in the commercial stance of these providers. An approach narrowly focused on data sales has made it clear to many that a major data vendor serving as the distribution technology provider presents an inherent conflict of interest.
Many of our clients report being told they’re not on list price, resulting in significant cost increases—at times an order of magnitude higher. That’s certainly a wake-up call.
Fundamentally, the long period of stasis in this area comes down to two key factors. First, there’s never been a credible, fully featured, and capable alternative until now. Second, there’s never been an alternative that offers backward compatibility with clients’ existing systems. Added to this, APIs provided by market data vendors haven’t been easy to work with, which has fuelled the perception that migration would be a multi-year, resource-intensive project.
However, at MarketsIO, our EventStream platform addresses these issues. Because key vendors have open-sourced their APIs, we’ve been able to provide backward compatibility that significantly reduces migration complexity. Clients can trial our platform and see their existing data feeds, applications, and desktops work out of the box.
This shifts the risk equation. Firms are faced with a choice: remain on a 30-year-old platform whose commercial terms seem to change every year, or move to a modern technology platform. Our platform is about a tenth the size of the legacy codebase, significantly less expensive, and designed with ongoing investment and development in mind.
At this point, many clients have told us that the greater risk is staying on their outdated, legacy platform.
TTI: Regarding key vendors open-sourcing their APIs, how much of a prerequisite is that for achieving backward compatibility with existing systems?
TR: The most widely used platform in the industry has already open-sourced its APIs. So that addresses much of the challenge. For the legacy APIs that haven’t been open-sourced, MarketsIO has developed the multi-platform, multi-language Fusion API suite, which was built by our front-office technologists. It’s highly user-friendly and makes it straightforward to migrate older APIs. Added to this, we provide the source code of our API suite to data consumers, so they are not locked into MarketsIO either. All parties should be empowered to make the decisions that are best for them, not controlled by lock-in.
Clients who want to switch to our APIs can typically be migrated in about a couple of days to a couple of weeks per application. This disproves the perception that such migrations are multi-year, high-risk projects. We’ve already demonstrated—through real-world implementations—that that’s no longer the case. Clients can move to a modern platform with superior fault tolerance down to the instrument level and realise 15–20% better performance in terms of latency and scalability, all at a much lower cost. The risk has effectively been removed from the equation.
We also provide backward compatibility with legacy entitlement systems, allowing clients to transition their entitlement logs to our platform seamlessly. This fundamentally de-risks the migration process. It’s easy to cut over your platform, manage entitlements, and then move forward with a modern, scalable solution. At that point, the perception of complexity and risk just falls away.
TTI: You’ve mentioned costs and the growing trend of incumbent vendors moving their clients to ‘list price,’ which often seems like a vague and inconsistent concept, as they frequently apply different pricing for different customers and use cases, with little transparency. The FCA’s Wholesale Data Market Study last year highlighted these issues but didn’t propose any solutions. What, in your view, is driving these escalating costs, and why are they rising so sharply?
TR: Our clients tell us they believe it’s simply a mechanism for the vendors to capture additional revenue. Many are also concerned about the lack of meaningful development on these platforms. Clients see substantial price increases tied to what the providers claim is the ‘list price,’ and the question in clients’ minds is whether these increases are justified or merely an attempt to extract more revenue from a saturated market. Based on what we’re hearing, it’s clear many clients feel the latter is true. If prices are rising disproportionately to inflation without corresponding material improvements to the technology, then something is amiss—it’s not about delivering better solutions.
Our mission at MarketsIO is to change this dynamic. As an independent technology provider, we aim to offer choice. We believe this technology should be commoditised, and data distribution should be run as a utility. There’s no question that market data vendors provide significant value—they have for decades—and they should be compensated for that. But the technology that delivers data should be commoditised.
We’re committed to providing independent technology that decouples data sourcing from data technology. This independence ensures clients can manage their data effectively without conflicts of interest. There’s an inherent conflict when a technology provider’s primary business is selling data. It raises legitimate concerns about whether that technology truly empowers clients to minimise costs, avoid audits, and optimise data management.
As an independent provider, our mission is to deliver the most advanced and efficient technologies for this purpose. For example, our platform supports granular unit of count, authentication, IP whitelisting, and other controls to empower data consumers. These features help eliminate audit risks by providing transparent, authenticated data usage.
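The kind of granular entitlement and IP-whitelisting check described above can be sketched in outline. This is a minimal, hypothetical illustration: the class names, fields, and policy shape are invented for this sketch and do not represent MarketsIO’s actual API.

```python
from dataclasses import dataclass
from ipaddress import ip_address, ip_network

@dataclass
class EntitlementPolicy:
    """Hypothetical per-consumer policy combining service entitlements,
    network whitelisting, and the unit of count used for audit."""
    allowed_services: set        # data services this consumer is entitled to
    ip_whitelist: list           # CIDR ranges permitted to connect
    unit_of_count: str = "user"  # granularity recorded for billing/audit

def is_authorised(policy: EntitlementPolicy, service: str, source_ip: str) -> bool:
    """Grant access only if the service is entitled AND the request
    originates from a whitelisted network range."""
    if service not in policy.allowed_services:
        return False
    addr = ip_address(source_ip)
    return any(addr in ip_network(cidr) for cidr in policy.ip_whitelist)

policy = EntitlementPolicy(
    allowed_services={"level1_quotes"},
    ip_whitelist=["10.0.0.0/8"],
)

print(is_authorised(policy, "level1_quotes", "10.1.2.3"))     # True: entitled and whitelisted
print(is_authorised(policy, "full_depth", "10.1.2.3"))        # False: not entitled
print(is_authorised(policy, "level1_quotes", "192.168.1.5"))  # False: IP outside whitelist
```

Logging the outcome of every such check, tied to an authenticated identity, is what makes usage transparent and auditable.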
We also address non-real-time data management, which historically lacked controls but has exploded in cost. By offering an enterprise suite of technology that spans real-time and static data, we enable the industry to better control these costs. Commercial policies have shifted from enterprise-wide licensing to per-desk fees, often without clients having the tools to adapt. We’re solving this for the betterment of the industry.
Our approach is to never own data or take a revenue share from anyone’s data. We avoid conflicts with information originators, who can benefit just as much from our technology as consumers. Being independent allows us to serve all parties—information creators, vendors, and consumers—equally.
TTI: What do you see as some of the most critical risks for firms that continue to rely on legacy market data technology?
TR: They need to ask themselves some fundamental questions. Is their technology being actively developed in ways that create material enhancements to the service? Are there dedicated development teams making significant investments in the platform?
One key risk we hear from clients is the potential loss of institutional knowledge as experienced team members retire. Without active development and a steady flow of talent, there’s a real danger that the technology could become obsolete.
Another major shift to consider is the move towards the cloud. The cloud offers tremendous benefits in terms of scalability and efficiency for certain use cases, but it’s not inherently built for real-time workflows. Achieving ultra-low latency in the cloud often requires bypassing cloud-native virtual services, which undermines the cloud’s core advantages.
To be clear, we believe the cloud is an excellent solution for many services and use cases, but not for everything. The industry needs a combination of delivery models: on-premises, proximity hosting, cloud delivery, and managed services. Clients are increasingly concerned about whether their existing technology providers can adequately support all these use cases moving forward.
The risk isn’t just technological obsolescence—it’s about whether these legacy systems can evolve to meet the complex and varied demands of capital markets.
TTI: Looking at other industries that have faced similar challenges with legacy lock-in but managed to overcome them, what lessons can the financial markets sector learn from those experiences?
TR: Capital markets and financial services are unique in their real-time demands, coupled with high availability and stringent controls over how data is distributed, whether internally or externally, but there are parallels we can draw from other industries. For example, healthcare and life sciences face challenges around data authentication and entitlement controls, particularly for highly sensitive information, where strict regulations require that individuals’ identities remain protected through sophisticated anonymisation. Anonymity controls in life sciences are essential, and we believe those requirements will only grow with advancements in AI. The need for robust entitlement, anonymisation, and encryption capabilities to manage the increasing flow of information securely has clear parallels with capital markets.
Another emerging example is self-driving cars. The sheer volume of real-time data required to make autonomous vehicles a reality is staggering. This involves complex enterprise capabilities for monitoring and navigation, compounded by proximity issues—how vehicles interact with local compute resources versus centralised systems for traffic and other functions. It’s a distributed compute challenge, and while it differs from capital markets, it illustrates the need for scalable, efficient, and flexible infrastructure.
At MarketsIO, we’ve designed our technology to be generic and adaptable, and we’ve focused on capital markets because that’s where our expertise lies. Our primary focus is transforming financial services, which have thrived for decades in their current model. It’s clear, though, that now is the time to move to a more commoditised, standardised, and open approach—one that benefits everyone in the ecosystem with modern technology that is actively developed and supported by fair, consistent commercial policies.
This transformation will provide value not just to data originators but also to consumers, driving the next generation of financial services.
TTI: Final thoughts?
TR: The key message is that capital markets have been locked into outdated, legacy technology that is now driving up costs significantly and creating risks for the future of technology delivery. The platforms dominating the market today were created by brilliant minds three or four decades ago. Those systems served their purpose, but they’re now outdated. Clients tell us they are concerned that the people who built them are nearing retirement, leaving the future of the technology in question. Meanwhile, costs are skyrocketing without delivering commensurate benefits.
What’s needed is a shift to open, commoditised, and utility-like infrastructure provided by an independent technology provider with no conflicts of interest with data originators or consumers.
The market needs to embrace independence to create real choice. And with choice comes more effective commercial models and more dynamic, responsive technology delivery.
TTI: Thank you, Terry.