The knowledge platform for the financial technology industry

A-Team Insight Blogs

The New Shape of Market Data: Why Institutions Are Moving Toward a More Modular, Machine-Readable Architecture


For decades, the market-data ecosystem has been defined by reliance on a handful of dominant vendors. Their breadth, depth and entitlements frameworks became foundational to both the trading desk and the wider enterprise. But the requirements of the modern financial technology stack have shifted dramatically. Cloud-native development, agentic AI workflows, and a proliferation of analytics-driven functions across the front, middle and back office are reshaping not just what data organisations need, but how they need to access it.

Beneath the surface, the market is quietly undergoing a structural transition: from monolithic platforms to a more modular, API-led architecture. Newer providers – not competing in the traditional low-latency arena – are instead rethinking the delivery, standardisation and economics of machine-readable market and fundamentals data. Their rise signals a deeper change in how institutions expect data to work inside their systems.

One such provider, FMP (Financial Modeling Prep), offers a useful case study in what this next generation looks like.

Why Firms Are Looking Beyond the Big Players

The shift is not driven by a desire to upend incumbents, but by the realities of modern development cycles and the fragmentation of data-consuming teams across financial institutions.

As new AI groups, analytics functions and research pods emerge inside large organisations, the licensing assumptions of a previous era no longer map cleanly to how data is consumed today. Institutions are now building dozens of internal micro-teams that each need access to well-structured, machine-readable datasets. Many find they cannot justify enterprise-wide licences simply to allow an AI or innovation team to experiment.

Stuart Mooney, COO and Co-Founder of FMP, sees this trend playing out across both fintechs and large institutions.

“Fintechs and AI teams often need a broad set of data at a reasonable cost just to get started,” he tells TradingTech Insight. “The big vendors typically price them out or deprioritise them, so smaller teams come to us because they need clean data on day one.”

More broadly, he notes that a growing number of firms are reassessing their dependency on legacy delivery models.

“We’re seeing firms rethink how they access and consume market data. Much of that is driven by workflow demands – integration speed, flexible licensing, and developer experience – that the incumbents struggle to support quickly enough.”

The result is a widening lane for data providers whose strengths align with these new consumption patterns.

What Modern Data Providers Actually Do Differently

The emerging cohort of providers is defined less by content breadth and more by engineering philosophy: data should be delivered programmatically, cleanly, and in formats that suit modern tooling.

Mooney describes FMP’s approach: “We’re a disruptor in the sense that we offer fully programmatic, automated ways to access data. Many clients simply don’t want to work with the big vendors for these kinds of use cases.”

Machine-readable fundamentals derived from regulatory filings remain the core. But adjacent datasets – company-specific KPIs, revenue segmentation, geographic splits, earnings-call transcripts, and other disclosure-derived intelligence – are increasingly important. These sit between traditional fundamentals and the richer, more context-aware inputs required by today’s AI and analytics functions.

Delivery models are also evolving. “We support whatever delivery method the client needs – clean APIs, flexible integration, MCP [Model Context Protocol],” says Mooney. “The goal is always the same: make accessing the data as easy as possible.”

This is particularly relevant as agentic AI systems gain mainstream traction. Such systems require well-structured feeds and low-friction ingest paths, making API-native providers a natural fit.
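What a low-friction ingest path means in practice can be sketched briefly. The snippet below is a minimal illustration, not FMP's actual API: the payload shape and field names (`symbol`, `period`, `revenue`, `netIncome`) are hypothetical stand-ins for the kind of machine-readable fundamentals record an API-native provider might return, and the point is simply that clean, well-typed data can flow into a model pipeline with almost no cleaning code.

```python
import json

# Hypothetical machine-readable fundamentals payload, as an API-native
# provider might return it. All field names here are illustrative only.
payload = json.loads("""
[
  {"symbol": "ACME", "period": "2024-Q4", "revenue": 1200.5, "netIncome": 210.0},
  {"symbol": "ACME", "period": "2025-Q1", "revenue": 1310.0, "netIncome": 245.5},
  {"symbol": "ACME", "period": "2025-Q2"}
]
""")

REQUIRED = ("symbol", "period", "revenue", "netIncome")

def to_records(rows):
    """Validate and flatten raw rows into typed records that an agent or
    analytics pipeline can consume without further cleaning."""
    records = []
    for row in rows:
        # Drop incomplete rows rather than guessing at missing values.
        if not all(key in row for key in REQUIRED):
            continue
        records.append({
            "symbol": str(row["symbol"]),
            "period": str(row["period"]),
            "revenue": float(row["revenue"]),
            "net_margin": float(row["netIncome"]) / float(row["revenue"]),
        })
    return records

records = to_records(payload)
```

The validation step matters as much as the flattening: an automated agent cannot ask a human what a malformed row was supposed to mean, so rejecting it explicitly is safer than silently coercing it.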

Where the Shift Is Most Visible

Nowhere is the changing market-data landscape more evident than inside AI, quant-research and data-science teams. Mooney sees these groups as early adopters of more modular architectures.

“Inside large organisations we see new internal teams – especially AI groups – that need accurate data, though perhaps not full coverage or depth, for model development. They don’t want to pay enterprise-level licences just to experiment, so they look for alternative sources.”

Custom datasets are another growth area. Institutional users increasingly need niche, highly specific datasets that sit outside the priorities of large vendors.

“If a client needs a dataset that doesn’t exist, we can usually build it. For one customer, we sourced additional filings and data sources to build a complete net-worth dataset for members of Congress, delivered through a single API.”

Such examples highlight the growing appetite for precision data that is tightly aligned with specialised research questions or domain-specific investment strategies.

They also illustrate a broader point: the incumbents’ scale makes them indispensable for regulated feeds and low-latency trading, but far less suited to ad-hoc, domain-specific, time-sensitive data engineering.

The Emergence of a Modular Data Architecture

This new model doesn’t replace the incumbents’ role in the trading stack. Instead, it fills the gaps left by legacy platforms – the “excess friction” that slows down integration, experimentation, and cross-functional access.

Mooney is candid about where FMP does not compete: “We’re not in the ultra-low-latency race. If you need co-located raw feeds for trading, that’s not us. Our focus is real-time data that’s primarily used for analytics, modelling and AI workflows.”

This positions newer providers not as wholesale alternatives, but as complementary building blocks in a broader data ecosystem. As the enterprise data fabric becomes more modular, API-native sources increasingly sit alongside traditional terminals and raw exchange feeds.
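The "building blocks" idea can be made concrete with a toy routing layer. The sketch below is purely illustrative: the adapter functions and dataset names are invented for the example, and a real implementation would sit behind far more machinery. The design point it shows is that once every source is addressed through one interface, latency-critical datasets can route to raw feeds while analytics-oriented datasets route to API-native providers, and callers never need to know which is which.

```python
from typing import Callable, Dict

# Toy modular data layer. Each adapter stands in for a real source;
# both functions and dataset names are hypothetical.
def exchange_feed_adapter(query: str) -> str:
    # Placeholder for a co-located, low-latency raw feed.
    return f"raw-feed:{query}"

def api_native_adapter(query: str) -> str:
    # Placeholder for an API-native provider returning clean JSON.
    return f"api-json:{query}"

# Route each dataset family to the source best suited to it.
ROUTES: Dict[str, Callable[[str], str]] = {
    "level2_quotes": exchange_feed_adapter,  # latency-critical
    "fundamentals": api_native_adapter,      # analytics / AI workflows
    "transcripts": api_native_adapter,
}

def fetch(dataset: str, query: str) -> str:
    """Dispatch a request to whichever adapter serves that dataset."""
    try:
        return ROUTES[dataset](query)
    except KeyError:
        raise ValueError(f"no adapter registered for {dataset!r}")

result = fetch("fundamentals", "ACME")
```

Swapping a provider then becomes a one-line change to the routing table rather than a rewrite of every consumer, which is the practical payoff of the composable model described above.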

Accuracy, Validation and the Infrastructure Behind It

For any provider aiming to participate in institutional workflows, accuracy is non-negotiable. The more automated a firm’s analytics become, the less tolerance there is for downstream errors.

Mooney is unequivocal on the importance of quality: “Accuracy is the most important thing. It doesn’t matter how fast or cheap the data is – if it isn’t accurate, it’s useless. That’s why we invest heavily in our extraction and validation models.”

The rise of AI amplifies this requirement, he says. “As clients integrate data directly into automated models and AI agents, the tolerance for errors drops to zero. You have to get the accuracy right.”

But accuracy alone is not enough; infrastructure must be engineered to operate at institutional scale.

“We’ve built infrastructure that can handle millions of API calls per second without downtime. At some point the challenge stops being a finance problem and becomes a data-engineering problem.”

For institutions accustomed to decades of robust, industrial-strength data delivery, anything less is untenable.

A Roadmap Driven by the Market

Finally, the evolution of these providers is increasingly shaped by real-world demand rather than internal speculation.

Mooney explains how this plays out inside FMP: “Our roadmap is mostly customer-driven. If clients are willing to pay for a dataset, that’s a strong signal of value – especially when we hear the same request from multiple firms.”

Commercial structures are designed to support both exclusivity and scale depending on the use case. “If a dataset is truly proprietary and a client wants exclusivity, we’ll support that. If it has wider value, we structure the development so we can offer it more broadly. Flexibility in licensing is essential.”

This model allows innovation to flow directly from the demands of the market rather than internal product committees – a contrast with larger players whose roadmaps necessarily evolve more slowly.

A Market in Transition

The contours of the market-data landscape are changing. New AI-driven workflows, modular architecture patterns, and developer-centric expectations are creating pressures that legacy licensing models and delivery paradigms struggle to meet. The incumbents remain foundational, particularly for regulated and low-latency data – but they are no longer the only answer.

A new generation of technology-led data providers, exemplified by firms like FMP, is emerging to meet the demand for clean, flexible, rapidly integrable datasets. Their rise signals a shift toward a more composable ecosystem, where institutions assemble data architecture from multiple specialised components rather than a single monolithic source.

As financial markets continue their migration into cloud-native, AI-augmented operating models, the winners will be the firms that treat market data not as a static subscription, but as a dynamic, programmable input to their entire decision-making fabric.

