
A-Team Insight Blogs

The Future of Market Data: Cloud and AI’s Transformative Impact on Capital Markets


Market data is the lifeblood of trading, but soaring data volumes and rising real-time demands are straining traditional methods of distribution and consumption. In response, financial institutions are turning to cloud technology and AI-driven solutions to modernise their data infrastructure, enabling greater scalability, improved efficiency, and deeper insight from their data assets.

A recent webinar, “The future of market data: harnessing cloud and AI for market data distribution and consumption,” hosted by A-Team Group and featuring representatives from Jefferies, OneTick, and LSEG Data & Analytics, provided valuable insights into how these technologies are reshaping market data strategies.

The Cloud Imperative: Shifting Data Operating Models

The financial industry is witnessing a significant strategic shift in its approach to market data infrastructure. Firms are increasingly re-evaluating their data operating models, moving away from traditional on-premise solutions towards outsourcing commoditised tasks such as data collection, storage, and tick history management. This allows them to concentrate on critical factors like cost efficiency, latency optimisation, and the flexibility of analytics.

LSEG has embraced this shift, housing approximately 75 petabytes of market data across major cloud providers, including Microsoft, Amazon, and Google. This vast repository of data is shared directly into customers’ cloud instances, facilitating on-demand and instantaneous data availability. Such immediate access is crucial, especially as markets accelerate towards faster trading cycles, exemplified by T+1 settlement and extended trading hours. The economic advantages are compelling; one analysis suggests that cloud storage for market data costs around $4 per gigabyte, a stark contrast to approximately $110 per gigabyte in a traditional data centre. This substantial saving not only reduces the Total Cost of Ownership (TCO) but also frees up capital for more extensive data procurement.
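To make the scale of that gap concrete, here is a minimal back-of-the-envelope sketch in Python using the per-gigabyte figures quoted in the webinar; the 50 TB archive size is a hypothetical volume chosen purely for illustration.

```python
# Back-of-the-envelope comparison of market data storage costs,
# using the per-gigabyte figures cited in the webinar.
# The data volume below is illustrative only.

CLOUD_COST_PER_GB = 4.0      # USD per GB in cloud storage (webinar figure)
ONPREM_COST_PER_GB = 110.0   # USD per GB in a traditional data centre (webinar figure)

def storage_cost(volume_gb: float, cost_per_gb: float) -> float:
    """Total storage cost for a given volume at a flat per-GB rate."""
    return volume_gb * cost_per_gb

# Example: a 50 TB tick-history archive (hypothetical volume)
volume_gb = 50 * 1024  # 50 TB expressed in GB

cloud = storage_cost(volume_gb, CLOUD_COST_PER_GB)
onprem = storage_cost(volume_gb, ONPREM_COST_PER_GB)

print(f"Cloud:   ${cloud:,.0f}")
print(f"On-prem: ${onprem:,.0f}")
print(f"Saving:  ${onprem - cloud:,.0f} ({onprem / cloud:.1f}x cheaper in the cloud)")
```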

However, the migration to cloud environments is not without its hurdles. Participants in the discussion highlighted that the migration process itself can be lengthy, thereby delaying the realisation of anticipated cost savings. Effective data management, including comprehensive cataloguing and understanding of vast datasets, is essential for data to be truly useful once onboarded. Furthermore, many firms operate in a hybrid mode, blending on-premise and cloud infrastructure, which can introduce complexities and costly data egress between environments. A critical lesson learned is that a simple “lift and shift” of existing on-premise implementations to the cloud without a fundamental re-evaluation of objectives often results in slower, more expensive, and less efficient setups. Despite these challenges, the cloud offers undeniable benefits for resilience, providing virtually unlimited storage and elasticity, which greatly simplifies data retention for compliance (e.g., retaining data for six to seven years) compared to outdated archiving methods.

AI’s Transformative Role in Market Data Operations

Artificial intelligence is rapidly becoming an indispensable tool in market data operations, enhancing efficiency and unlocking new capabilities. Its core applications are diverse and impactful. Large Language Models (LLMs) are transforming data access by enabling users, particularly front-office personnel, to query data in natural language rather than through proprietary APIs. This intuitive approach significantly broadens data accessibility.
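As a rough illustration of the pattern rather than any vendor's actual API, the sketch below shows how a natural-language question might be translated into a structured query over a market data table; `call_llm` and `run_sql` are hypothetical placeholders for whatever model endpoint and data store a firm actually uses.

```python
# Sketch of natural-language data access: an LLM translates a user's question
# into SQL against a known schema, rather than the user calling a proprietary API.
# `call_llm` and `run_sql` are hypothetical placeholders.

SCHEMA = """
Table trades(symbol TEXT, trade_time TIMESTAMP, price REAL, size INTEGER, venue TEXT)
"""

def build_prompt(question: str) -> str:
    """Combine the schema and the user's question into an instruction for the model."""
    return (
        "You translate questions into a single SQL query.\n"
        f"Schema:\n{SCHEMA}\n"
        f"Question: {question}\n"
        "Return only the SQL."
    )

def answer(question: str, call_llm, run_sql):
    """Translate the question to SQL, then execute it against the market data store."""
    sql = call_llm(build_prompt(question))
    return run_sql(sql)

# Usage (with whichever LLM client and database the firm actually runs):
# rows = answer("What was the average trade size in VOD.L yesterday?", call_llm, run_sql)
```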

AI excels at data integration and summarisation, proficiently bringing together disparate datasets, performing complex calculations, and summarising intricate information. This capability dramatically speeds up processes that previously demanded multiple human inputs and sources. Beyond analysis, AI automates mundane tasks and assists research, for instance, by summarising group chats or market announcements. Given the growing reliance on machine learning for business insights, AI is also fundamental to ensuring high data quality and integrity. Moreover, it facilitates dynamic “what if” scenarios, allowing intelligent responses to questions about trading strategies or market impacts, such as inflation scenarios.

For AI to be truly effective, data operationalisation is paramount. Data must be curated: well structured, normalised, and accompanied by rich metadata such as user guides, so that LLMs can accurately interpret field definitions. Techniques such as chunking, embedding, and clustering are employed to direct the AI to the most relevant data. The impact of AI also extends to regulatory compliance, assisting internal and external teams in faster trade surveillance and identifying anomalies that were previously difficult to pinpoint with prescriptive queries. The primary drivers for AI adoption are clear: improved speed and efficiency, alongside competitive pressure and innovation strategy. While direct cost reduction may not be the initial primary driver, the indirect cost savings realised through time efficiencies are substantial.
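A minimal sketch of that retrieval step, assuming an embedding model is available (`embed` is a placeholder for whichever model the firm chooses): documentation is chunked, each chunk is embedded, and the chunks closest to the user's question are supplied to the LLM as context.

```python
# Sketch of "chunk, embed, retrieve": documentation (e.g. field definitions from a
# user guide) is split into chunks, embedded as vectors, and the chunks nearest to
# the user's question are handed to the LLM as context.
# `embed` is a placeholder for the firm's chosen embedding model.
import numpy as np

def chunk(text: str, size: int = 400) -> list[str]:
    """Split a document into fixed-size character chunks."""
    return [text[i:i + size] for i in range(0, len(text), size)]

def top_k_chunks(question: str, chunks: list[str], embed, k: int = 3) -> list[str]:
    """Return the k chunks whose embeddings are most similar to the question."""
    q = embed(question)
    scored = []
    for c in chunks:
        v = embed(c)
        sim = float(np.dot(q, v) / (np.linalg.norm(q) * np.linalg.norm(v)))
        scored.append((sim, c))
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [c for _, c in scored[:k]]

# The selected chunks are then prepended to the prompt so the model interprets
# field names exactly as the data dictionary defines them.
```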

Latency, Location, and Data Consistency

The interplay between latency requirements and data location remains a critical consideration. For ultra-low latency applications, particularly in High-Frequency Trading (HFT), co-location with exchanges generally remains necessary. However, cloud environments are increasingly proving viable for less latency-dependent trading, measured in milliseconds as opposed to microseconds.

Firms like Jefferies are adopting hybrid models to optimise for different latency needs. This involves leveraging co-located infrastructure for highly latency-sensitive trading, while utilising the cloud for reference, static, or large historical datasets where latency is less critical. Cloud providers are also advancing their capabilities, streaming full tick real-time data directly into time-series databases, enabling customers to query and execute trades based on these real-time streams. AI can also be applied within this real-time streaming context. A significant challenge in these hybrid setups, however, is ensuring data consistency, especially if different providers or systems are employed for real-time and historical data. Inconsistencies in symbologies or trade conditions can severely hamper analytics, underscoring the importance of designing systems where the same analytical frameworks can operate seamlessly across both real-time and historical data.
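One way to keep analytics consistent across the two paths is to normalise both feeds into a single record shape via a shared symbology map before any analytic runs. The sketch below illustrates the idea; the symbol mappings and field names are invented for the example.

```python
# Sketch of running the same analytic over real-time and historical ticks by
# normalising both into one record shape with a shared symbology map first.
# The symbol mappings and field names here are illustrative, not any vendor's.
from dataclasses import dataclass
from typing import Iterable

SYMBOLOGY = {"VOD LN": "VOD.L", "VOD.LSE": "VOD.L"}  # vendor codes -> one canonical ID

@dataclass
class Tick:
    symbol: str   # canonical symbol
    price: float
    size: int

def normalise(raw: dict, symbol_field: str, price_field: str, size_field: str) -> Tick:
    """Map a vendor-specific record into the canonical Tick shape."""
    return Tick(
        symbol=SYMBOLOGY.get(raw[symbol_field], raw[symbol_field]),
        price=float(raw[price_field]),
        size=int(raw[size_field]),
    )

def vwap(ticks: Iterable[Tick]) -> float:
    """Volume-weighted average price; identical whether ticks are live or historical."""
    ticks = list(ticks)
    notional = sum(t.price * t.size for t in ticks)
    volume = sum(t.size for t in ticks)
    return notional / volume if volume else float("nan")
```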

The Primacy of Data Quality and Governance

As data becomes more centralised and accessible, striking a balance between ease of access and robust governance is essential. This includes diligently maintaining contractual restrictions, client privacy, and corporate information privileges. Entitlement management is mandatory at every stage of the data lifecycle, particularly when migrating data to new platforms, to prevent any lapse in entitlement checks. Centralised reporting of usage is also vital for accurate entitlement management and effective usage tracking.
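A minimal sketch of that principle, with the entitlement check applied at the point of access and every request logged for centralised usage reporting; the entitlement rules and log destination are placeholders, not any particular platform's controls.

```python
# Sketch of entitlement enforcement at the point of data access, with every
# request recorded for centralised usage reporting.
# Entitlement rules and the log destination are placeholders.
import logging

logging.basicConfig(level=logging.INFO)
usage_log = logging.getLogger("data_usage")

ENTITLEMENTS = {"alice": {"LSE_L1", "LSE_L2"}, "bob": {"LSE_L1"}}  # user -> permitted datasets

def fetch(user: str, dataset: str, query: str, backend):
    """Check the caller's entitlement, record the request, then run the query."""
    allowed = dataset in ENTITLEMENTS.get(user, set())
    usage_log.info("user=%s dataset=%s allowed=%s query=%s", user, dataset, allowed, query)
    if not allowed:
        raise PermissionError(f"{user} is not entitled to {dataset}")
    return backend(query)
```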

For audit purposes, storing raw data is crucial. Alongside this, creating enriched or consumable versions tailored to specific use cases is also important. A key lesson reinforced by participants in the discussion is the critical importance of early planning for data standardisation, normalisation, cataloguing, and centralisation to pre-empt future complexities. Furthermore, thoroughly defining analytical requirements and understanding the necessary compute environment (e.g., for data joins) before commencing any migration is paramount.
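A small sketch of that layering, assuming a simple file-based store: the raw copy is written once and never mutated, while enriched versions are derived from it for specific use cases.

```python
# Sketch of keeping an immutable raw copy for audit alongside a use-case-specific
# enriched copy derived from it. Paths and field handling are illustrative.
import json
import pathlib

RAW_DIR = pathlib.Path("raw")            # immutable, as received from the vendor
ENRICHED_DIR = pathlib.Path("enriched")  # derived, tailored to a use case

def land_raw(name: str, payload: bytes) -> pathlib.Path:
    """Write the vendor payload untouched; this copy is what auditors see."""
    RAW_DIR.mkdir(exist_ok=True)
    path = RAW_DIR / name
    path.write_bytes(payload)
    return path

def enrich(raw_path: pathlib.Path, transform) -> pathlib.Path:
    """Produce a consumable version from raw without ever mutating the original."""
    ENRICHED_DIR.mkdir(exist_ok=True)
    records = json.loads(raw_path.read_bytes())
    out = ENRICHED_DIR / raw_path.name
    out.write_text(json.dumps([transform(r) for r in records]))
    return out
```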

Economic Models and Future Trajectory

The market data landscape is also experiencing a shift in economic models, with a clear trend towards usage-based pricing. Under this model, costs are typically split between data access, which can be configured by market, symbols, depth, and time period, and compute, which defines CPU capacity and elastic compute resources.
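As an illustration of how such a split might be metered, the sketch below prices data access on symbols, time period and depth, and compute on CPU hours; all rates and dimensions are invented for the example and do not reflect any provider's actual tariff.

```python
# Sketch of a usage-based bill split into a data-access component (driven by
# symbols, depth and time period) and a compute component (driven by CPU
# capacity consumed). All rates and dimensions are invented for illustration.
from dataclasses import dataclass

@dataclass
class AccessUsage:
    symbols: int        # number of symbols requested
    days: int           # length of the time period queried
    depth_levels: int   # order book depth (1 = top of book)

@dataclass
class ComputeUsage:
    cpu_hours: float    # elastic compute consumed running queries

def monthly_bill(access: AccessUsage, compute: ComputeUsage,
                 rate_per_symbol_day: float = 0.002,
                 depth_multiplier: float = 1.5,
                 rate_per_cpu_hour: float = 0.40) -> float:
    """Total charge = data access + compute, each metered on its own dimensions."""
    access_cost = (
        access.symbols * access.days * rate_per_symbol_day
        * (depth_multiplier if access.depth_levels > 1 else 1.0)
    )
    compute_cost = compute.cpu_hours * rate_per_cpu_hour
    return access_cost + compute_cost

# Example: 500 symbols, 20 trading days, full depth, 120 CPU hours of queries
print(f"${monthly_bill(AccessUsage(500, 20, 10), ComputeUsage(120)):.2f}")
```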

Looking ahead, the future of market data promises several key developments. Experts predict a significant increase in data providers, potentially leading to a competitive environment that could drive down data costs. Vendors are expected to move towards providing more nuanced insights and “smarter data”. Concurrently, there will be a heightened emphasis on Service Level Agreements (SLAs) pertaining to data quality and accuracy.

AI is poised to fundamentally redefine the market data landscape within the next two to five years, necessitating that all data be “AI-ready,” meaning structured for LLMs. The rapid pace of change in AI is noteworthy, with daily updates from providers constantly reshaping capabilities. The future will also see a marked increase in cloud usage and a corresponding decrease in on-premise infrastructure. Further standardisation of data formats and greater reliance on natural language processing for data access will simplify consumption.

Finally, documentation is expected to evolve into the foundational learning material for LLMs, with continuous monitoring of user queries serving to refine both documentation and AI responses. The convergence of cloud and AI is not merely an incremental improvement but a fundamental re-architecture of how market data is consumed and distributed, promising a more efficient, insightful, and resilient financial ecosystem.

This webinar was sponsored by LSEG Data & Analytics and OneTick.

