The knowledge platform for the financial technology industry

A-Team Insight Blogs

Building for a Market That Never Sleeps: Inside LSEG’s Real-Time Data Platform Optimisation


As data volumes surge and markets move toward 24/7 operation, LSEG is rebuilding the infrastructure that underpins its real-time data services – combining hyperscaler capabilities with private-cloud control to deliver scale, speed and resilience for the next generation of capital markets workflows.

The forces reshaping real-time data

The demands on real-time market data infrastructure are intensifying on multiple fronts. LSEG’s real-time network now distributes over 2.2 trillion ticks on its busiest trading days, drawn from almost 600 exchanges and trading venues worldwide and covering over 90 million instruments. The shift toward continuous trading hours is compressing the time available for reconciliation and maintenance. At the same time, the proliferation of AI and machine learning workflows across front, middle and back offices is creating entirely new consumption patterns, with firms increasingly requiring streaming data – at a significant scale – not just for human traders, but as direct inputs to automated analytical and decisioning processes.
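To put the headline volume in context, a rough average rate can be derived from the figures above. This sketch assumes the 2.2 trillion ticks are spread evenly over a 24-hour day; real intraday rates are bursty, so peak throughput is far higher than this average.

```python
# Back-of-envelope average throughput implied by ~2.2 trillion ticks/day.
# Assumption: ticks spread uniformly over 24 hours (peaks are much higher).
TICKS_PER_DAY = 2.2e12
SECONDS_PER_DAY = 24 * 60 * 60  # 86,400

avg_ticks_per_second = TICKS_PER_DAY / SECONDS_PER_DAY
print(f"{avg_ticks_per_second:,.0f} ticks/second on average")
# → 25,462,963 ticks/second on average
```

Even as a daily average, tens of millions of messages per second leaves little headroom for reconciliation or maintenance windows once trading hours extend around the clock.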

Meanwhile, the legacy architectures that many firms have relied on for decades – including data providers themselves – are reaching their limits. The urgency is real: with markets moving toward round-the-clock operation and innovation accelerating across the stack, the window for modernisation is narrowing fast.

A hybrid approach to LSEG’s Real-Time platform optimisation

In January 2026, LSEG announced a collaboration with Amazon Web Services to support the optimisation of its real-time data services. Under the programme, LSEG is leveraging AWS capabilities to support the collection, routing and distribution of its Full Tick and Real-Time – Optimized data feeds, the backbone of its Data & Analytics business.

The architectural approach is distinctive. Rather than adopting a purely public-cloud distribution model, LSEG is combining hyperscaler capabilities with its own private-cloud environment, thus preserving the operational control and resilience that capital markets participants expect, while gaining the elastic scaling needed to handle peak data loads. This approach required LSEG and AWS to co-engineer an ultra-low-latency edition of AWS Outposts, tailored for streaming huge volumes of market data. The programme forms part of LSEG’s multi-cloud strategy and is complementary to existing cloud partnerships, including its strategic relationship with Microsoft, and extends the group’s infrastructure options for a specific, high-throughput workload.

This hybrid model reflects a growing consensus across capital markets that the path to cloud is not a binary choice between on-premise and public cloud. For real-time data distribution, where latency sensitivity and operational resilience are non-negotiable, the ability to retain private-cloud control while tapping into hyperscaler elasticity addresses a genuine architectural tension.

Cloud-readiness, flexibility, interoperability and speed

LSEG’s real-time platform optimisation is oriented around four pillars:

Cloud-readiness addresses the reality that buy-side and sell-side firms increasingly need their data providers to meet them in cloud and hybrid environments. LSEG already offers multiple data feeds to accommodate clients’ differing latency requirements – Full Tick, Optimized, Delayed Optimized and Direct – across on-premise, cloud and hybrid deployments with API connectivity. The collaboration with AWS will deepen these cloud-native delivery capabilities to scale in step with customer demand.

Flexibility speaks to the diversity of consumption patterns in modern capital markets. Firms must match their consumption pattern to the pace of their business, whether that means full real-time streaming that processes millions of messages per second, a conflated feed, or a daily bulk transfer. Attempting to keep pace with a full-tick stream when the business doesn’t require it introduces unnecessary engineering fragility and costly processing overheads.

“You need to look at your business and the pace of that business,” says Patrik Färnlöf, Group Head of Real-Time Engineering at LSEG, speaking at a recent A-Team Group webinar on data platform modernisation. “You must review your business flows and pick the right consuming pattern for the data. Going real-time presents interesting challenges – you need to keep up with the data, engineer proper resiliency, and handle complex real-time reconciliations.”
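The conflated-feed pattern mentioned above can be illustrated with a minimal sketch: between flushes, only the latest update per instrument is retained, so a slower consumer sees current prices without processing every intermediate tick. This is a generic illustration of conflation, not LSEG’s implementation.

```python
import threading


class ConflatingCache:
    """Keep only the latest tick per instrument between flushes."""

    def __init__(self):
        self._latest = {}
        self._lock = threading.Lock()

    def on_tick(self, instrument, value):
        # Full-tick input: every update lands here at wire speed...
        with self._lock:
            self._latest[instrument] = value

    def flush(self):
        # ...but the consumer only sees the most recent value per
        # instrument each time it polls, however fast the stream ran.
        with self._lock:
            snapshot, self._latest = self._latest, {}
        return snapshot


cache = ConflatingCache()
cache.on_tick("AAA", 100.0)
cache.on_tick("AAA", 100.5)   # supersedes the previous AAA update
cache.on_tick("BBB", 50.0)
print(cache.flush())          # → {'AAA': 100.5, 'BBB': 50.0}
```

The trade-off is explicit: intermediate ticks are discarded, which is acceptable for dashboards or end-of-interval analytics but not for strategies that need the full order of events.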

Interoperability has become critical as firms operate increasingly heterogeneous technology estates. LSEG’s proprietary and widely adopted symbology and data models simplify integration across platforms, but the broader industry challenge is enabling data to flow across tools and business units without creating silos. Open standards and emerging interoperability layers are potentially transformative for how firms consume and contextualise real-time data.
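The symbology problem described above is, at its core, a cross-referencing one: the same instrument carries different identifiers in different systems, and integration depends on resolving them to a common key. The sketch below uses entirely made-up identifiers and scheme names for illustration; it does not reflect LSEG’s actual symbology or data model.

```python
from typing import Optional

# Hypothetical cross-reference table: maps (scheme, symbol) pairs to a
# canonical internal id so data can be joined across tools and vendors.
# All identifiers below are invented for illustration.
XREF = {
    ("RIC",    "ABC.L"):        "inst-0001",
    ("ISIN",   "GB0000000001"): "inst-0001",
    ("TICKER", "ABC"):          "inst-0001",
}


def resolve(scheme: str, symbol: str) -> Optional[str]:
    """Return the canonical instrument id for a vendor-specific symbol."""
    return XREF.get((scheme, symbol))


# Two feeds naming the same instrument differently still join cleanly:
assert resolve("RIC", "ABC.L") == resolve("ISIN", "GB0000000001")
```

A shared canonical key is what keeps data flowing across business units without each team maintaining its own ad-hoc mapping, which is where silos typically begin.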

Finally, speed is not solely about latency in the traditional sense. It encompasses the pace at which new data sources can be onboarded, new delivery channels activated and new analytical workloads supported. In a market where AI-driven strategies demand ever-richer real-time inputs, the speed of platform evolution matters as much as the speed of individual data updates.

Governance, quality and the operational dimension

It’s increasingly clear that the hardest problems in data platform modernisation are no longer technical. Organisational structures, data governance frameworks and quality assurance processes are the factors that most often determine whether modernisation succeeds or fails. And this applies to data providers as much as to consumers.

LSEG manages the full lifecycle of its real-time data: designing symbology and data models, maintaining exchange relationships, ensuring compliance, and operating a global infrastructure with ultra-high network availability through mirrored Points of Presence worldwide. The operational engine behind the data – 24/7 global support, management of hundreds of venue-driven changes annually, centralised entitlement administration and third-party rights management – is as much a part of the value proposition as the data itself. Re-architecting this layer to meet future demand requires more than a technology swap; it requires embedding policy-driven quality checks and operational accountability directly into the new architecture.
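Embedding policy-driven quality checks in a streaming pipeline might look like the following sketch: each policy inspects a tick (optionally against the previous one) and reports a violation, and the pipeline runs every policy before a tick is published. The checks and data shapes here are hypothetical examples, not LSEG’s actual controls.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class Tick:
    instrument: str
    price: float
    timestamp: float  # epoch seconds


# Hypothetical policies: each returns an error string, or None on a pass.
def check_positive_price(tick: Tick, last: Optional[Tick]) -> Optional[str]:
    return None if tick.price > 0 else "non-positive price"


def check_monotonic_time(tick: Tick, last: Optional[Tick]) -> Optional[str]:
    if last is not None and tick.timestamp < last.timestamp:
        return "timestamp went backwards"
    return None


POLICIES = [check_positive_price, check_monotonic_time]


def validate(tick: Tick, last: Optional[Tick] = None) -> list:
    """Run every policy against a tick; an empty list means it is clean."""
    return [msg for policy in POLICIES
            if (msg := policy(tick, last)) is not None]


print(validate(Tick("ABC", 101.5, 2.0)))                       # → []
print(validate(Tick("ABC", -1.0, 1.0), Tick("ABC", 101.5, 2.0)))
# → ['non-positive price', 'timestamp went backwards']
```

Keeping the policy list as data rather than hard-coded branches is one way to make quality rules auditable and changeable without re-architecting the pipeline, which is the operational accountability the paragraph above describes.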

Building for what comes next

LSEG’s real-time data platform optimisation sits within a broader industry trajectory. Genuine transformation means rethinking how data is owned, governed, delivered and consumed, not simply re-hosting legacy systems in the cloud. For a provider of LSEG’s scale, distributing real-time data from hundreds of venues to a global client base whose mission-critical workflows span trading desks, risk functions, compliance teams and AI-driven analytics, getting this right is foundational.

The hybrid architecture LSEG has chosen – retaining private-cloud control while leveraging AWS for elastic scaling – may well become a template for how other infrastructure-critical data services approach cloud transformation. It acknowledges that for workloads where resilience and control are paramount, the cloud is an enabler rather than a destination.

As Färnlöf told the A-Team Group webinar, “If you do not modernise now amidst rapid innovation and 24/7 velocity, you are going to face a very difficult business scenario in three years. The financial markets are moving extremely fast.”

And if the data platform cannot deliver data fast enough for critical business decisions, the transformation has failed on the only metric that matters: time to value.

