
The most consequential changes facing financial markets technology in 2026 will not be driven by new asset classes or incremental latency gains, but by a fundamental rethinking of how trading systems are architected at their core.
For decades, market participants have organised technology around functional silos: execution, risk, middle office, post-trade. These boundaries were reinforced by market hours, batch processing, and asset-class separation. That operating model is now breaking down under the combined pressures of 24/7 markets, real-time risk management, faster settlement cycles, and growing cross-asset complexity.

As Matt Barrett, CEO of Adaptive, observes, the debate around extended market hours has moved decisively from theory to practice.
“Market volatility, client demand and the tokenisation of seemingly everything have pushed the 24/7 markets debate from theory to genuine operating reality in 2025.”
With firms increasingly committing to 24/6 or 24/7 operating models, the architectural question is no longer whether legacy systems can be stretched further, but how they must be rebuilt to support continuous operation as a first-class requirement.
Why Siloed Architectures Are Becoming a Structural Risk
Siloed system estates introduce more than operational friction. They fragment state, break temporal consistency, and force firms to reconstruct the truth of a trade or position through reconciliation rather than design. In a world of overnight trading, intraday margin calls, and real-time capital consumption, these delays are no longer tolerable.
The challenge is not simply performance. It is coordination.
Modern trading workflows span execution, risk checks, allocation, confirmation, clearing, and settlement in increasingly tight timeframes. Architectures optimised for functional independence struggle to guarantee that lifecycle events remain strictly ordered, auditable, and recoverable when markets become volatile.
This is why many firms’ first response – decomposing monolithic systems into microservices – has delivered mixed results. While modularity improved, it also introduced a new class of problems around event ordering, state reconstruction, and operational visibility. In deterministic environments such as trading and post-trade processing, eventual consistency often proves insufficient.
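The failure mode is easy to demonstrate. In the toy Python sketch below (purely illustrative, not drawn from any real system), two services receive the same pair of position events, but, lacking a shared ordering authority, apply them in opposite orders. Because the operations do not commute, the two views diverge and can only be squared up later by reconciliation:

```python
def apply(balance, event):
    """Apply a single lifecycle event to a balance."""
    kind, value = event
    if kind == "deposit":
        return balance + value          # additive: new cash in
    if kind == "haircut":
        return balance * (1 - value)    # multiplicative: risk haircut
    raise ValueError(f"unknown event: {kind}")

events = [("deposit", 100.0), ("haircut", 0.10)]

# Service A happens to see the deposit first; service B sees the haircut first.
view_a = view_b = 1000.0
for e in events:
    view_a = apply(view_a, e)
for e in reversed(events):
    view_b = apply(view_b, e)

print(view_a)  # 990.0  -> (1000 + 100) * 0.9
print(view_b)  # 1000.0 -> 1000 * 0.9 + 100
```

Under eventual consistency both services are "correct" by their own logs, yet they disagree by ten units of currency — exactly the kind of break that reconciliation teams spend their days chasing.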
The Rise of Sequencer-Led Architectures
Against this backdrop, sequencer-centric designs are emerging as a pragmatic response to the coordination problem at the heart of modern trading systems, says Barrett.
“The tech stack of the next decade will be fundamentally different from the one we see today, designed from the ground up for continuous operation; with rapid recovery and clean, provable audit trails – built-in, not bolted-on,” he predicts.
A sequencer introduces an explicit layer of orchestration into the heart of the trading architecture, ensuring that complex, multi-stage workflows progress in a controlled and predictable manner. By centralising responsibility for workflow progression, firms can impose consistency and determinism across otherwise distributed components, even as systems scale and evolve.
“In essence, a ‘sequencer’ acts as a traffic controller, keeping events in strict order across trading and post-trade, cutting manual breaks and speeding recoveries when markets are choppy,” explains Barrett.
This approach directly addresses the limitations of uncoordinated microservices by introducing determinism as an architectural principle. Event ordering becomes explicit. State transitions become provable. Recovery becomes a matter of replay rather than investigation.
Crucially, this does not imply a return to monolithic systems. Sequencers impose logical centralisation while remaining physically distributable, allowing firms to preserve low-latency deployment models while restoring architectural coherence.
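The traffic-controller idea can be reduced to a minimal sketch. The names and structure below are illustrative assumptions, not Adaptive's implementation: one logical component stamps every inbound event with a gapless, monotonically increasing sequence number, and each consumer applies events strictly in that order, buffering anything that arrives ahead of a gap:

```python
import heapq

class Sequencer:
    """Single logical ordering point: stamps each event with a gapless,
    monotonically increasing sequence number."""
    def __init__(self):
        self._next_seq = 0

    def stamp(self, event):
        seq = self._next_seq
        self._next_seq += 1
        return (seq, event)

class OrderedConsumer:
    """Applies stamped events in strict sequence order, buffering any
    that arrive ahead of a gap in the sequence."""
    def __init__(self):
        self._expected = 0
        self._pending = []   # min-heap keyed on sequence number
        self.applied = []    # in-order event log: the audit trail

    def on_event(self, stamped):
        heapq.heappush(self._pending, stamped)
        # Drain everything now contiguous with the already-applied prefix.
        while self._pending and self._pending[0][0] == self._expected:
            _, event = heapq.heappop(self._pending)
            self.applied.append(event)
            self._expected += 1

seq = Sequencer()
stamped = [seq.stamp(e) for e in ["new_order", "fill", "allocate", "confirm"]]

consumer = OrderedConsumer()
# Deliver out of order, as a real network might.
for s in [stamped[2], stamped[0], stamped[3], stamped[1]]:
    consumer.on_event(s)

print(consumer.applied)  # ['new_order', 'fill', 'allocate', 'confirm']
```

However the network scrambles delivery, every consumer applies the lifecycle in the same order — which is what makes the resulting state provable rather than merely probable.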
Determinism, Auditability, and Recovery by Design
For many firms, the most immediate benefit of sequencer-led architectures is not speed, but control.
By separating business logic from infrastructure concerns, sequencers allow firms to evolve compute, storage, network, and acceleration layers independently of workflow intent. Clean event logs and deterministic replay support faster incident recovery and significantly simplify audit and regulatory reporting.
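"Recovery as replay" follows directly from determinism. In the hypothetical sketch below (function names and event shapes are invented for illustration), state is only ever the result of folding an ordered event log through a pure transition function, so a restarted node rebuilds exactly the state the live system held:

```python
def transition(state, event):
    """Pure, deterministic state transition: same log in, same state out."""
    state = dict(state)  # never mutate in place
    kind, qty = event
    if kind == "fill":
        state["position"] = state.get("position", 0) + qty
    elif kind == "allocate":
        state["allocated"] = state.get("allocated", 0) + qty
    return state

def replay(log, initial=None):
    """Rebuild state by folding the ordered event log through `transition`."""
    state = initial or {}
    for event in log:
        state = transition(state, event)
    return state

log = [("fill", 100), ("fill", 50), ("allocate", 150)]

live = replay(log)        # state the running system held
recovered = replay(log)   # state a restarted node rebuilds from the same log

print(live == recovered)  # True: recovery is replay, not investigation
print(live)               # {'position': 150, 'allocated': 150}
```

The same property serves audit: a regulator's question about how a position arose is answered by replaying the log to the timestamp in question, not by stitching together screenshots and reconciliation reports.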
As Barrett notes, this architectural separation also improves scalability and team autonomy:
“An increasing number of market participants will favour event-driven and sequencer-centric designs over sprawling microservices for consistency and performance – while significantly improving scale by separating architectural concerns from differentiating business logic, letting teams work independently and deliver faster.”
For institutions grappling with real-time risk, continuous margining, and increasing regulatory scrutiny, these characteristics are rapidly becoming non-negotiable.
Enabling Buy-and-Build Technology Strategies
Sequencer-led cores also have strategic implications for technology procurement.
By establishing a stable internal contract for event sequencing and workflow progression, firms can increasingly buy undifferentiated foundations – infrastructure, connectivity, persistence – while building the workflows and client experiences that define competitive advantage.
“In short, this process will entail firms combining a ‘buy and build’ approach to their technology strategy,” observes Barrett. “Buy the undifferentiated foundations and build the execution, workflows and client experience that will stand your firm apart.”
This model aligns with the broader move toward modular platforms, open-source transparency, and cloud-based acceleration, while reframing vendor relationships around co-innovation rather than dependency.
Not an Architectural Fashion, but a Structural Necessity
Sequencer architectures are not without challenges. Concerns around migration complexity, perceived central points of failure, and cultural resistance are real. However, leading firms are already addressing these issues through phased adoption, replication strategies, and hybrid deployments alongside legacy systems.
The larger risk lies in inaction.
In a market environment defined by continuous trading, atomic workflows, and real-time accountability, architectures that cannot guarantee event order, determinism, and rapid recovery are increasingly incompatible with operating reality.
Ultimately, the value of this architectural shift lies not in theoretical purity, but in its ability to prepare firms for the next phase of market evolution.
Says Barrett: “The prospective payoff of this new breed of architecture is faster rollout, lower operational risk and a technology stack ready for AI, multi-asset, and true round-the-clock trading.”
For firms looking toward 2026, control of event flow is becoming synonymous with control of the business itself.