
A-Team Insight Blogs

Challenging the Status Quo: Re-imagining the Trading Desk for 2026 and Beyond


The opening session of A-Team Group’s recent TradingTech Summit Europe set a pragmatic tone for the discussions that followed. In a fireside chat between Stuart Lawrence, Head of EMEA Equity Trading at UBS Asset Management, and Monika Fernando, Product Leader, FinTech & Digital Platforms and former Head of Global FI Client Data & Analytics at TD Securities, the focus was not on abstract disruption, but on how buy-side trading desks are being re-architected in practice.

The message was clear: large parts of the buy-side workflow are already automated. The real shift now is towards embedding adaptive AI deeper into execution logic, and preparing market structure and governance frameworks for what follows.

Automation Is the Starting Point

One of the more striking themes was the scale of automation already in place on modern trading desks. A substantial majority of orders by number now flow through fully automated pipelines – from portfolio manager, through OMS and EMS, into broker algorithms and back – with minimal manual intervention and defined oversight.

That level of zero-touch processing reframes the industry conversation. Automation is no longer a differentiator; it is infrastructure. What comes next is more consequential. Early automation models relied on relatively static parameters, often applied uniformly across orders. Subsequent iterations introduced more nuanced clustering, grouping securities with similar characteristics and adjusting execution logic accordingly. The emerging phase moves further still: adaptive models capable of recalibrating in real time as market conditions shift.

This “intelligent touch” approach blends automation with AI-driven adaptation. Rather than relying solely on historical performance as a proxy for future outcomes, models can adjust broker allocation mid-order, respond dynamically to liquidity changes, and score execution performance continuously. The execution engine becomes less deterministic and more responsive. And that signals a deeper architectural change. AI is no longer confined to analytics dashboards; it is entering the execution decision loop itself.
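The mid-order broker reallocation described above can be sketched in a few lines. This is a toy illustration, not any firm's actual logic: it assumes a single slippage-based performance signal per fill and an exponentially weighted score per broker, with the remaining quantity reallocated in proportion to those scores as fills arrive. All names (`AdaptiveRouter`, `record_fill`, `allocate`) are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class AdaptiveRouter:
    """Toy adaptive broker allocator (illustrative sketch only).

    Keeps an exponentially weighted performance score per broker and
    splits the remaining order quantity in proportion to those scores,
    so allocation drifts towards better-performing brokers mid-order.
    """
    brokers: list
    alpha: float = 0.3  # weight given to the newest observation
    scores: dict = field(default_factory=dict)

    def __post_init__(self):
        # Every broker starts with a neutral score.
        self.scores = {b: 1.0 for b in self.brokers}

    def record_fill(self, broker: str, slippage_bps: float) -> None:
        # Lower slippage -> better execution -> higher performance signal.
        perf = 1.0 / (1.0 + max(slippage_bps, 0.0))
        old = self.scores[broker]
        self.scores[broker] = (1 - self.alpha) * old + self.alpha * perf

    def allocate(self, remaining_qty: int) -> dict:
        # Proportional split of what is left of the order.
        total = sum(self.scores.values())
        return {b: int(remaining_qty * s / total)
                for b, s in self.scores.items()}
```

In use, a broker whose fills show lower slippage accumulates a higher score and receives a larger share of the residual order, which is the essence of continuous scoring feeding back into allocation.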

AI as Workflow Interface

Another theme was the integration of AI directly into execution management systems. Initial deployments resemble enhanced natural language interfaces, allowing traders to query order books and analytics in real time. Over time, this expands to include broker data and predictive inputs, enabling more context-aware queries around expected auction participation or liquidity distribution.

The longer-term vision is more ambitious still: AI acting as a consolidated intelligence layer, aggregating market commentary, broker insights and news flows into a condensed, prioritised view for traders. Here, however, lies a subtle but significant shift. When AI curates information – deciding what is surfaced and what is filtered out – it begins to influence behaviour indirectly. Even if final execution decisions remain human-led, the framing of information shapes judgement.

This raises governance questions that extend beyond traditional algorithm approval processes. As AI systems move from passive tools to active intermediaries, compliance, model validation and oversight frameworks will need to evolve accordingly. The balance between scepticism and over-reliance will become a structural consideration, particularly as newer generations of traders enter the workforce with different levels of comfort around automation.

Data: The Non-Negotiable Foundation

Adaptive execution models are only as reliable as the inputs that feed them. Real-time, clean, non-stale data is essential if systems are to recalibrate during the life of an order. The well-worn principle of “garbage in, garbage out” takes on renewed urgency when models are adjusting dynamically rather than following static rules.
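One concrete defence against stale inputs is a freshness gate in front of any recalibration step. The sketch below is a minimal, hypothetical example: the 500ms tolerance is an assumed figure (in practice it would vary by asset class and venue), and the function name `is_usable` is illustrative.

```python
from datetime import datetime, timedelta, timezone
from typing import Optional

# Assumed tolerance for this sketch; real desks would tune this
# per asset class, venue and data feed.
MAX_STALENESS = timedelta(milliseconds=500)

def is_usable(tick_time: datetime,
              now: Optional[datetime] = None,
              max_staleness: timedelta = MAX_STALENESS) -> bool:
    """Return True only if the quote is fresh enough to drive a
    mid-order recalibration; otherwise the model should fall back
    to its last validated state rather than adapt on stale data."""
    now = now or datetime.now(timezone.utc)
    return (now - tick_time) <= max_staleness
```

The point of the gate is that an adaptive model which silently consumes a two-second-old quote is worse than a static one: it acts on garbage with full confidence.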

Yet investment capacity remains constrained. A significant proportion of technology budgets across the industry is still absorbed by maintaining legacy systems, leaving a comparatively limited allocation for experimentation and modernisation. For firms seeking to industrialise AI within trading workflows, legacy infrastructure drag remains a structural obstacle.

This tension, between ambition for adaptive systems and the realities of inherited architectures, is likely to shape the pace of change more than technological feasibility alone.

Tokenisation and the 24/7 Debate

Beyond execution mechanics, the discussion turned to market structure. Tokenisation was framed not as a peripheral digital asset trend, but as a potential redesign of settlement and trading norms. Real-time settlement, reduced friction and continuous trading were presented as logical extensions of ledger-based infrastructure.

The idea of T+0 settlement, in particular, challenges the incrementalism of current reforms. If technological capability exists for near-instantaneous ownership transfer, why optimise around T+1 rather than leapfrogging further?

However, moving to 24/7 markets introduces complex operational questions. Liquidity provision outside core hours, reference pricing integrity, surveillance frameworks and staffing models all require reconsideration. Continuous markets may reduce friction in theory, but they also redistribute risk and responsibility in ways that are not yet fully resolved.

The transition, therefore, is not merely technical. It is economic and regulatory. Exchanges, brokers, asset managers and regulators would all need to adapt in concert.

Designing the Hybrid Desk

Perhaps the most enduring takeaway from the session was the emphasis on hybridisation. The future trading desk is neither fully automated nor purely discretionary. It is designed around collaboration between adaptive systems and informed human oversight.

AI literacy, rather than deep technical specialisation, emerges as a critical skill. Traders are not expected to become data scientists, but they must understand how models function, what signals they generate and when intervention is required. The risk is less about displacement by AI and more about obsolescence through disengagement.

What was clear from this opening session is that the industry is not debating whether automation and AI will shape trading. That transformation is already underway. The strategic challenge now lies in integrating adaptive execution models, data governance, infrastructure investment and talent development into a coherent operating model.

The question facing trading technologists, therefore, is not whether to build the hybrid desk, but how deliberately they choose to engineer it.

