DiffusionData Targets Agentic AI in Finance with New MCP Server

Data technology firm DiffusionData has released an open-source server designed to connect Large Language Models (LLMs) with real-time data streams, aiming to facilitate the development of Agentic AI in financial services. The new Diffusion MCP Server uses the Model Context Protocol (MCP), an open standard that lets AI models interact with external tools and data sources.

The server enables AI assistants to interact with DiffusionData’s platform using natural language commands. This allows technical teams to perform operational and monitoring tasks – such as querying data streams or configuring system metrics – through conversational interfaces rather than writing code. The initiative is part of a wider industry trend exploring how autonomous agents can act on live data, moving beyond traditional analytics based on historical information.
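
To make the pattern concrete, here is a minimal sketch of how such a server could expose an operational task as an MCP tool, using the official MCP Python SDK's FastMCP helper. The tool name, topic path, and placeholder logic are illustrative assumptions, not DiffusionData's actual implementation.

```python
# Minimal sketch using the official MCP Python SDK (FastMCP helper).
# The tool name and the placeholder body are hypothetical; DiffusionData's
# actual server will differ.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("diffusion-demo")

@mcp.tool()
def get_topic_value(topic_path: str) -> str:
    """Return the current value of a data topic, e.g. 'prices/EURUSD'."""
    # A real server would query the streaming platform here; we return
    # a placeholder so the sketch stays self-contained.
    return f"<latest value of {topic_path}>"

if __name__ == "__main__":
    mcp.run()  # serve over stdio so an MCP-capable LLM host can call the tool
```

An LLM host connected to this server could then satisfy a request like "what is the latest EUR/USD price?" by calling get_topic_value, rather than requiring anyone to write bespoke client code.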

Raphael Vergnaud, Chief Revenue Officer at DiffusionData, explains the company’s longer-term strategy to TradingTech Insight. “MCP and our agents have initially been designed to interact with data streams in natural language, to consume data, but our vision – especially in capital markets – is to evolve into intelligent participants. This means moving beyond simply piping streaming data to models and instead combining real-time data with LLMs to build intelligence directly into the flow. We see this happening in stages: first establishing the right framework and infrastructure for streaming data; then adding intelligence; and ultimately enabling a controlled degree of autonomy where agents can act on that intelligence. In financial services, this must be underpinned by safeguards – security, transparency, auditability, and control. We expect a rapid evolution through these steps, and what we are building now is the foundation for that progression.”

Standardising AI and Data Communication

A significant technical hurdle for deploying AI in trading environments is bridging the gap between the static nature of LLMs and the dynamic, low-latency requirements of market data. DiffusionData’s approach is to use the MCP Server as a standardised communication layer that decouples the AI model from the underlying data infrastructure. This architecture is intended to allow firms to adopt new LLMs as they become available without re-engineering their data pipelines.

“MCP serves as a standardised language between our technology and LLMs – a format they inherently understand,” notes Huw Rees, Engineering Technology Officer at DiffusionData. “It’s designed to be flexible, so it can work with new models from day one. This architecture allows LLMs to interpret the DiffusionData environment, request the data they need, and receive it in a transparent, granular form. That level of detail lets users build what they want – whether using standard tools or proprietary systems – as long as any specific nuances are communicated to the LLM.”
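
The "standardised language" Rees describes is, at the wire level, JSON-RPC 2.0: every MCP-capable LLM host issues the same tools/call envelope regardless of which model sits behind it, which is what allows firms to swap models without re-engineering the data layer. A schematic request is shown below; the tool name and arguments are hypothetical, while the envelope follows the MCP specification.

```python
import json

# Schematic MCP "tools/call" request as an LLM host would send it over
# JSON-RPC 2.0. The tool name and arguments are hypothetical; the
# envelope itself is standard MCP.
request = {
    "jsonrpc": "2.0",
    "id": 7,
    "method": "tools/call",
    "params": {
        "name": "get_topic_value",
        "arguments": {"topic_path": "prices/EURUSD"},
    },
}
print(json.dumps(request, indent=2))
```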

The company also highlights model latency as a key barrier to adoption in capital markets. Its design brings AI inference capabilities to the data stream, a method intended to reduce delays associated with moving large datasets to a separate AI environment for processing.

Architectural Approach and Security

DiffusionData states that its background in real-time data distribution informs its strategy for integrating AI. The MCP Server is built upon the company’s existing platform, which was designed for scalable and secure data streaming. As a result, the server incorporates features such as role-based access controls and auditable event streams to log and trace AI interactions.

“What sets us apart is that we begin with data streaming rather than the LLM – that’s a fundamentally different way of solving this problem for organisations,” observes Vergnaud. “The DiffusionData platform has provided this integration layer for years, supported by a gateway adapter framework that connects to external systems. This is simply another form of moving data from A to B, but it builds on that deep experience. And from the outset, everything has been designed with strict safeguards – per user, per feed, per topic – which is critical and built in by default.”
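
As a purely hypothetical sketch of the per-user, per-topic safeguards Vergnaud describes (this is not DiffusionData's API), a server-side gate might map roles to permitted topic prefixes and write every decision to an audit log:

```python
import logging
from datetime import datetime, timezone

logging.basicConfig(level=logging.INFO)
audit = logging.getLogger("mcp.audit")

# Hypothetical role-based permissions: topic prefixes each role may read.
ROLE_TOPIC_PREFIXES = {
    "ops": ["metrics/", "status/"],
    "trader": ["prices/"],
}

def authorise_and_log(user: str, role: str, topic_path: str) -> bool:
    """Allow a tool call only if the role covers the topic, and audit it."""
    allowed = any(topic_path.startswith(prefix)
                  for prefix in ROLE_TOPIC_PREFIXES.get(role, []))
    audit.info("%s user=%s role=%s topic=%s allowed=%s",
               datetime.now(timezone.utc).isoformat(),
               user, role, topic_path, allowed)
    return allowed
```

Each tool invocation would pass through a check like this before touching the stream, producing the kind of auditable event trail the article refers to.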
