DiffusionData Targets Agentic AI in Finance with New MCP Server

Data technology firm DiffusionData has released an open-source server designed to connect Large Language Models (LLMs) with real-time data streams, aiming to facilitate the development of Agentic AI in financial services. The new Diffusion MCP Server uses the Model Context Protocol (MCP), an open standard for AI models to interact with external tools and data sources.
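
For illustration, the sketch below shows how an MCP-capable host application might connect to a locally running instance of such a server and discover the tools it exposes, using the open-source MCP TypeScript SDK (@modelcontextprotocol/sdk). The launch command, file name and client name are assumptions for illustration only and are not taken from DiffusionData's documentation.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Hypothetical launch command; a real deployment would use the command
// documented for the Diffusion MCP Server distribution.
const transport = new StdioClientTransport({
  command: "node",
  args: ["diffusion-mcp-server.js"],
});

const client = new Client({ name: "example-host", version: "0.1.0" });
await client.connect(transport);

// Discover the tools the server exposes, then hand them to whichever LLM the host uses.
const { tools } = await client.listTools();
console.log(tools.map((t) => t.name));

await client.close();
```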

The server enables AI assistants to interact with DiffusionData’s platform using natural language commands. This allows technical teams to perform operational and monitoring tasks – such as querying data streams or configuring system metrics – through conversational interfaces rather than writing code. The initiative is part of a wider industry trend exploring how autonomous agents can act on live data, moving beyond traditional analytics based on historical information.
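
As a rough sketch of the pattern (not DiffusionData's actual implementation), an operational task such as reading the latest value on a topic could be exposed as an MCP tool that an assistant invokes when a user asks for it in plain English. The tool name, its parameters and the fetchTopicValue helper below are hypothetical.

```typescript
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

const server = new McpServer({ name: "diffusion-mcp-sketch", version: "0.1.0" });

// Hypothetical tool: return the latest value published on a given topic.
server.tool(
  "get_topic_value",
  { topicPath: z.string().describe("Topic path, e.g. markets/fx/EURUSD") },
  async ({ topicPath }) => {
    const value = await fetchTopicValue(topicPath);
    return { content: [{ type: "text", text: JSON.stringify(value) }] };
  }
);

// Placeholder: a real server would fetch this via the Diffusion client SDK.
async function fetchTopicValue(topicPath: string): Promise<unknown> {
  return { topicPath, value: null };
}

const transport = new StdioServerTransport();
await server.connect(transport);
```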

Raphael Vergnaud, Chief Revenue Officer at DiffusionData, explains the company’s longer-term strategy to TradingTech Insight. “MCP and our agents have initially been designed to interact with data streams in natural language, to consume data, but our vision – especially in capital markets – is to evolve into intelligent participants. This means moving beyond simply piping streaming data to models and instead combining real-time data with LLMs to build intelligence directly into the flow. We see this happening in stages: first establishing the right framework and infrastructure for streaming data; then adding intelligence; and ultimately enabling a controlled degree of autonomy where agents can act on that intelligence. In financial services, this must be underpinned by safeguards – security, transparency, auditability, and control. We expect a rapid evolution through these steps, and what we are building now is the foundation for that progression.”

Standardising AI and Data Communication

A significant technical hurdle for deploying AI in trading environments is bridging the gap between the static nature of LLMs and the dynamic, low-latency requirements of market data. DiffusionData’s approach is to use the MCP Server as a standardised communication layer that decouples the AI model from the underlying data infrastructure. This architecture is intended to allow firms to adopt new LLMs as they become available without re-engineering their data pipelines.
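
One way to picture that decoupling: the host's tool-calling loop depends only on the MCP client interface, so the model behind it can be swapped without touching the streaming side. In the sketch below, ChatModel is a hypothetical stand-in for any LLM backend; only the MCP calls are real SDK methods.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";

// Hypothetical stand-ins for "any LLM backend" and the tool call it may emit.
interface ToolSpec { name: string; description?: string; inputSchema: unknown }
interface ToolCall { name: string; arguments: Record<string, unknown> }
interface ChatModel {
  complete(prompt: string, tools: ToolSpec[]): Promise<string | ToolCall>;
}

// The data side stays fixed: tools are discovered from the MCP server at runtime,
// so replacing `model` with a newer LLM leaves the data pipeline untouched.
async function runTurn(model: ChatModel, mcp: Client, prompt: string) {
  const { tools } = await mcp.listTools();
  const reply = await model.complete(prompt, tools as ToolSpec[]);
  if (typeof reply === "string") return reply; // plain answer, no data access needed
  return mcp.callTool({ name: reply.name, arguments: reply.arguments });
}
```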

“MCP serves as a standardised language between our technology and LLMs – a format they inherently understand,” notes Huw Rees, Engineering Technology Officer at DiffusionData. “It’s designed to be flexible, so it can work with new models from day one. This architecture allows LLMs to interpret the DiffusionData environment, request the data they need, and receive it in a transparent, granular form. That level of detail lets users build what they want – whether using standard tools or proprietary systems – as long as any specific nuances are communicated to the LLM.”

The company also highlights model latency as a key barrier to adoption in capital markets. Its design brings AI inference capabilities to the data stream, a method intended to reduce delays associated with moving large datasets to a separate AI environment for processing.
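
The sketch below illustrates that contrast in the abstract, with stub functions standing in for a real streaming subscription (for example via the Diffusion client SDK) and a model served alongside the stream; none of the names are DiffusionData APIs.

```typescript
type StreamValue = { topicPath: string; payload: Record<string, unknown> };

// Stub subscription: a real deployment would subscribe via the streaming client SDK.
function subscribe(topicSelector: string, onValue: (v: StreamValue) => void): () => void {
  const timer = setInterval(
    () => onValue({ topicPath: topicSelector, payload: { mid: Math.random() } }),
    1000
  );
  return () => clearInterval(timer);
}

// Stub scoring function standing in for a model served next to the stream.
async function inferInline(payload: Record<string, unknown>): Promise<number> {
  return Number(payload.mid ?? 0); // placeholder "anomaly score"
}

// In-stream inference: each update is scored as it arrives, with no bulk export to a
// separate AI environment and no round trip before the result can be acted on.
const stop = subscribe("markets/fx/EURUSD", async (update) => {
  const score = await inferInline(update.payload);
  if (score > 0.9) console.warn(`Flagged update on ${update.topicPath}`, update.payload);
});

setTimeout(stop, 10_000); // end the demo after ten seconds
```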

Architectural Approach and Security

DiffusionData states that its background in real-time data distribution informs its strategy for integrating AI. The MCP Server is built upon the company’s existing platform, which was designed for scalable and secure data streaming. As a result, the server incorporates features such as role-based access controls and auditable event streams to log and trace AI interactions.
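
As an illustration of the general pattern (not the platform's actual mechanism), the sketch below wraps a tool handler so that each AI-initiated call is checked against a role's allow-list and appended to an audit trail before it executes. All names are hypothetical.

```typescript
type ToolHandler = (args: Record<string, unknown>) => Promise<unknown>;

interface AuditRecord {
  timestamp: string;
  principal: string; // identity the AI session acts under
  tool: string;
  args: Record<string, unknown>;
}

const auditLog: AuditRecord[] = [];

// Hypothetical role-based allow-list: which tools each principal may invoke.
const allowedTools: Record<string, string[]> = {
  "monitoring-agent": ["get_topic_value", "list_metrics"],
};

// Wrap a handler so every call is authorised and recorded before it runs.
function guardTool(principal: string, tool: string, handler: ToolHandler): ToolHandler {
  return async (args) => {
    if (!(allowedTools[principal] ?? []).includes(tool)) {
      throw new Error(`${principal} is not permitted to call ${tool}`);
    }
    auditLog.push({ timestamp: new Date().toISOString(), principal, tool, args });
    return handler(args);
  };
}

// Usage: guard a placeholder topic-query handler before registering it as an MCP tool.
const auditedQuery = guardTool("monitoring-agent", "get_topic_value", async (args) => {
  return { topicPath: args.topicPath, value: null }; // placeholder result
});
```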

“What sets us apart is that we begin with data streaming rather than the LLM – that’s a fundamentally different way of solving this problem for organisations,” observes Vergnaud. “The DiffusionData platform has provided this integration layer for years, supported by a gateway adapter framework that connects to external systems. This is simply another form of moving data from A to B, but it builds on that deep experience. And from the outset, everything has been designed with strict safeguards – per user, per feed, per topic – which is critical and built in by default.”
