
DiffusionData Targets Agentic AI in Finance with New MCP Server

Data technology firm DiffusionData has released an open-source server designed to connect Large Language Models (LLMs) with real-time data streams, aiming to facilitate the development of Agentic AI in financial services. The new Diffusion MCP Server uses the Model Context Protocol (MCP), an open standard for AI models to interact with external tools and data sources.
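
For illustration, a minimal MCP server can be sketched with the official MCP TypeScript SDK. The get_topic_value tool below is a hypothetical stand-in invented for this article, not part of DiffusionData’s actual server.

```typescript
// Minimal MCP server sketch using the official TypeScript SDK.
// The "get_topic_value" tool is hypothetical; it is not DiffusionData's API.
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

const server = new McpServer({ name: "demo-streaming-mcp", version: "0.1.0" });

// Register a tool the LLM can discover and call by name.
server.tool(
  "get_topic_value",
  { topicPath: z.string().describe("Path of the data topic to read") },
  async ({ topicPath }) => ({
    // A real server would fetch the live value from the streaming platform.
    content: [{ type: "text", text: `Latest value for ${topicPath}: 101.25` }],
  })
);

// stdio transport: the host application (e.g. an AI assistant) spawns this
// process and exchanges JSON-RPC messages with it over stdin/stdout.
await server.connect(new StdioServerTransport());
```

An MCP-capable assistant launched against this process could then discover and call get_topic_value with no bespoke integration code.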

The server enables AI assistants to interact with DiffusionData’s platform using natural language commands. This allows technical teams to perform operational and monitoring tasks – such as querying data streams or configuring system metrics – through conversational interfaces rather than writing code. The initiative is part of a wider industry trend exploring how autonomous agents can act on live data, moving beyond traditional analytics based on historical information.
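
In practice, the conversational layer reduces to standard protocol messages: when a user asks for the latest value on a data stream, the assistant emits an MCP tools/call request in JSON-RPC 2.0. The tool name and topic path below are illustrative only.

```typescript
// What a conversational request reduces to on the wire: MCP messages are
// JSON-RPC 2.0. The tool name and topic path here are illustrative only.
const toolCall = {
  jsonrpc: "2.0",
  id: 42,
  method: "tools/call",
  params: {
    name: "get_topic_value", // hypothetical tool, not DiffusionData's API
    arguments: { topicPath: "markets/fx/EURUSD" },
  },
};
```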

Raphael Vergnaud, Chief Revenue Officer at DiffusionData, explains the company’s longer-term strategy to TradingTech Insight. “MCP and our agents have initially been designed to interact with data streams in natural language, to consume data, but our vision – especially in capital markets – is to evolve into intelligent participants. This means moving beyond simply piping streaming data to models and instead combining real-time data with LLMs to build intelligence directly into the flow. We see this happening in stages: first establishing the right framework and infrastructure for streaming data; then adding intelligence; and ultimately enabling a controlled degree of autonomy where agents can act on that intelligence. In financial services, this must be underpinned by safeguards – security, transparency, auditability, and control. We expect a rapid evolution through these steps, and what we are building now is the foundation for that progression.”

Standardising AI and Data Communication

A significant technical hurdle for deploying AI in trading environments is bridging the gap between the static nature of LLMs and the dynamic, low-latency requirements of market data. DiffusionData’s approach is to use the MCP Server as a standardised communication layer that decouples the AI model from the underlying data infrastructure. This architecture is intended to allow firms to adopt new LLMs as they become available without re-engineering their data pipelines.
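
One way to picture that decoupling: the tool contract is the only surface shared between the model side and the data side, so either can change independently. The sketch below is hypothetical and not DiffusionData’s code.

```typescript
// Hypothetical sketch of the decoupling. The MCP tool contract is the only
// surface shared between the model side and the data side.
interface StreamBackend {
  read(topicPath: string): Promise<string>; // any streaming platform behind it
}

// The handler depends only on the StreamBackend interface, so the data
// infrastructure can be swapped without touching the model integration;
// and because every MCP-capable model speaks the same tools/call protocol,
// newer LLMs can be adopted without re-engineering this layer.
function makeGetTopicValueTool(backend: StreamBackend) {
  return async ({ topicPath }: { topicPath: string }) => ({
    content: [{ type: "text" as const, text: await backend.read(topicPath) }],
  });
}
```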

“MCP serves as a standardised language between our technology and LLMs – a format they inherently understand,” notes Huw Rees, Engineering Technology Officer at DiffusionData. “It’s designed to be flexible, so it can work with new models from day one. This architecture allows LLMs to interpret the DiffusionData environment, request the data they need, and receive it in a transparent, granular form. That level of detail lets users build what they want – whether using standard tools or proprietary systems – as long as any specific nuances are communicated to the LLM.”
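
The discovery Rees describes is built into the protocol: a client asks an MCP server what it offers via a tools/list request and receives machine-readable tool descriptions that any MCP-capable model can interpret. The response below follows the shape defined in the MCP specification, populated with the hypothetical tool from the earlier sketch.

```typescript
// Shape of a tools/list response per the MCP specification: the model reads
// these JSON Schema descriptions to learn what data it can request and how.
// The tool itself is the hypothetical example from the earlier sketch.
const toolsListResponse = {
  jsonrpc: "2.0",
  id: 7,
  result: {
    tools: [
      {
        name: "get_topic_value",
        description: "Read the latest value published on a data topic",
        inputSchema: {
          type: "object",
          properties: { topicPath: { type: "string" } },
          required: ["topicPath"],
        },
      },
    ],
  },
};
```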

The company also highlights model latency as a key barrier to adoption in capital markets. Its design brings AI inference capabilities to the data stream, a method intended to reduce delays associated with moving large datasets to a separate AI environment for processing.
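
In outline, that means invoking the model inside the update path rather than exporting history to a separate environment first. The sketch below is purely illustrative of the pattern, with an assumed infer function standing in for any model call.

```typescript
// Illustrative only: the model is invoked inline on each stream update,
// rather than batch-exporting historical data to a separate AI environment.
// "infer" is an assumed stand-in for any hosted or local model call.
async function onUpdate(
  topicPath: string,
  value: string,
  infer: (prompt: string) => Promise<string>,
): Promise<void> {
  const verdict = await infer(`Is this update on ${topicPath} anomalous? ${value}`);
  if (verdict.includes("ANOMALY")) {
    console.warn(`alert on ${topicPath}: ${verdict}`);
  }
}
```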

Architectural Approach and Security

DiffusionData states that its background in real-time data distribution informs its strategy for integrating AI. The MCP Server is built upon the company’s existing platform, which was designed for scalable and secure data streaming. As a result, the server incorporates features such as role-based access controls and auditable event streams to log and trace AI interactions.
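
As a purely hypothetical sketch of how such safeguards can wrap a tool handler (a per-topic role check before the read, an audit record after), none of the names below are Diffusion’s actual security API.

```typescript
// Hypothetical guard around a tool handler: role-based access check first,
// then an auditable record of what the AI read. Not DiffusionData's API.
type Role = "viewer" | "operator" | "admin";

interface Caller { user: string; roles: Role[]; }

function authorise(caller: Caller, topicPath: string): void {
  // Per-user, per-topic rule: only operators may read operational topics.
  if (topicPath.startsWith("ops/") && !caller.roles.includes("operator")) {
    throw new Error(`access denied: ${caller.user} -> ${topicPath}`);
  }
}

function audit(caller: Caller, topicPath: string): void {
  // A real system would append this to a durable, queryable event stream.
  console.log(JSON.stringify({
    ts: new Date().toISOString(),
    user: caller.user,
    action: "tools/call get_topic_value",
    topicPath,
  }));
}
```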

“What sets us apart is that we begin with data streaming rather than the LLM – that’s a fundamentally different way of solving this problem for organisations,” observes Vergnaud. “The DiffusionData platform has provided this integration layer for years, supported by a gateway adapter framework that connects to external systems. This is simply another form of moving data from A to B, but it builds on that deep experience. And from the outset, everything has been designed with strict safeguards – per user, per feed, per topic – which is critical and built in by default.”
