Datawatch Adds Panopticon Streams Real-Time Stream Processing Engine

Datawatch has increased the speed of real-time streaming and time series data analytics with Panopticon Streams, a stream processing engine that can be used as a stand-alone solution or in conjunction with Panopticon’s Visual Analytics platform.

Peter Simpson, vice president of visualisation strategy at Datawatch Panopticon, says: “Capital markets customers will benefit from Panopticon Streams’ support of several key use cases, including best execution, real-time P&L, transaction cost analysis and trader and trading surveillance.

“The addition of the engine’s capabilities means we now offer a general purpose streaming analytics platform. It has applications anywhere organisations need to identify anomalies and outliers, investigate their causes, back test potential solutions, and then alter their business processes to address the issue. Given the software’s ability to handle real-time and time series data, we believe it will be most useful in electronic trading, telecommunications, energy, and IoT applications.”

The combination of stream processing, rapid data comprehension through visual analysis, and faster investigation through time series analysis and playback down to the individual tick is designed to help organisations make timely, more informed decisions that have immediate financial impact.

Built on the Apache Kafka platform, Panopticon’s solutions enable business users to build sophisticated Kafka data flows with no coding. Users who understand the business problems can create their own data flows, which can use information from a number of sources and incorporate joins, aggregations, conflations, calculations, unions and merges, and alerts. Analysts can visualise processed data using Panopticon Visual Analytics and deliver it to Kafka, Kx kdb+, InfluxDB, or any SQL database.
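
Panopticon’s flow builder is proprietary and no-code, so the sketch below is only a rough illustration, written directly against the Apache Kafka Streams API (Java, Kafka 3.x), of the kind of topology such a flow might compile down to: it consumes a hypothetical "trades" topic keyed by instrument symbol, counts trades per symbol over one-minute windows, and publishes the result to an output topic. The topic names, serdes and broker address are assumptions for illustration, not details of Datawatch’s product.

import java.time.Duration;
import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.KeyValue;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.Grouped;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.Produced;
import org.apache.kafka.streams.kstream.TimeWindows;

public class TradeFlowSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "trade-flow-sketch");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed broker address

        StreamsBuilder builder = new StreamsBuilder();

        // Hypothetical input topic: key = instrument symbol, value = raw trade message.
        KStream<String, String> trades =
                builder.stream("trades", Consumed.with(Serdes.String(), Serdes.String()));

        // A simple aggregation of the kind a no-code flow might configure:
        // count trades per symbol over one-minute tumbling windows.
        trades.groupByKey(Grouped.with(Serdes.String(), Serdes.String()))
              .windowedBy(TimeWindows.ofSizeWithNoGrace(Duration.ofMinutes(1)))
              .count()
              .toStream()
              .map((windowedKey, count) -> KeyValue.pair(windowedKey.key(), count.toString()))
              // Deliver the processed stream back to Kafka, where a downstream
              // consumer or sink connector can pick it up.
              .to("trade-counts-1m", Produced.with(Serdes.String(), Serdes.String()));

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}

In the product itself a flow like this would be assembled visually rather than coded, and the output could equally be routed onward to kdb+, InfluxDB or a SQL database.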
