
Intelligent Trading Summit: Making the Most of Big Data

Big Data is growing in both volume and adoption, but it must be clean, synchronised, analysed and applied sensibly to business problems if it is to reach its full potential in the trading environment.

Discussing Big Data for algo development and analytics at last week’s A-Team Group Intelligent Trading Summit, Peter Farley, director at A-Team Group, questioned an expert panel about the development of Big Data, what it comprises and how it is being used. Setting the scene, Roman Chwyl, worldwide financial services and insurance sales leader, IBM Platform Computing, said: “To make money, traders need to take on more risk and comply with more regulations, which means more computing. Today, big banks are taking on more data and processing, and they are leveraging Big Data tools.”

Nick Idelson, technical director at TraderServe, a provider of real-time trading applications, added: “Big Data is the underlying real-time data used by financial institutions. It grows all the time, but it is important to understand the quality of the data and whether it is contemporaneous. Banks are tackling the issues of Big Data and pulling together vast amounts of structured and unstructured data, including news information. They need to get it all into a system, process it and then use it, perhaps for automated trading or decision support, but they must be sure it is clean, synchronised and used sensibly.”
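
To make Idelson’s cleaning and synchronisation point concrete, here is a minimal sketch (not from the panel) that aligns a hypothetical news-sentiment feed with a tick feed using pandas: bad ticks are dropped, and each surviving tick is joined to the most recent sentiment value known at or before its timestamp, so the merged record never looks ahead. The field names and values are illustrative assumptions.

```python
import pandas as pd

# Hypothetical feeds: field names and values are illustrative only.
ticks = pd.DataFrame({
    "ts": pd.to_datetime(["2014-02-10 09:30:00.100",
                          "2014-02-10 09:30:00.350",
                          "2014-02-10 09:30:01.020"]),
    "price": [101.2, None, 101.3],   # None stands in for a bad tick
})
news = pd.DataFrame({
    "ts": pd.to_datetime(["2014-02-10 09:30:00.000",
                          "2014-02-10 09:30:00.900"]),
    "sentiment": [0.4, -0.2],
})

# Clean: drop unusable ticks rather than trade on them.
ticks = ticks.dropna(subset=["price"]).sort_values("ts")

# Synchronise: attach the most recent sentiment known at or before each
# tick, so the merged record never sees the future.
merged = pd.merge_asof(ticks, news.sort_values("ts"),
                       on="ts", direction="backward")
print(merged)
```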

Picking up on this point, Peter Simpson, senior vice president of research and development at Datawatch, a provider of data visualisation software, said: “Successful implementations of Big Data are around business problems and consider where the data is coming from, how clean it is, how it can be analysed and the required business outcomes. Projects should focus on end business requirements and then work back to the technology stack.”

Turning to the issue of social media as a contributor to Big Data, Idelson explained: “The need is to take into account how low the signal-to-noise ratio is in financial markets and the potential correlation problems of social media. Social media can be used to generate sentiment numbers, but these may change quickly, raising the question of whether the numbers had a real relationship to what was happening in the market when a trade was processed. In my view, this problem has not yet been adequately solved.”
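
One way to see the instability Idelson describes is to track the correlation between a sentiment series and returns over a rolling window rather than the full sample. The sketch below uses synthetic data in which the sentiment signal is deliberately built to decay; the data and window length are assumptions for illustration, not a model of any real feed.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)
n = 500

# Synthetic series: a sentiment signal whose link to returns fades
# over time, mimicking the unstable relationship described above.
returns = pd.Series(rng.normal(0.0, 0.01, n))
fade = np.linspace(1.0, 0.0, n)            # signal strength decays
sentiment = pd.Series(fade * returns.to_numpy() + rng.normal(0.0, 0.01, n))

# A full-sample correlation would average the decay away; a rolling
# 60-observation window shows the signal draining out of the data.
rolling_corr = returns.rolling(60).corr(sentiment)
print(rolling_corr.iloc[[100, 250, 450]])
```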

Simpson continued: “When we think about Big Data, we think about unstructured data. We might process the data, put algos on a chip to optimise performance and run a machine next to an exchange, but things are always moving, so someone needs to be looking at the data. When there is so much data that no one can look at every item, someone must look for what is not expected to happen. Competitive advantage is in the human element of looking for unusual activity, finding the problem behind it and acting quickly to resolve it.”
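
Simpson’s point about looking for “what is not expected to happen” amounts to automated anomaly flagging that feeds human review. A minimal sketch follows, assuming a rolling z-score is an adequate first filter; real surveillance systems would use far richer models.

```python
import random
import statistics
from collections import deque

def flag_unusual(stream, window=50, threshold=4.0):
    """Yield (index, value, z-score) for observations far outside the
    recent norm: candidates for human review, not automatic action."""
    recent = deque(maxlen=window)
    for i, x in enumerate(stream):
        if len(recent) >= 10:                 # need some history first
            mu = statistics.fmean(recent)
            sigma = statistics.pstdev(recent) or 1e-9   # guard against zero
            z = (x - mu) / sigma
            if abs(z) > threshold:
                yield i, x, z
        recent.append(x)

# An injected spike in otherwise unremarkable data gets flagged.
random.seed(1)
data = [random.gauss(100.0, 1.0) for _ in range(200)]
data[120] = 130.0
for idx, value, z in flag_unusual(data):
    print(f"index {idx}: value {value:.1f}, z {z:.1f}")
```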

These are relatively early days in the development of Big Data applications and tools, a fact borne out by differences of opinion among panellists about the benefits Big Data is delivering. Simpson said Big Data is becoming part of life and is already in use for algo testing and back-testing against trading realities, while Chwyl said Big Data is providing benefits to the extent that IBM is considering producing multi-tenanted, multi-application shared infrastructure for Big Data. Idelson was more sceptical about the benefits of Big Data, saying: “There is a lot of Big Data analysis going on and banks can make more money, but they seem to be gaining more advantage than their customers.”
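
For readers new to the back-testing Simpson mentions, the toy example below runs a moving-average crossover against synthetic prices, applying each bar’s signal only to the next bar’s return so the test never trades on information it could not have had. The strategy, data and parameters are illustrative assumptions.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
prices = pd.Series(100.0 * np.exp(np.cumsum(rng.normal(0.0, 0.01, 1000))))

# Toy signal: long when the 10-bar average is above the 50-bar average.
signal = (prices.rolling(10).mean() > prices.rolling(50).mean()).astype(float)

# Apply the prior bar's signal to the next bar's return, so the test
# never trades on information it could not have had.
rets = np.log(prices).diff()
pnl = (signal.shift(1) * rets).fillna(0.0)
print(f"toy strategy log return: {pnl.sum():.4f}")
```

A real back-test would add transaction costs, slippage and out-of-sample validation; only the mechanics matter here.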
