Intelligent Trading Summit: Making the Most of Big Data

Big Data is growing in both volume and adoption, but it needs to be clean, synchronised, analysed and used sensibly in tackling business problems if it is to reach its full potential in the trading environment.

Discussing Big Data for algo development and analytics at last week’s A-Team Group Intelligent Trading Summit, Peter Farley, director at A-Team Group, questioned an expert panel about the development of Big Data, what it comprises and how it is being used. Setting the scene, Roman Chwyl, worldwide financial services and insurance sales leader, IBM Platform Computing, said: “To make money, traders need to take on more risk and comply with more regulations, which means more computing. Today, big banks are taking on more data and processing, and they are leveraging Big Data tools.”

Nick Idelson, technical director at TraderServe, a provider of real-time trading applications, added: “Big Data is the underlying real-time data used by financial institutions. It grows all the time, but it is important to understand the quality of the data and whether it is contemporaneous. Banks are tackling the issues of Big Data and pulling together vast amounts of structured and unstructured data, including news information. They need to get it all into a system, process it and then use it, perhaps for automated trading or decision support, but they must be sure it is clean, synchronised and used sensibly.”
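
None of the panellists described implementations, but Idelson's point about cleanliness and synchronisation can be made concrete. The following is a purely illustrative Python sketch of a pre-trade data-quality gate, assuming a pandas DataFrame of ticks with hypothetical timestamp, symbol and price columns and an arbitrary freshness threshold; it is not drawn from TraderServe or any panellist's system.

```python
import pandas as pd

# Illustrative only: drop ticks that are malformed or too stale to be
# contemporaneous before they feed automated trading or decision support.
# Column names and the 500ms threshold are assumptions for this sketch.

MAX_STALENESS = pd.Timedelta("500ms")  # hypothetical freshness limit

def quality_gate(ticks: pd.DataFrame, now: pd.Timestamp) -> pd.DataFrame:
    """Keep only clean, time-ordered, contemporaneous ticks."""
    ticks = ticks.dropna(subset=["timestamp", "symbol", "price"])  # no gaps
    ticks = ticks[ticks["price"] > 0]           # basic sanity check
    ticks = ticks.sort_values("timestamp")      # enforce time order
    fresh = (now - ticks["timestamp"]) <= MAX_STALENESS  # contemporaneity
    return ticks[fresh]
```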

Picking up on this point, Peter Simpson, senior vice president of research and development at Datawatch, a provider of data visualisation software, said: “Successful implementations of Big Data are around business problems and consider where the data is coming from, how clean it is, how it can be analysed and the required business outcomes. Projects should focus on end business requirements and then work back to the technology stack.”

Turning to the issue of social media as a contributor to Big Data, Idelson explained: “The need is to take into account how low the signal-to-noise ratio is in financial markets and the potential correlation problems of social media. Social media can be used to generate sentiment numbers, but these may change quickly, raising the problem of whether the numbers had a real relationship to what was happening in the market when a trade was processed. In my view, this problem has not yet been adequately solved.”
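
The correlation problem Idelson raises can be illustrated with a simple diagnostic: checking whether a sentiment series held any stable relationship to returns over time. The sketch below uses synthetic stand-in data and a hypothetical 60-minute window; it illustrates the check, not a solution to the problem he says remains unsolved.

```python
import numpy as np
import pandas as pd

# Illustrative only: probe whether sentiment numbers bore a stable
# relationship to returns, using synthetic stand-in series.
rng = np.random.default_rng(0)
idx = pd.date_range("2014-02-01", periods=500, freq="min")
returns = pd.Series(rng.normal(0, 1e-4, 500), index=idx)   # stand-in returns
sentiment = pd.Series(rng.normal(0, 1.0, 500), index=idx)  # stand-in scores

# A 60-minute rolling correlation: if it swings in sign and magnitude,
# any "signal" in the sentiment numbers was unlikely to be real at the
# moment a trade was processed.
rolling_corr = returns.rolling(60).corr(sentiment)
print(rolling_corr.describe())
```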

Simpson continued: “When we think about Big Data, we think about unstructured data. We might process the data, put algos on a chip to optimise performance and run a machine next to an exchange, but things are always moving so someone needs to be looking at the data. When there is so much data no-one can look at every item, so someone must look for what is not expected to happen. Competitive advantage is in the human element of looking for unusual activity, finding the problem behind it and acting quickly to resolve it.”
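
Simpson's “look for what is not expected to happen” translates naturally into automated exception surfacing, so the human element can be concentrated where it matters. The sketch below is a crude, purely illustrative example: flagging intervals whose traded volume deviates sharply from its recent baseline, with the window and threshold chosen arbitrarily rather than taken from any panellist's system.

```python
import numpy as np
import pandas as pd

# Illustrative only: surface observations a human should investigate,
# rather than asking anyone to eyeball every item in the stream.
def unusual_activity(volume: pd.Series, window: int = 120,
                     z: float = 4.0) -> pd.Series:
    """Return points more than `z` rolling standard deviations from
    the rolling mean -- candidates for human follow-up."""
    mean = volume.rolling(window).mean()
    std = volume.rolling(window).std()
    return volume[(volume - mean).abs() > z * std]

# Demo on synthetic per-minute volume with one planted anomaly.
vol = pd.Series(np.random.default_rng(1).poisson(1000, 1000).astype(float))
vol.iloc[500] = 20_000            # the "unexpected" event
print(unusual_activity(vol))      # should flag index 500
```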

These are relatively early days in the development of Big Data applications and tools, a fact borne out by differences of opinion among panellists about the benefits Big Data is delivering. Simpson said Big Data is becoming part of life and is already in use for algo testing and back-testing against trading realities, while Chwyl said Big Data is providing benefits to the extent that IBM is considering producing multi-tenanted, multi-application shared infrastructure for Big Data. Idelson was more sceptical about the benefits of Big Data, saying: “There is a lot of Big Data analysis going on and banks can make more money, but they seem to be gaining more advantage than customers.”
