
Talking Intelligent Trading with Peter Farley: To Really Get Big Data You Will Need Both Brains and Brawn

Intelligent Trading Technology was created last year to recognise the shifting emphasis in trading rooms that continue to search for “Alpha” and that elusive competitive advantage. The aim was to embrace both our existing Low Latency community and expertise and those rapidly adopting strategies that use “Big Data” and more sophisticated analytics.

Some said that “Big Data” was already old hat, just another piece of jargon for banks that were having yet another go at putting a workable strategy in place to improve their challenging data environments. Others said it wasn’t worth the investment or the effort, with intangible returns from what had become complex and costly initiatives.

Well, it seems the second wave of “Big Data” adoption is now taking off, after massive falls in the cost of storage and processing power and substantial advances in the capabilities of the smart software that can deliver meaningful answers. And, contrary to some perceptions, this is not just about blending together masses of social media, geopolitical and customer preference data sets to produce new correlations to analyse, but about delivering significant benefits to risk management, trading strategies and profitability.

What is clear is that you cannot have one without the other: the smart software needs the raw horsepower beneath it. According to Actian CEO Steve Shine, if banks tried to use the latest analytics to process massive data sets through existing infrastructure, “you would just blow apart traditional platforms.” The key has been to leverage the open source capabilities of the likes of Hadoop or NoSQL, bolted on to the structures already there rather than replacing them.
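
As a rough illustration of that bolt-on pattern, the sketch below offloads an exposure roll-up to a Hadoop cluster with PySpark while the legacy warehouse stays in place. This is a minimal sketch, not Actian’s product or any particular bank’s system; the framework choice, file paths and column names are all assumptions made for the example.

```python
# Hypothetical sketch of the "bolt-on" pattern described above: positions
# exported from the legacy warehouse are aggregated on a Hadoop cluster,
# rather than inside the legacy platform itself. Paths and column names
# are invented for illustration.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("risk-rollup").getOrCreate()

# Nightly export from the existing warehouse, landed on HDFS as Parquet.
positions = spark.read.parquet("hdfs:///risk/positions")

# Net exposure per risk category, computed in parallel across the cluster.
exposure = (positions
            .groupBy("risk_category", "desk")
            .agg(F.sum("market_value").alias("net_exposure")))

# Results written back for the existing risk platform to consume.
exposure.write.mode("overwrite").parquet("hdfs:///risk/exposure_rollup")
```

The point of the pattern is that nothing in the legacy stack is replaced: the heavy aggregation simply moves out to hardware that scales.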

Shine, speaking before a London seminar last week, cited the example of a bank whose risk platform was taking 12 hours to analyse the organisation’s exposures and liabilities across specific risk categories. It meant that critical decisions were always being taken with information that was 12 hours out of date. The deployment of best-of-breed analytics with superior processing power has reduced this to a staggering 2 minutes. As recently as 2-3 years ago this would have been unthinkable: re-inventing the legacy architecture was not possible, and the cost of an alternative was prohibitive.

In a short space of time, technological advances have driven prices lower, probably halving in each of the last three years, leaving them at roughly an eighth of where they were. At the same time, more people are beginning to understand the power of “Big Data” without being overawed by the mass of data that needs to be addressed.

Senior executives, said Shine, are also now beginning to ask the right questions, the ones that enable meaningful answers to be delivered. “You have to start off by asking: where’s the value, and how do I turn that into a competitive advantage? What actually lies in all those data domains that I think is important, and how do I use it?”

It’s clear that more than a few are putting these capabilities to practical use. One Tier 1 bank (which remains unnamed) has applied them to its market risk systems, embracing hedging strategies and capital allocation among other variables. As a result of the superior oversight it now has of its exposures, it can mitigate risk more accurately and significantly reduce the capital employed against what were perceived as potential liabilities. This delivers massive savings at a time when new regulations are forcing large increases in risk-weighted capital holdings.
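
To make the capital mechanics concrete, here is a back-of-the-envelope illustration with invented figures, not drawn from the bank above. It assumes Basel-style rules under which a bank holds capital of at least 8% of risk-weighted assets, so measuring exposures more finely (for example, recognising hedged positions at a lower risk weight) directly reduces the capital that must be set aside.

```python
# Invented figures: how finer-grained exposure measurement frees capital
# under Basel-style rules (minimum capital = 8% of risk-weighted assets).
MIN_CAPITAL_RATIO = 0.08  # Basel minimum total capital ratio

def required_capital(exposure, risk_weight):
    """Capital to hold against one exposure: exposure x weight x 8%."""
    return exposure * risk_weight * MIN_CAPITAL_RATIO

# Coarse view: a $500m book conservatively risk-weighted at 100%.
coarse = required_capital(500_000_000, 1.00)

# Finer-grained analytics show $300m of it is hedged, earning a 20%
# weight (the weight itself is a hypothetical figure for illustration).
refined = (required_capital(200_000_000, 1.00)
           + required_capital(300_000_000, 0.20))

print(f"coarse measurement:  ${coarse:,.0f}")   # $40,000,000
print(f"refined measurement: ${refined:,.0f}")  # $20,800,000
```

On these invented numbers, better measurement alone roughly halves the capital set aside, which is the kind of saving Shine describes.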

Shine notes that “people are now able to see the value inherent in seemingly uncorrelated data sets. It enables the business to shift from reactive to predictive.” One of the key problems is that the existing underlying technology “just doesn’t scale”, said Shine, adding, “businesses are going to be much more agile once they become better at managing risk.”

In a trading room environment this is enabling far more widespread use of data that was previously compartmentalised in different areas. “For example,” he said, “data that was historically the preserve of the back office for use in post-trade analytics is now being shifted to the front office, where it can enhance decision-making and deliver visual guidance for risk discovery. Finally a holistic view is available.”

But not everyone appears to be convinced. Many large financial institutions are “still addicted to the old business model,” said Shine.

Internal politics and job preservation are often behind this maintenance of the status quo, with many individuals responsible for data management feeling threatened by the potential unleashing of power from these new capabilities. “Too many individuals,” he noted, “are still wedded to the code-driven processes of the 1980s. But today it has been completely modernised without the need for coding at all, which makes the whole process much more accessible.” In addition, attempts to fully automate processes are still often half-hearted, meaning the really big benefits are not delivered.

It’s not surprising, therefore, that he now sees some “very clever companies” that have worked out how to use the data, and he expects a continuing exponential increase in adoption over the next 2-3 years. “It’s quite simple,” he said. “If you deal with data strategically you will outperform. The winners will come from those who successfully leverage data, and that gap is going to grow.”

So the “Big Data” message is clear. There are no half measures and you cannot just dip your toe in the water. But you also cannot go into it blind. To benefit from the combination of “Big Data” and analytics you need to develop an objective and a strategy, and commit both the brains to understand it and the brawn capable of delivering it.
