Talking Intelligent Trading with Peter Farley: To Really Get Big Data You Will Need Both Brains and Brawn

Intelligent Trading Technology was created last year to recognise the shifting emphasis in trading rooms that continue to search for “Alpha” and that elusive competitive advantage. It was intended both to embrace our existing Low Latency community and expertise, and to serve those rapidly adopting strategies built on “Big Data” and more sophisticated analytics.

Some said that “Big Data” was already old hat, just another piece of jargon for banks having yet another go at putting a workable strategy in place to improve their already challenging data environments. Others said it wasn’t worth the investment or the effort, offering intangible returns from what had become complex and costly initiatives.

Well, it seems the second wave of “Big Data” adoption is now taking off, after massive falls in the cost of storage and processing power and substantial advances in the capabilities of the smart software that can deliver meaningful answers. And, contrary to some perceptions, this is not just about blending masses of social media, geopolitical and customer preference data sets to produce new correlations to analyse; it is about delivering significant benefits to risk management, trading strategies and profitability.

What is clear is that you cannot have one without the other. According to Actian CEO Steve Shine, if banks tried to use the latest analytics to process massive data sets through existing infrastructure “you would just blow apart traditional platforms.” The key has been to leverage the open source capabilities of the likes of Hadoop or NoSQL, bolted onto the structures already there rather than replacing them.
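
To make that “bolt-on” idea concrete, here is a minimal sketch assuming a hypothetical PySpark deployment: new raw data is processed in Hadoop while the legacy relational store stays in place and is simply read alongside it. All paths, table names and connection details are illustrative assumptions, not details from the article.

```python
# A hedged sketch of the bolt-on pattern: open source processing (PySpark
# over HDFS) queried alongside an existing relational store, which is left
# in place rather than replaced. Every name below is an assumption.
import os

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("bolt-on-analytics").getOrCreate()

# New raw data lands in Hadoop; the legacy schema is untouched.
trades = spark.read.parquet("hdfs:///data/raw/trades")

# The existing positions table stays in the legacy database, read over JDBC.
positions = (
    spark.read.format("jdbc")
    .option("url", "jdbc:postgresql://legacy-db:5432/risk")  # assumed endpoint
    .option("dbtable", "positions")
    .option("user", "reader")
    .option("password", os.environ["LEGACY_DB_PASSWORD"])
    .load()
)

# Analytics then run across both sources in one place.
exposure = (
    trades.join(positions, "instrument_id")
          .groupBy("risk_category")
          .agg({"notional": "sum"})
)
exposure.show()
```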

Shine, speaking before a London seminar last week, cited the example of a bank whose risk platform was taking 12 hours to analyse the organisation’s exposures and liabilities across specific risk categories, which meant critical decisions were always being taken with information that was 12 hours out of date. The deployment of best-of-breed analytics with superior processing power has since cut this to a staggering two minutes. As recently as two or three years ago that would have been unthinkable: re-inventing the legacy architecture was not feasible, and the cost of an alternative was prohibitive.
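
The shape of that speed-up is easy to illustrate. The sketch below, a toy Python example rather than anything from the bank in question, contrasts a row-at-a-time aggregation of exposures, the style of many legacy batch jobs, with a single columnar pass of the kind modern analytic engines execute in parallel. The data and column names are invented for the demo.

```python
# Toy illustration only: aggregating exposures by risk category, first
# row-at-a-time (legacy batch style), then as one columnar grouped scan.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 1_000_000  # synthetic positions
df = pd.DataFrame({
    "risk_category": rng.integers(0, 50, n),
    "exposure": rng.normal(1e6, 2e5, n),
})

# Legacy style: touch each position individually.
slow_totals = {}
for cat, exp in zip(df["risk_category"], df["exposure"]):
    slow_totals[cat] = slow_totals.get(cat, 0.0) + exp

# Columnar style: one vectorised pass over contiguous arrays, and the same
# operation distributes naturally across many cores or machines.
fast_totals = df.groupby("risk_category")["exposure"].sum()
print(fast_totals.head())
```

Even on a single machine the columnar pass is typically one to two orders of magnitude faster; distributed across a cluster, a 12-hour batch collapsing to minutes stops looking surprising.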

In a short space of time, technological advances have driven prices sharply lower, probably halving them in each of the last three years so that they now stand at less than 20% of where they were (three successive halvings leave roughly 12.5% of the original cost). At the same time, more people are beginning to understand the power of “Big Data” without being overawed by the mass of data that needs to be addressed.

Senior executives, said Shine, are also now beginning to ask the right questions that enable meaningful answers to be delivered. “You have to start off by asking where’s the value and how do I turn that into a competitive advantage. What actually lies in all those sets of data domains that I think is important and how do I use it?”

It’s clear that more than a few are putting these capabilities to practical use. One Tier 1 bank (which remains unnamed) has applied them to its market risk systems, embracing hedging strategies and capital allocation among other variables. As a result of the superior oversight it now has of its exposures, it can mitigate risk more accurately and significantly reduce the capital held against what were perceived as potential liabilities. That delivers massive savings at a time when new regulations are forcing large increases in risk-weighted capital holdings.

Shine notes that “People are now being able to see the value inherent in seemingly uncorrelated data sets. It enables the business to shift from reactive to predictive.” One of the key problems is that the existing underlying technology “just doesn’t scale,” said Shine, adding, “businesses are going to be much more agile once they become better at managing risk.”

In a trading room environment this is enabling far more widespread use of data that was previously compartmentalised in different areas. “For example,” he said, “data that was historically the preserve of the back office for use in post-trade analytics is now being shifted to the front office, where it can enhance decision-making and deliver visual guidance for risk discovery. Finally a holistic view is available.”
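
As a hedged illustration of that shift, the sketch below takes post-trade fills, data that would once have stayed in the back office, and turns them into a running intraday position view a trader could watch. The file name and field names are assumptions for illustration, not anything described by Shine.

```python
# Hypothetical example: post-trade fills feeding a simple front-office
# risk view. Source file and column names are illustrative assumptions.
import matplotlib.pyplot as plt
import pandas as pd

fills = pd.read_csv("post_trade_fills.csv", parse_dates=["timestamp"])
fills["signed_qty"] = fills["quantity"] * fills["side"].map({"BUY": 1, "SELL": -1})

# Running net position per instrument as fills arrive through the day.
fills = fills.sort_values("timestamp")
fills["net_position"] = fills.groupby("instrument")["signed_qty"].cumsum()

# Visual guidance for risk discovery: plot positions building up intraday.
for name, grp in fills.groupby("instrument"):
    plt.plot(grp["timestamp"], grp["net_position"], label=name)
plt.legend()
plt.title("Intraday net positions from post-trade fills")
plt.show()
```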

But not everyone appears to be convinced. Many large financial institutions are “still addicted to the old business model,” said Shine.

Internal politics and job preservation are often behind this maintenance of the status quo, with many individuals responsible for data management feeling threatened by the power these new capabilities could unleash. “Too many individuals,” he noted, “are still wedded to the code-driven processes of the 1980s. But today it has been completely modernised without the need for coding at all, which makes the whole process much more accessible.” In addition, attempts to automate processes fully are still often half-hearted, meaning the really big benefits are not delivered.

It’s not surprising, therefore, that he now sees some “very clever companies” that have worked out how to use the data, and he expects a continuing exponential increase in adoption over the next two to three years. “It’s quite simple,” he said. “If you deal with data strategically you will outperform. The winners will come from those who successfully leverage data, and that gap is going to grow.”

So the “Big Data” message is clear. There are no half measures: you cannot just dip your toe in the water, but neither can you go in blind. To benefit from the combination of “Big Data” and analytics you need to develop an objective and a strategy, and commit both the brains to understand it and the brawn capable of delivering it.
