Talking Intelligent Trading with Peter Farley: To Really Get Big Data You Will Need Both Brains and Brawn

Intelligent Trading Technology was created last year to recognise the shifting emphasis in trading rooms that continue to search for “Alpha” and that elusive competitive advantage. It was designed both to embrace our existing Low Latency community and expertise, and to serve those rapidly adopting strategies built on “Big Data” and more sophisticated analytics.

Some said that “Big Data” was already old hat, just another piece of jargon for banks having yet another go at putting in place a workable strategy to improve already challenging data environments. Others said it wasn’t worth the investment or the effort, with intangible returns from what had become complex and costly initiatives.

Well, it seems the second wave of “Big Data” adoption is now taking off, after massive falls in the cost of storage and processing power and substantial advances in the capabilities of the smart software that can deliver meaningful answers. And, contrary to some perceptions, this is not just about blending together masses of social media, geopolitical and customer preference data sets to produce new correlations to analyse, but about the ability to deliver significant benefits to risk management, trading strategies and profitability.

What is clear is that you cannot have one without the other. According to Actian CEO Steve Shine, if banks tried to use the latest analytics to process massive data sets through existing infrastructure “you would just blow apart traditional platforms.” The key has been to leverage the open source capabilities of the likes of Hadoop or NoSQL, bolted on to the structures already there rather than replacing them.
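
For readers who want to see what “bolting on” might look like in practice, here is a minimal sketch of that pattern, assuming a hypothetical nightly Spark job. The connection details, table and column names are purely illustrative and not drawn from Actian or any particular bank:

```python
# Hypothetical nightly job: the existing relational warehouse stays in place;
# only the heavy aggregation is pushed out to a Spark/Hadoop cluster.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("exposure-rollup").getOrCreate()

# Read positions straight from the existing store -- no migration required.
positions = (
    spark.read.format("jdbc")
    .option("url", "jdbc:postgresql://warehouse.internal/risk")  # illustrative
    .option("dbtable", "positions")
    .option("user", "readonly")
    .load()
)

# The expensive part -- a firm-wide rollup by risk category -- runs on the cluster.
rollup = positions.groupBy("risk_category").agg(
    F.sum("exposure").alias("gross_exposure"),
    F.count("*").alias("position_count"),
)

# Results land alongside the legacy platform rather than replacing it.
rollup.write.mode("overwrite").parquet("/analytics/exposure_rollup")
```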

Shine, speaking before a London seminar last week, cited the example of a bank whose risk platform was taking 12 hours to analyse the organisation’s exposures and liabilities across specific risk categories. It meant that critical decisions were always being taken with information that was 12 hours out of date. But the deployment of best-of-breed analytics with superior processing power has reduced this to a staggering two minutes. As recently as two or three years ago this would have been unthinkable: it was not possible to re-invent the legacy architecture, and the cost of an alternative was prohibitive.
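
To put that claim in context, 12 hours is 720 minutes, so getting to two minutes implies a speed-up in the region of 360x, which takes both more compute and software able to use it. The sketch below, again purely illustrative and not the bank’s actual system, shows the broad pattern: fanning the per-category exposure calculations out across processes instead of running them one after another.

```python
# Illustrative pattern only: compute per-category exposure rollups in
# parallel rather than sequentially. The category list and dummy data
# are stand-ins for a real positions store.
from concurrent.futures import ProcessPoolExecutor
import random

RISK_CATEGORIES = ["credit", "market", "liquidity", "operational"]  # illustrative

def aggregate_category(category: str) -> tuple[str, float]:
    """Stand-in for the heavy per-category exposure calculation."""
    exposures = [random.gauss(0, 1_000_000) for _ in range(100_000)]  # dummy data
    return category, sum(exposures)

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:
        totals = dict(pool.map(aggregate_category, RISK_CATEGORIES))
    print(totals)
```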

In a short space of time, technological advances have driven prices lower, probably halving in each of the last three years to stand at less than 20% of where they were. At the same time, more people are beginning to understand the power of “Big Data”, while no longer being overawed by the mass of data that needs to be addressed.
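
The arithmetic behind that figure is simple enough to check:

```python
# Halving the cost in each of three successive years leaves
# 0.5 ** 3 = 12.5% of the original price -- comfortably "less than 20%".
price = 100.0
for year in range(3):
    price *= 0.5
print(price)  # 12.5
```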

Senior executives, said Shine, are also now beginning to ask the right questions that enable meaningful answers to be delivered. “You have to start off by asking where’s the value and how do I turn that into a competitive advantage. What actually lies in all those sets of data domains that I think is important and how do I use it?”

It’s clear that more than a few are putting these capabilities to practical use. One Tier 1 bank (which remains unnamed) has applied them to its market risk systems, embracing hedging strategies and capital allocation among other variables. As a result of the superior oversight it now has of its exposures, it can mitigate risk more accurately and significantly reduce the capital employed against what were perceived as potential liabilities. This delivers massive savings at a time when new regulations are forcing large increases in risk-weighted capital holdings.
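
A toy example, with made-up numbers, of why a consolidated view of exposures can shrink the capital set aside: positions that look like separate liabilities in siloed systems partly offset one another once they are seen together.

```python
# Toy numbers only: siloed systems each reserve capital against their own
# gross exposure, while a consolidated view lets offsetting positions net
# down before the capital ratio is applied.
CAPITAL_RATIO = 0.08  # illustrative risk-weighting factor

desk_exposures = {"rates_desk": 500.0, "credit_desk": -300.0, "fx_desk": 150.0}

# Siloed view: capital held against each desk's absolute exposure.
siloed_capital = sum(abs(e) for e in desk_exposures.values()) * CAPITAL_RATIO

# Consolidated view: offsetting exposures net down first.
netted_capital = abs(sum(desk_exposures.values())) * CAPITAL_RATIO

print(f"siloed capital: {siloed_capital:.1f}")  # 76.0
print(f"netted capital: {netted_capital:.1f}")  # 28.0
```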

Shine notes that “People are now being able to see the value inherent in seemingly uncorrelated data sets. It enables the business to shift from reactive to predictive.” One of the key problems is that the existing underlying technology “just doesn’t scale,” said Shine, adding: “Businesses are going to be much more agile once they become better at managing risk.”

In a trading room environment this is enabling far more widespread use of data that was previously compartmentalised in different areas. “For example,” he said, “data that was historically the preserve of the back office for use in post-trade analytics is now being shifted to the front office, where it can enhance decision-making and deliver visual guidance for risk discovery. Finally, a holistic view is available.”

But not everyone appears to be convinced. Many large financial institutions are “still addicted to the old business model,” said Shine.

Internal politics and job preservation are often behind this maintenance of the status quo, with many individuals responsible for data management feeling threatened by the potential unleashing of power from these new capabilities. “Too many individuals,” he noted, “are still wedded to the code-driven processes of the 1980s. But today it has been completely modernised without the need for coding at all, which makes the whole process much more accessible.” In addition, attempts to fully automate processes are still often half-hearted, meaning the really big benefits are not delivered.

It’s not surprising, therefore, that he now sees some “very clever companies” that have worked out how to use the data, and he expects a continuing exponential increase in adoption over the next two to three years. “It’s quite simple,” he said. “If you deal with data strategically you will outperform. The winners will come from those who successfully leverage data, and that gap is going to grow.”

So the “Big Data” message is clear. There are no half measures and you cannot just dip your toe in the water. But you also cannot go into it blind. To benefit from the combination of “Big Data” and analytics you need to develop an objective and a strategy, and commit to both the brains to understand it and the brawn capable of delivering it.
