The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

Talking Intelligent Trading with Peter Farley: To Really Get Big Data You Will Need Both Brains and Brawn

Intelligent Trading Technology was created last year to recognise the shifting emphasis in trading rooms that continue to search for “Alpha” and that elusive competitive advantage. The aim was to embrace both our existing Low Latency community and expertise and those rapidly adopting strategies built on “Big Data” and more sophisticated analytics.

Some said that “Big Data” was already old hat, just another piece of jargon for banks that were having yet another go at putting in place a workable strategy for their already challenging data environments. Others said it wasn’t worth the investment or the effort, delivering intangible returns from what had become complex and costly initiatives.

Well, it seems the second wave of “Big Data” adoption is now taking off, after massive falls in the cost of storage and processing power and substantial advances in the capabilities of the smart software that can deliver meaningful answers. And, contrary to some perceptions, this is not just about blending together masses of social media, geopolitical and customer preference data sets to produce new correlations to analyse, but about delivering significant benefits to risk management, trading strategies and profitability.

What is clear is that you cannot have one without the other: the smart software is of little use without the processing muscle to run it. According to Actian CEO Steve Shine, if banks tried to use the latest analytics to process massive data sets through existing infrastructure, “you would just blow apart traditional platforms.” The key has been to leverage the open source capabilities of the likes of Hadoop or NoSQL, bolting them on to the structures already there rather than replacing them.
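To make that bolt-on pattern concrete, here is a minimal sketch, assuming a Spark-on-Hadoop cluster and using hypothetical file paths and column names rather than any real bank’s schema. The heavy aggregation happens on the cluster while the legacy warehouse remains the system of record, contributing only a nightly extract:

```python
# A minimal sketch of "bolt-on" open source analytics, not Actian's product.
# Assumes trade data already lands in Hadoop as Parquet and that the legacy
# warehouse supplies a nightly reference extract; all paths and column names
# are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("bolt-on-analytics").getOrCreate()

# Raw trade history lives on the Hadoop cluster (hypothetical path).
trades = spark.read.parquet("hdfs:///data/trades/")

# Counterparty reference data stays in the legacy warehouse; we read its
# nightly extract instead of migrating the system of record.
counterparties = spark.read.option("header", True).csv(
    "hdfs:///extracts/counterparties.csv"
)

# Aggregate gross exposure per counterparty and rating bucket on the cluster,
# returning only the small result set to downstream risk tools.
exposure = (
    trades.join(counterparties, on="counterparty_id")
          .groupBy("counterparty_id", "rating")
          .agg(F.sum("notional").alias("gross_exposure"))
)
exposure.show()
```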

Shine, speaking before a London seminar last week, cited the example of a bank whose risk platform was taking 12 hours to analyse the organisation’s exposures and liabilities across specific risk categories. Critical decisions were always being taken on information that was 12 hours out of date. The deployment of best-of-breed analytics with superior processing power has cut this to a staggering two minutes. As recently as two or three years ago this would have been unthinkable: re-inventing the legacy architecture was not feasible, and the cost of an alternative was prohibitive.
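As a toy illustration of why such speed-ups are plausible, and emphatically not the bank’s actual system, the snippet below rolls up exposures by risk category twice over made-up data: once row by row, the way many legacy processes work, and once as a single vectorised, columnar aggregation:

```python
# Toy benchmark: row-by-row exposure roll-up versus a single columnar
# aggregation over the same made-up positions. Illustrative only.
import time
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n = 2_000_000
positions = pd.DataFrame({
    "risk_category": rng.integers(0, 50, n),          # hypothetical buckets
    "exposure": rng.normal(1_000_000, 250_000, n),    # hypothetical exposures
})

# Legacy style: walk the positions one at a time.
start = time.perf_counter()
totals = {}
for cat, exp in zip(positions["risk_category"], positions["exposure"]):
    totals[cat] = totals.get(cat, 0.0) + exp
loop_secs = time.perf_counter() - start

# Columnar style: one vectorised group-by over the whole frame.
start = time.perf_counter()
vector_totals = positions.groupby("risk_category")["exposure"].sum()
vector_secs = time.perf_counter() - start

print(f"row-by-row: {loop_secs:.2f}s, vectorised: {vector_secs:.3f}s")
```

On commodity hardware the vectorised version is typically one to two orders of magnitude faster, and distributed engines extend the same principle across a cluster.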

In a short space of time, technological advances have driven prices lower, roughly halving in each of the last three years, leaving them at around an eighth of their former level (comfortably less than 20%). At the same time more people are beginning to understand the power of “Big Data” without being overawed by the mass of data that needs to be addressed.

Senior executives, said Shine, are also now beginning to ask the right questions, the kind that allow meaningful answers to be delivered. “You have to start off by asking where’s the value and how do I turn that into a competitive advantage. What actually lies in all those sets of data domains that I think is important, and how do I use it?”

It’s clear that more than a few are putting these capabilities to practical use. One Tier 1 bank (which remains unnamed) has applied them to its market risk systems, embracing hedging strategies and capital allocation among other variables. With the superior oversight it now has of its exposures, it can mitigate risk more accurately and significantly reduce the capital employed against what were perceived as potential liabilities. This delivers massive savings at a time when new regulations are forcing large increases in risk-weighted capital holdings.

“People are now able to see the value inherent in seemingly uncorrelated data sets,” Shine notes. “It enables the business to shift from reactive to predictive.” One of the key problems is that the existing underlying technology “just doesn’t scale,” he said, adding that “businesses are going to be much more agile once they become better at managing risk.”

In a trading room environment this is enabling far more widespread use of data that was previously compartmentalised in different areas. “For example,” he said, “data that was historically the preserve of the back office for use in post-trade analytics is now being shifted to the front office, where it can enhance decision-making and deliver visual guidance for risk discovery. Finally a holistic view is available.”

But not everyone appears to be convinced. Many large financial institutions are “still addicted to the old business model,” said Shine.

Internal politics and job preservation are often behind this maintenance of the status quo, with many individuals responsible for data management feeling threatened by the power these new capabilities could unleash. “Too many individuals,” he noted, “are still wedded to the code-driven processes of the 1980s. But today it has been completely modernised without the need for coding at all, which makes the whole process much more accessible.” In addition, attempts to fully automate processes remain half-hearted, meaning the really big benefits are not delivered.

It’s not surprising, therefore, that he now sees some “very clever companies” that have worked out how to use the data, and he expects a continuing exponential increase in adoption over the next two to three years. “It’s quite simple,” he said. “If you deal with data strategically you will outperform. The winners will come from those who successfully leverage data, and that gap is going to grow.”

So the “Big Data” message is clear. There are no half measures: you cannot just dip your toe in the water, but nor can you go in blind. To benefit from the combination of “Big Data” and analytics, you need to develop an objective and a strategy, and commit both the brains to understand it and the brawn capable of delivering it.
