The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

Practitioners Discuss Use of Cloud Technologies for Signal Extraction

Capital markets firms’ appetite for cloud technologies is growing apace, driven by the promise of improved efficiency, reduced costs, faster development of new products and services, and the ability to do all of this at scale.

Cloud usage within the financial markets is growing across a wide range of business areas. Firms are exploring how they can utilise the cloud to ingest and analyse market data and new sources of information (including a fast-growing assortment of alternative data) to enhance the investment process and discover potential new sources of alpha.

This development – essentially the acceptance of cloud technologies by business areas long considered to be unsuitable – was the subject of a recent webinar hosted by ICE Data Services and featuring guest panellists from Amazon Web Services and SIGTech, a technology spinoff from the investment management firm Brevan Howard Asset Management. The discussion offered good insights into how quant groups can use cloud-hosted infrastructure to access data and analytics, and process that data to produce trading and investment signals.

Current Trends

The webinar panellists – Balaji Gopalan, Principal Solution Architect at Amazon Web Services (AWS), Chao Yan, Chief Technology Officer at SIGTech, and Satish Vedantam, Director of Quantitative Engineering at ICE Data Services – highlighted five key trends that are driving further adoption of cloud amongst quantitative investment firms.

First is a marked increase in confidence in cloud security. Until relatively recently, there was a general lack of trust in the security of the cloud. That is now changing, as capital markets firms across the industry come to recognise the security and compliance of cloud solutions. Some regulators, including FINRA in the US, are even migrating their own workloads to the cloud.

The second observation was the need to scale. Last year saw unprecedented spikes in both volume and volatility across asset classes, as a result of the global pandemic and other world events. In this environment, firms using the cloud have been able to successfully scale and adapt to the ‘new normal’ of employees and traders working from home.

The third trend identified by panellists was the need for increased network connectivity and integration. As financial markets become more interdependent, market participants need to be able to connect with each other and with their customers rapidly and at scale. Cloud can help facilitate this.

Fourth, panellists noted an increasing scope for innovation. As firms look to develop and bring to market new products and services, the cloud gives them the ability to experiment and either fail fast, or quickly scale up if a new offering is successful.

Finally, there is growing use of AI and machine learning. The cloud presents a wide range of opportunities for data scientists to use AI and ML for analysing and processing large data sets, across many business functions.

Data Quality

With the expansion of new data sources, panellists discussed various approaches that firms can take to improve the quality of the data they are sourcing via the cloud. One is to measure the accuracy and timeliness of the data against recognised benchmarks. Another is to build a robust set of challenge criteria and challenge processes. Data can also be mapped against, and integrated with, existing reference data.
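The first two of these approaches – measuring timeliness and accuracy against agreed thresholds – can be sketched as simple validation checks. The record layout, thresholds and values below are illustrative assumptions, not drawn from any specific vendor feed:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical tick record: (timestamp, price). Thresholds are illustrative.
def check_timeliness(ticks, max_staleness=timedelta(seconds=5), now=None):
    """Flag ticks that arrived later than an agreed staleness threshold."""
    now = now or datetime.now(timezone.utc)
    return [t for t, _ in ticks if now - t > max_staleness]

def check_accuracy(ticks, benchmark, tolerance=0.01):
    """Flag prices deviating from a benchmark by more than a relative tolerance."""
    return [(t, p) for t, p in ticks if abs(p - benchmark) / benchmark > tolerance]

now = datetime.now(timezone.utc)
ticks = [(now - timedelta(seconds=1), 100.2),
         (now - timedelta(seconds=30), 99.8),   # stale arrival
         (now - timedelta(seconds=2), 103.5)]   # off-benchmark price
stale = check_timeliness(ticks, now=now)
outliers = check_accuracy(ticks, benchmark=100.0)
print(len(stale), len(outliers))  # 1 1
```

In practice these checks would run inside an ingestion pipeline, with failed records routed to a challenge process rather than simply printed.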

There are also a number of data validation tools that can be used to cleanse data as it is being ingested, and to categorise it with appropriate metadata. Cloud operator AWS, for example, works with various partners to offer solutions that can make data available to users in a cleaner and more accessible way. Its AWS PrivateLink service offers secure interconnectivity between data sources and data consumers via the AWS backbone, ensuring that data is not contaminated by being exposed to the public internet. And AWS Data Exchange allows users to discover and subscribe to ‘clean’ third-party data sets, which is particularly useful for alternative data.

These types of solutions, the audience heard, help address some of the challenges end-users face when using the cloud to integrate and unify data from multiple data vendors through a single platform or interface.

Use of new platforms and technologies

In recent years, there’s been a steady evolution from file-based to more real-time data access through the use of APIs. Increasingly, applications are accessing cloud data warehouses, which offer virtually unlimited scalability of data storage as opposed to in-house data warehouses, which are more difficult to scale.

This approach gives firms the ability to analyse petabytes of data using tools like Amazon Athena, Google Cloud BigQuery or Presto. It also enables firms to adopt a ‘no ops’ strategy, because they are not running their own data infrastructure. A further advantage is that firms can operate in a cloud-agnostic way, selecting the appropriate cloud service provider to match the relevant workload.

Firms are also increasingly using other cloud technologies, such as on-demand GPUs, which enable massive parallel processing at scale and are essential for some ML applications. Streaming analytics platforms like KX’s kdb+ are also now available on the cloud, via the AWS Marketplace for example, providing firms with a clear migration path to the cloud without having to re-code applications.

From an AI and ML perspective, firms can use tools like Amazon SageMaker Autopilot to automatically build, train and tune their own machine learning models based upon multiple sources of data. This type of technology is increasingly being used by quantitative investment managers to identify trading signals.
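To make the idea of a trading signal concrete without invoking any specific managed service, here is a deliberately minimal, hand-rolled example: a moving-average crossover computed in pure Python. The price series and window sizes are toy assumptions; real quant pipelines train models on far richer data:

```python
def moving_average(prices, window):
    """Simple moving averages over a price series (one value per full window)."""
    return [sum(prices[i - window + 1:i + 1]) / window
            for i in range(window - 1, len(prices))]

def crossover_signal(prices, fast=3, slow=5):
    """Return 'long' when the fast MA sits above the slow MA, else 'flat'."""
    fast_ma = moving_average(prices, fast)[-1]
    slow_ma = moving_average(prices, slow)[-1]
    return "long" if fast_ma > slow_ma else "flat"

prices = [100, 101, 102, 104, 107, 111]  # upward-trending toy series
signal = crossover_signal(prices)
print(signal)  # long
```

An automated tool in the Autopilot mould would, in effect, search over many candidate features and models rather than a single fixed rule like this one.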

It was clear from the webinar discussion that the public cloud continues to offer new and exciting opportunities for quant teams to develop, test and fine tune their trading and investment strategies.
