
Bringing Alternative Data into the Mainstream


By Martijn Groot, Vice President of Marketing and Strategy at Asset Control.

Alternative and unstructured data is rapidly going mainstream. In a recent survey conducted by Adox Research for Asset Control, more than a third of respondents from financial institutions (36%) labelled these new types of data as ‘high importance’ drivers for investment. Alternative data inventory guides and consulting firms are growing in number too.

The appeal of alternative data to financial services organisations is clear: it can change the ‘data game’ within these firms, adding insight and information not available via traditional content products. The ability to ‘travel across’ different types of data and put them in a meaningful context, not only for generating alpha but also for understanding current operational status, is attractive for any financial services business. That, in a nutshell, explains the growing interest in alternative data. After all, one definition of intelligence and learning is the ability to make new connections.

We are seeing alternative data used to help shape investment decisions. Many quant funds use alternative data to gain an advantage in market insight. But while this kind of data has huge potential, there are big gaps between how it is used today and how it could be used in the future. Alternative data is often not structured and organised in the same way as traditional data feeds, so part of the curation job that data vendors normally do is shifted in-house.
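
To give a feel for what taking on that curation job can look like, here is a minimal, purely illustrative Python sketch. It assumes a hypothetical raw news item from an alternative feed and maps it onto an in-house record that can later be joined with traditional data sets; the field names and payload are invented for the example.

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical raw item as it might arrive from an alternative data feed:
# loosely structured, vendor-specific field names, no common identifiers.
raw_item = {
    "headline": "ACME Corp beats Q3 sales estimates",
    "body": "ACME Corp reported quarterly sales of $1.2bn...",
    "ts": "2023-10-19T07:45:00Z",
    "tickers": ["ACME"],
    "src": "newswire-x",
}

@dataclass
class CuratedNewsRecord:
    """Normalised record suitable for joining with traditional data sets."""
    ticker: str
    published_at: datetime
    headline: str
    source: str

def curate(item: dict) -> list[CuratedNewsRecord]:
    """Map a vendor-specific payload onto the in-house schema."""
    published = datetime.fromisoformat(item["ts"].replace("Z", "+00:00"))
    return [
        CuratedNewsRecord(
            ticker=t.upper(),
            published_at=published,
            headline=item["headline"].strip(),
            source=item["src"],
        )
        for t in item.get("tickers", [])
    ]

print(curate(raw_item))
```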

Plotting the Route Forward

Currently though, while more and more alternative or unstructured data is coming into financial services organisations, there is little activity focused on mastering and structuring it. In line with the industry shift from data warehouses to data lakes, the balance of effort has moved from the inbound side to the outbound side. Traditionally, in the data warehouse era, data was pre-processed to cast incoming data sets into a predefined schema, making querying very easy once the data was in. In data lakes, data ingestion is fast and easy in the absence of schema constraints, but more effort has to be put into the querying side to connect and process the data and get to the answers hidden in the data sets.
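
That contrast can be sketched in a few lines of Python. The example below is only an illustration of the general schema-on-write versus schema-on-read trade-off described above, not any particular product: the ‘warehouse’ path shapes rows at load time, while the ‘lake’ path accepts any payload and only imposes structure when a query runs.

```python
import json

# Schema-on-write (warehouse style): validate and shape rows before loading,
# so queries are trivial afterwards. The cost is paid once, at ingestion time.
WAREHOUSE_SCHEMA = ("ticker", "date", "close")

def load_into_warehouse(row: dict) -> tuple:
    return tuple(row[col] for col in WAREHOUSE_SCHEMA)

# Schema-on-read (lake style): ingestion is just an append of the raw payload;
# the shaping effort moves to query time.
lake = []

def ingest_into_lake(payload: str) -> None:
    lake.append(payload)            # no validation, no schema constraints

def query_lake_closing_prices(ticker: str) -> list[float]:
    out = []
    for payload in lake:
        doc = json.loads(payload)   # structure is only imposed here
        if doc.get("ticker") == ticker and "close" in doc:
            out.append(float(doc["close"]))
    return out

warehouse_row = load_into_warehouse(
    {"ticker": "ACME", "date": "2023-10-19", "close": 41.3}
)
ingest_into_lake('{"ticker": "ACME", "date": "2023-10-19", "close": "41.3"}')
ingest_into_lake('{"ticker": "ACME", "note": "free-text comment, no price"}')
print(warehouse_row)
print(query_lake_closing_prices("ACME"))
```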

That is limiting the insight firms can achieve when using data to address complex problems, and it is now starting to give them serious pause for thought. If they are looking at a particular company, for example, how do they bring together the structured data (traditional equity, CDS and corporate bond data sets) and unstructured data (indirect estimates of sales figures, earnings call transcripts, news feeds, social media) they hold on it, join it up, and look at all of it as a single source in order to make the most informed business decision? The answer lies in machine learning techniques that can systematically link data sets, detect outliers and help users get to the answers hidden in the data.
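
As a rough illustration of that idea, the Python sketch below joins a hypothetical structured price series with an equally hypothetical daily sentiment score distilled from unstructured sources, and uses a standard outlier-detection model (scikit-learn’s IsolationForest, chosen here purely as an example) to flag days where the combined pattern looks unusual. All figures are invented.

```python
import pandas as pd
from sklearn.ensemble import IsolationForest

# Structured side: a hypothetical daily closing-price series for one company.
prices = pd.DataFrame({
    "date": pd.date_range("2023-10-02", periods=8, freq="B"),
    "close": [41.0, 41.2, 40.9, 41.1, 44.8, 41.3, 41.4, 41.2],
})

# Unstructured side, already reduced to a daily score in an upstream step
# (e.g. sentiment extracted from earnings call transcripts or news feeds).
sentiment = pd.DataFrame({
    "date": pd.date_range("2023-10-02", periods=8, freq="B"),
    "news_sentiment": [0.1, 0.0, -0.1, 0.0, 0.9, 0.1, 0.0, -0.1],
})

# Join the two views of the same entity on the shared date key.
joined = prices.merge(sentiment, on="date")
joined["return"] = joined["close"].pct_change().fillna(0.0)

# Flag days where the combined (return, sentiment) pattern looks anomalous.
model = IsolationForest(contamination=0.15, random_state=0)
joined["outlier"] = model.fit_predict(joined[["return", "news_sentiment"]]) == -1

print(joined[joined["outlier"]][["date", "close", "news_sentiment"]])
```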

Where we are seeing the most significant developments is in rapidly closing the gap in the tools needed to integrate these data sources into day-to-day business workflows – overcoming the key challenge of bringing structured and unstructured data together. Crudely put, if you can’t put a data source to use, interest in it will rapidly fade.

Adoption areas range from compliance (early use cases studied behavioural patterns in large transaction data sets) to gaining an investment edge. In the latter case, an analogy can be drawn between using alternative data sets and building an increasingly accurate and complete model, map or representation of the financial world. It is rather like moving from a simple compass to the latest, most technologically advanced satnav. Alternative data sets provide additional detail, or even additional angles, for users to explore.

Furthermore, the tooling required to navigate and map this new data world is growing, keeping users from getting lost and failing to see the wood for the trees. Putting this kind of tooling in place is important. Imposing structure, inferring connections between datasets in structured and unstructured environments, and detecting patterns across all these areas is the task of data management; it drives insight and makes the new datasets actionable. A further important area of use cases is risk assessment, where the data intensity of risk and reporting processes is likely to continue to grow.
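
One small example of inferring such connections is linking free-text company mentions from unstructured sources to a structured security master. The sketch below uses simple fuzzy string matching from Python’s standard library; the names and identifiers are hypothetical, and a production setup would rely on richer entity-resolution techniques.

```python
import difflib

# Structured side: a hypothetical security master keyed by official name.
security_master = {
    "ACME Corporation": "ACME",
    "Globex International PLC": "GLBX",
    "Initech Holdings": "INTC.H",
}

# Unstructured side: company mentions as they appear in news text or transcripts.
mentions = ["Acme Corp", "Globex Intl", "Umbrella Group"]

def link_mention(mention: str, cutoff: float = 0.6) -> str | None:
    """Infer a connection from a free-text mention to the structured master."""
    names = {name.lower(): name for name in security_master}
    candidates = difflib.get_close_matches(
        mention.lower(), names.keys(), n=1, cutoff=cutoff
    )
    return security_master[names[candidates[0]]] if candidates else None

for m in mentions:
    print(f"{m!r} -> {link_mention(m)}")
# 'Acme Corp' and 'Globex Intl' resolve to ACME and GLBX; 'Umbrella Group'
# has no counterpart in the master and stays unlinked for manual review.
```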

While the amount of alternative data available to financial services firms is growing all the time – and the number of use cases escalating – there is still a lot of work to do to harness and more efficiently manage alternative data, and to integrate, or at least align, it with more structured data to gain comprehensive insight into patterns and trends. It will take time for financial services to realise the potential that alternative data offers, but with tools and solutions becoming increasingly capable of aligning unstructured with structured data, the future for this new data class as a driver of business value looks very bright indeed.
