About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Data and Research: Two Sides of the Same Coin


By Evan Schnidman, President of Prattle, a Liquidnet company.

For decades, the asset management community has treated research and data as two distinct goods – separate commodities to be controlled by separate institutions. Data was the domain of Bloomberg, Thomson Reuters (now Refinitiv), FactSet, S&P, and others, while research was the domain of large banks and boutique providers whose analysts drafted copious long-form notes for human consumption.

But things have changed. The rapid spread of ‘alternative data’ has highlighted the inescapable fact that research and data are both crucial parts of the mix of information necessary to make sound investment decisions. Forward-thinking asset management firms have realised that research and data increasingly go hand in hand, and are leveraging this synergy to their advantage.

Data is an increasingly broad category, and research is increasingly quantitative. The current breadth of data can be traced back to the advent of what is now known as alternative data. Corporate exhaust data, purpose-built datasets, and the quantification of previously qualitative information have led to an explosion of datasets that can be used to both streamline the research process and provide quantitative rationale for decisions previously driven solely by qualitative information. This explosion in information has empowered portfolio managers and analysts to become increasingly ‘quantamental’, using a blend of data and research for optimal results.

With so many providers offering such a wide range of information in today’s market, defining alternative data is a nearly impossible task. That said, one fact remains clear: alternative data is much more similar to research than it is to market data. Where market data – such as price information – is purely a commodity necessary to conduct business in financial services, alternative data – and the investment signals it yields – remains far more open to interpretation and is thus more akin to research. In fact, much purpose-built alternative data is designed specifically to streamline the traditional research process. For instance, Prattle provides a tool to automatically extract the most salient remarks from company executives on each earnings call, essentially readymade quotations for a long-form research report.
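Prattle’s actual extraction model is proprietary, but the underlying idea – machine-selecting the most salient sentences from a transcript – can be illustrated with a generic term-frequency heuristic. The function below is a minimal sketch, not Prattle’s method: it ranks sentences by how often their content words recur across the whole transcript, on the assumption that executives’ key themes are the ones they repeat.

```python
import re
from collections import Counter

# Small illustrative stopword list; a production system would use a fuller one.
STOPWORDS = {"the", "a", "an", "and", "or", "of", "to", "in", "we", "our",
             "is", "are", "was", "were", "that", "this", "for", "on", "it"}

def top_remarks(transcript: str, k: int = 2) -> list[str]:
    """Return the k sentences whose content words recur most across the transcript.

    A generic extractive-salience heuristic, offered only to illustrate the
    notion of machine-produced 'readymade quotations'.
    """
    # Naive sentence split on terminal punctuation.
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", transcript) if s.strip()]
    # Document-wide frequency of content words.
    words = [w for w in re.findall(r"[a-z']+", transcript.lower()) if w not in STOPWORDS]
    freq = Counter(words)

    def score(sentence: str) -> float:
        toks = [w for w in re.findall(r"[a-z']+", sentence.lower()) if w not in STOPWORDS]
        # Average document frequency of the sentence's content words.
        return sum(freq[w] for w in toks) / (len(toks) or 1)

    return sorted(sentences, key=score, reverse=True)[:k]
```

Real research-automation tools layer entity recognition, sentiment, and learned salience models on top of this kind of baseline, but the input and output are the same: raw transcript in, quotable remarks out.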

This Prattle Core Comments feature is just one example of how alternative data can streamline the research process, but it is worth noting that tools of this kind can also help quantitative and fundamental managers reconcile their respective trading strategies. While quants are known for being purely systematic, striving to be free of bias, they often operate with incomplete information. Fundamental managers, on the other hand, often argue that their ability to assimilate both quantitative and qualitative information brings them closer to making investment decisions based on complete information, but their interpretation of that information is fraught with bias.

That is where alternative data comes in. By applying Natural Language Processing (NLP) coupled with machine learning and AI tools, unstructured, qualitative information, such as corporate earnings calls, can now be analysed in an unbiased, quantitative way. In this way, content becomes data. This data can not only be used as a directly tradable signal by quants, but can also serve as a check on the fundamental manager’s inherent cognitive bias. Applying technology in this manner can help strategies perform better, while also allowing quants and fundamental managers to speak the same language.
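The simplest form of this quantification is lexicon-based tone scoring: count sentiment-bearing words in the transcript and net them into a single number a quant can use as a signal and a fundamental manager can use as a bias check. The sketch below uses a tiny hypothetical word list purely for illustration; production financial-NLP systems rely on much larger, domain-specific lexicons and trained models.

```python
import re

# Hypothetical mini-lexicons for illustration only; real finance-tuned
# dictionaries contain thousands of terms.
POSITIVE = {"growth", "strong", "record", "beat", "improved", "confident"}
NEGATIVE = {"decline", "weak", "miss", "headwinds", "impairment", "uncertain"}

def sentiment_score(text: str) -> float:
    """Net tone in [-1, 1]: (positive hits - negative hits) / total hits.

    Returns 0.0 when no lexicon terms appear.
    """
    tokens = re.findall(r"[a-z]+", text.lower())
    pos = sum(t in POSITIVE for t in tokens)
    neg = sum(t in NEGATIVE for t in tokens)
    return (pos - neg) / max(pos + neg, 1)
```

Applied across every earnings call in a coverage universe, even a crude score like this turns qualitative remarks into a comparable, time-stamped series – which is precisely the transformation of content into data described above.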

The blurry line between research and data prompted Prattle to begin using the term ‘research automation’. Although this term has not caught on yet, it remains the most accurate way to describe the very best forms of alternative data. Whereas 20 years ago investors would do research by driving around to count the number of cars in Home Depot parking lots, now they can simply purchase satellite data with those same counts. This means that precisely the same information, the number of cars in parking lots, has transformed from research produced manually by humans into data produced automatically by machines. Such fluidity further highlights the blurriness of the line between research and data.

It is apparent that research and data are increasingly a distinction without a difference. Portfolio managers and analysts need both kinds of information to make informed investment decisions. The confluence of evolving AI and NLP technology with the rapid explosion in available and easily stored data has empowered asset managers to rethink how they procure information. Widespread adoption of these tools has the potential to increase returns, lower operating costs and transform active asset management.

