
Data Quality and Management Crucial to Effective ESG Analysis

Analytics are helping financial institutions mine crucial insights from ESG data to guide their decision-making processes.

Analyses that can surface investment and hedging strategies, shape risk positioning and streamline regulatory compliance all work best when the underlying datasets are properly managed. But ingesting information and making it usable remains a fundamental challenge, especially because ESG data arrives in firms’ systems in a variety of unstructured formats.

Overcoming these data quality and management issues will help ease the obstacles to analysing ESG data, experts told A-Team ESG Insight’s most recent webinar.

With the rapid increase in the volume of ESG data that companies use, financial firms are having to make some far-reaching data management decisions, especially when it comes to mastering and cleaning.

Sourcing data from third-party vendors offers solutions to many of these shortcomings. But vendor datasets present their own challenges in the form of incomplete coverage and the differing methodologies behind the scores and ratings each provider supplies.

Data Shortage

To overcome these issues, data managers are putting in more work to create data that is rich and in optimal shape for use by powerful analytical tools such as machine learning and natural language processing, Nirav Shah, head of ESG analytics at M&G, told the “Approaches to ESG data for analytics” webinar.

Among the benefits that enrichment brings is the ability for analysts to trace the linkages between data and financial instruments. This not only offers insight into specific investments but also helps in the creation of models that can estimate instrument- and sector-level values where data isn’t available, Shah said.
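
For illustration only, the sketch below fills gaps with a sector-level average – a far simpler stand-in for the estimation models Shah describes. The use of pandas, the column names and the scoring scale are all assumptions, not details from the webinar.

```python
import pandas as pd

# Hypothetical holdings table: instrument-level ESG scores with gaps.
holdings = pd.DataFrame({
    "instrument": ["bond_a", "bond_b", "equity_c", "equity_d"],
    "sector": ["utilities", "utilities", "tech", "tech"],
    "esg_score": [62.0, None, 81.0, None],  # vendor coverage is incomplete
})

# Estimate each missing instrument-level score from its sector-level mean,
# a crude proxy for the kind of model described above.
sector_mean = holdings.groupby("sector")["esg_score"].transform("mean")
holdings["esg_score_filled"] = holdings["esg_score"].fillna(sector_mean)
holdings["is_estimated"] = holdings["esg_score"].isna()  # flag proxy values for analysts

print(holdings)
```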

Nevertheless, this approach is vulnerable to a common issue that webinar participants agreed is probably the biggest stumbling block they face – a shortage of data.

Roshini Johri, head of analytics at HSBC, said that the broadest datasets possible are needed to create analytical models. Because there is a dearth of information from some parts of the world and from within some industrial sectors, the data record is skewed towards geographies and industries that have good disclosure practices.

Consequently, some of the insight-driving models used by financial institutions are inherently biased towards the markets with the best reporting practices.
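
To see how this skew arises, consider measuring disclosure coverage by region before training a model. The figures and column names below are invented purely to illustrate the point.

```python
import pandas as pd

# Invented figures: share of companies per region with usable ESG disclosures.
universe = pd.DataFrame({
    "region": ["Europe", "North America", "Latin America", "Africa"],
    "companies": [5000, 6000, 1200, 900],
    "with_disclosure": [4100, 4300, 310, 140],
})
universe["coverage"] = universe["with_disclosure"] / universe["companies"]
print(universe.sort_values("coverage", ascending=False))

# A model trained only on disclosing firms over-represents the
# high-coverage regions, reproducing the reporting bias described above.
```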

Shah agreed, arguing that despite the growing amount of data companies are using, more is still needed to fully enrich their datasets. Analysts are still a long way from having enough information to form a coherent view of the investments their companies have made or intend to make, he said.

The Importance of Reliability

Dataset comparability is another fundamental requirement for analytical operations that is being hobbled by data shortages, said Boyke Baboelal, strategic solutions director for the Americas at Alveo. To identify relative value in assets and companies, it’s important that analysts are using like-for-like data, but gaps in the data record and the widely differing scores and metrics offered by vendors and other sources make that difficult.

Getting the data management piece right requires effective data mastering processes and the right technology, participants agreed. If the data is absorbed into the enterprise’s systems in the right shape, then it can be more easily slotted into analytical models and integrated with other, more structured, data.
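
What “the right shape” can mean in practice is sketched below: two hypothetical vendor feeds, each with its own field names and scoring scale, are mastered onto a single schema. The feeds, fields and scales are assumptions for illustration, not any vendor’s actual format.

```python
import pandas as pd

# Two hypothetical vendor feeds with incompatible field names and scales.
vendor_a = pd.DataFrame({"isin": ["XS0000000001"], "esg": [7.2]})        # scored 0-10
vendor_b = pd.DataFrame({"Isin": ["XS0000000002"], "ESG_Rating": [64]})  # scored 0-100

# Master both feeds onto one schema and one 0-100 scale so records can be
# slotted into models and joined with structured reference data downstream.
normalised_a = vendor_a.rename(columns={"esg": "esg_score"})
normalised_a["esg_score"] = normalised_a["esg_score"] * 10
normalised_b = vendor_b.rename(columns={"Isin": "isin", "ESG_Rating": "esg_score"})

master = pd.concat([normalised_a, normalised_b], ignore_index=True)
print(master)
```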

The provisions HSBC has in place have helped the bank overcome many difficulties, said Johri. They include passing datasets through health checks before they are fully ingested into the lender’s systems, automating data pipelines to ingest data at regular intervals and establishing a system of alerts for when anomalies are detected in the process.

It’s impossible to build models on data that’s unreliable, she said, making such data hygiene essential.
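
The sketch below illustrates the general pattern of such pre-ingestion health checks – completeness, range and anomaly tests that raise alerts. It is not a description of HSBC’s actual system; the thresholds, baselines and field names are hypothetical.

```python
import pandas as pd

def health_check(batch: pd.DataFrame) -> list[str]:
    """Run pre-ingestion checks on a data batch and return alert messages."""
    alerts = []
    # Completeness: flag fields with excessive gaps before ingestion.
    for col in ("isin", "esg_score"):
        null_rate = batch[col].isna().mean()
        if null_rate > 0.05:  # hypothetical tolerance
            alerts.append(f"{col}: {null_rate:.0%} missing exceeds threshold")
    # Range check: scores outside the expected 0-100 scale.
    scores = batch["esg_score"].dropna()
    out_of_range = int(((scores < 0) | (scores > 100)).sum())
    if out_of_range:
        alerts.append(f"esg_score: {out_of_range} value(s) outside 0-100")
    # Anomaly check: batch mean drifting far from a historical baseline.
    baseline_mean, baseline_std = 55.0, 12.0  # would be derived from prior batches
    if abs(scores.mean() - baseline_mean) > 3 * baseline_std:
        alerts.append("esg_score: batch mean deviates >3 sigma from baseline")
    return alerts

batch = pd.DataFrame({"isin": ["XS0000000001", None], "esg_score": [62.0, 140.0]})
for alert in health_check(batch):
    print("ALERT:", alert)  # a production pipeline would route this to monitoring
```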
