

Data Quality and Management Crucial to Effective ESG Analysis


Analytics are helping financial institutions mine crucial insights from ESG data to guide their decision-making processes.

Analyses that can surface investment and hedging strategies, shape risk positioning and streamline regulatory compliance all work best when those datasets are properly managed. But ingesting information and making it usable remains a fundamental challenge, especially because ESG data arrives in firms’ systems in a variety of unstructured formats.

Overcoming the data quality and management issues will help ease any obstacles to analysing ESG data, experts told A-Team ESG Insight’s most recent webinar.

With the rapid increase in the volume of ESG data that companies use, financial firms are having to make some far-reaching data management decisions, especially when it comes to mastering and cleaning.

Sourcing data from third-party vendors offers solutions to many of these shortcomings. But vendor feeds present their own challenges in the form of incomplete datasets and the differing methodologies behind the scores and ratings they provide.

Data Shortage

To overcome these issues, data managers are putting in more work to create data that is rich and vibrant, and in optimal shape for use by the latest powerful analytical tools and techniques such as machine learning and natural language processing, Nirav Shah, head of ESG analytics at M&G, told the “Approaches to ESG data for analytics” webinar.

Among the benefits that enrichment brings is the ability for analysts to trace the linkages between data and financial instruments. This not only offers insight into specific investments but also helps in the creation of models that can estimate instrument- and sector-level values where data isn’t available, Shah said.
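As a rough illustration of the kind of gap-filling Shah described (not his firm’s actual approach), the following Python sketch uses a hypothetical holdings table with issuer, sector and esg_score fields and fills missing issuer-level scores from sector medians, flagging the estimates so they can be told apart from reported values.

```python
import pandas as pd

# Hypothetical holdings table: issuer-level ESG scores with gaps in coverage.
holdings = pd.DataFrame({
    "issuer": ["AlphaCo", "BetaCorp", "GammaPLC", "DeltaInc"],
    "sector": ["Utilities", "Utilities", "Energy", "Energy"],
    "esg_score": [62.0, None, 48.0, None],
})

# Estimate missing issuer scores from the sector-level median and flag them,
# so reported and estimated values can be distinguished downstream.
sector_median = holdings.groupby("sector")["esg_score"].transform("median")
holdings["is_estimated"] = holdings["esg_score"].isna()
holdings["esg_score"] = holdings["esg_score"].fillna(sector_median)

print(holdings)
```

Keeping the is_estimated flag alongside the filled values matters: it preserves the distinction between disclosed data and modelled proxies when the dataset feeds later analysis.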

Nevertheless, such modelling falls prey to a common issue that webinar participants agreed was probably the biggest stumbling block they face – a shortage of data.

Roshini Johri, head of analytics at HSBC, said that to create analytical models, the broadest possible datasets are needed. Because there is a dearth of information from some parts of the world and from within some industrial sectors, the data record is skewed towards geographies and industries with good disclosure practices.

Consequently, some of the insight-driving models used by financial institutions are inherently biased towards the markets with the best reporting practices.

Shah agreed, arguing that although the amount of data companies use is growing, more is needed to fully enrich their datasets. Analysts are still a long way from having enough information to form a coherent view of the investments their companies have made or intend to make, he said.
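For a sense of how that geographic skew can be surfaced before it reaches a model, here is a minimal Python sketch; the issuers, regions and has_reported_emissions flag are all invented for illustration.

```python
import pandas as pd

# Hypothetical ESG disclosure dataset keyed by issuer region.
disclosures = pd.DataFrame({
    "issuer": ["A", "B", "C", "D", "E", "F"],
    "region": ["Europe", "Europe", "Europe", "North America", "Asia", "Latin America"],
    "has_reported_emissions": [True, True, True, True, False, False],
})

# Share of issuers with reported data per region: a simple way to measure
# the coverage skew before it leaks into downstream models.
coverage = disclosures.groupby("region")["has_reported_emissions"].mean().sort_values()
print(coverage)
```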

Reliability Importance

Dataset comparability is another fundamental requirement for analytical operations that is being hobbled by data shortages, said Boyke Baboelal, strategic solutions director for the Americas at Alveo. To identify relative value in assets and companies, it’s important that analysts are using like-for-like data, but gaps in the data record and the widely differing scores and metrics offered by vendors and other sources make that difficult.
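One common way to make differing vendor scores more comparable, offered here as a hedged illustration rather than Alveo’s methodology, is to rank each provider’s scores within its own universe so issuers can be compared like for like; the vendors, scales and scores below are hypothetical.

```python
import pandas as pd

# Hypothetical scores from two vendors that use different scales and methodologies.
scores = pd.DataFrame({
    "issuer": ["AlphaCo", "BetaCorp", "GammaPLC", "AlphaCo", "BetaCorp", "GammaPLC"],
    "vendor": ["VendorX", "VendorX", "VendorX", "VendorY", "VendorY", "VendorY"],
    "raw_score": [72, 55, 90, 3.1, 2.2, 4.8],  # VendorX uses 0-100, VendorY uses 0-5
})

# Percentile-rank each vendor's scores within its own universe so that
# issuers can be compared like-for-like across providers.
scores["pct_rank"] = scores.groupby("vendor")["raw_score"].rank(pct=True)

# Pivot to inspect how the two vendors rank the same issuers.
print(scores.pivot(index="issuer", columns="vendor", values="pct_rank"))
```

Percentile ranking does not reconcile the vendors’ underlying methodologies, but it does put their outputs on a common scale so relative-value comparisons are at least internally consistent.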

Getting the data management piece right requires effective data mastering processes and the right technology, participants agreed. If the data is absorbed into the enterprise’s systems in the right shape, then it can be more easily slotted into analytical models and integrated with other, more structured, data.
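As a simple illustration of what “the right shape” can mean in practice, the sketch below joins a mastered ESG feed onto a structured security master on a shared identifier; the ISINs, issuers and scores are hypothetical.

```python
import pandas as pd

# Hypothetical security master (structured reference data) and a mastered ESG feed.
security_master = pd.DataFrame({
    "isin": ["GB0001", "US0002", "DE0003"],
    "issuer": ["AlphaCo", "BetaCorp", "GammaPLC"],
    "sector": ["Utilities", "Energy", "Industrials"],
})
esg_feed = pd.DataFrame({
    "isin": ["GB0001", "DE0003"],
    "esg_score": [62.0, 48.0],
})

# Once the ESG feed is mastered onto the same identifier (ISIN here),
# it slots straight into the structured dataset used by analytical models.
enriched = security_master.merge(esg_feed, on="isin", how="left")
print(enriched)
```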

The provisions HSBC has in place have helped the bank overcome many difficulties, said Johri. They include passing datasets through health checks before they are fully ingested into the lender’s systems, automating data pipelines to ingest data at regular intervals and establishing a system of alerts for when anomalies are detected in the process.

It’s impossible to build models on data that’s unreliable, she said, making such data hygiene essential.
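None of the panellists described their systems in code, but the kind of pre-ingestion health check and anomaly alerting Johri outlined might look something like the following minimal Python sketch; the thresholds, column names and checks are illustrative assumptions, not HSBC’s actual controls.

```python
import pandas as pd

def health_check(df: pd.DataFrame, score_col: str = "esg_score") -> list[str]:
    """Run basic pre-ingestion checks and return a list of alert messages."""
    alerts = []
    # Completeness: flag excessive missing values before the batch is ingested.
    missing_ratio = df[score_col].isna().mean()
    if missing_ratio > 0.10:
        alerts.append(f"{missing_ratio:.0%} of {score_col} values are missing")
    # Range check: scores outside the expected 0-100 scale are treated as anomalies.
    out_of_range = df[(df[score_col] < 0) | (df[score_col] > 100)]
    if not out_of_range.empty:
        alerts.append(f"{len(out_of_range)} rows have {score_col} outside 0-100")
    # Duplicate identifiers usually indicate a mastering problem upstream.
    if df["issuer_id"].duplicated().any():
        alerts.append("duplicate issuer_id values detected")
    return alerts

# Example batch arriving from a scheduled pipeline run.
batch = pd.DataFrame({"issuer_id": [1, 2, 2], "esg_score": [55.0, 130.0, None]})
for alert in health_check(batch):
    print("ALERT:", alert)
```

In a scheduled pipeline, a non-empty alert list would route the batch to quarantine and notify the data team rather than letting unreliable data reach the models.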

