The knowledge platform for the financial technology industry

A-Team Insight Blogs

Data Quality and Management Crucial to Effective ESG Analysis

Analytics are helping financial institutions mine ESG data for crucial insights that guide their decision-making processes.

Analyses that can surface investment and hedging strategies, shape risk positioning and streamline regulatory compliance all work best when the underlying datasets are properly managed. But ingesting information and making it usable remains a fundamental challenge, especially because ESG data arrives in firms’ systems in a variety of unstructured formats.
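
To make that ingestion challenge concrete, the sketch below shows one common first step: mapping heterogeneous vendor feeds onto a single canonical schema before analysis. The feeds, field names and units here are hypothetical, and pandas is assumed; this is a minimal illustration, not a description of any firm’s pipeline.

```python
import pandas as pd

# Hypothetical example: two vendor feeds carrying the same metric under
# different field names, units and identifier labels.
feed_a = pd.DataFrame({
    "isin": ["GB0001", "US0002"],
    "co2_tonnes": [120_000, 340_000],
})
feed_b = pd.DataFrame({
    "identifier": ["GB0001", "DE0003"],
    "scope1_emissions_kt": [95, 210],  # kilotonnes, not tonnes
})

# Map each feed onto one canonical schema, converting units on the way in.
canonical_a = feed_a.rename(columns={"co2_tonnes": "scope1_tonnes"})
canonical_b = feed_b.rename(columns={"identifier": "isin"})
canonical_b["scope1_tonnes"] = canonical_b.pop("scope1_emissions_kt") * 1_000

master = pd.concat([canonical_a, canonical_b], ignore_index=True)
print(master)
```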

Overcoming these data quality and management issues will ease the obstacles to analysing ESG data, experts told A-Team ESG Insight’s most recent webinar.

With the rapid increase in the volume of ESG data that companies use, financial firms are having to make some far-reaching data management decisions, especially when it comes to mastering and cleaning.

Sourcing data from third-party vendors addresses many of these shortcomings. But vendors present challenges of their own in the form of incomplete datasets and the differing methodologies behind the scores and ratings they provide.

Data Shortage

To overcome these issues, data managers are putting in more work to create data that is rich, vibrant and in optimal shape for use by the latest powerful analytical tools and software, such as machine learning and natural language processing, Nirav Shah, head of ESG analytics at M&G, told the “Approaches to ESG data for analytics” webinar.

Among the benefits that enrichment brings is the ability for analysts to trace the linkages between data and financial instruments. This not only offers insight into specific investments but also helps in the creation of models that can estimate instrument- and sector-level values where data isn’t available, Shah said.
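
As a rough illustration of that modelling step, the sketch below estimates a missing instrument-level score from its sector peers. The holdings, the scores and the use of a sector median are assumptions chosen for demonstration; real estimation models are considerably more sophisticated.

```python
import pandas as pd

# Hypothetical holdings with patchy ESG scores.
holdings = pd.DataFrame({
    "isin":      ["GB0001", "US0002", "DE0003", "FR0004"],
    "sector":    ["Utilities", "Utilities", "Tech", "Tech"],
    "esg_score": [62.0, None, 81.0, None],
})

# Estimate missing instrument-level scores from the sector median, and
# record which values are reported versus modelled.
holdings["is_estimated"] = holdings["esg_score"].isna()
holdings["esg_score"] = (
    holdings.groupby("sector")["esg_score"]
    .transform(lambda s: s.fillna(s.median()))
)
print(holdings)
```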

Nevertheless, this approach falls prey to a common issue that webinar participants agreed was probably the biggest stumbling block they face: a shortage of data.

Roshini Johri, head of analytics at HSBC, said that to create analytical models, the broadest possible datasets are needed. Because there is a dearth of information emanating from some parts of the world and from within some industrial sectors, the data record is skewed towards geographies and industries that have good disclosure practices.

Consequently, some of the insight-driving models used by financial institutions are inherently biased towards the markets with the best reporting practices.
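
One lightweight way to surface that bias, sketched below under assumed data, is to measure disclosure coverage by region before training on the universe. The regions, field names and threshold are all hypothetical.

```python
import pandas as pd

# Hypothetical universe: which issuers report ESG data, by region.
universe = pd.DataFrame({
    "region": ["Europe", "Europe", "N. America", "Asia", "Asia", "LatAm"],
    "esg_reported": [True, True, True, True, False, False],
})

# Share of issuers with disclosures in each region.
coverage = universe.groupby("region")["esg_reported"].mean()
print(coverage.sort_values())  # low-coverage regions stand out

# Flag regions below a chosen threshold so any model trained on this
# universe can be documented as skewed towards well-covered markets.
LOW_COVERAGE = 0.5
print("Under-represented:", list(coverage[coverage < LOW_COVERAGE].index))
```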

Shah agreed, arguing that despite the growing amount of data that companies are using, yet more is needed to fully enrich their datasets. Analysts are still a long way from having enough information to form a coherent view of the investments their companies have made or intend to make, he said.

The Importance of Reliability

Dataset comparability is another fundamental requirement for analytical operations that is being hobbled by data shortages, said Boyke Baboelal, strategic solutions director for the Americas at Alveo. To identify relative value in assets and companies, it’s important that analysts are using like-for-like data, but gaps in the data record and the widely differing scores and metrics offered by vendors and other sources make that difficult.
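
A minimal sketch of the like-for-like problem, assuming two hypothetical vendors scoring the same issuers on different scales: rescaling each vendor’s scores to z-scores makes the rankings comparable, though it cannot reconcile the methodologies behind the numbers.

```python
import pandas as pd

# Hypothetical: the same issuers scored by two vendors on different scales.
scores = pd.DataFrame({
    "isin":     ["GB0001", "US0002", "DE0003", "FR0004"],
    "vendor_a": [72, 55, 88, 61],      # 0-100 scale
    "vendor_b": [3.1, 2.2, 4.5, 2.9],  # 1-5 scale
})

# Standardise each vendor's scores so relative rankings line up;
# this aligns the scales but not the underlying methodologies.
for col in ["vendor_a", "vendor_b"]:
    scores[col + "_z"] = (scores[col] - scores[col].mean()) / scores[col].std()

print(scores[["isin", "vendor_a_z", "vendor_b_z"]])
```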

Getting the data management piece right requires effective data mastering processes and the right technology, participants agreed. If the data is absorbed into the enterprise’s systems in the right shape, then it can be more easily slotted into analytical models and integrated with other, more structured, data.

The provisions HSBC has in place have helped the bank overcome many difficulties, said Johri. They include passing datasets through health checks before they are fully ingested into the lender’s systems, automating data pipelines to ingest data at regular intervals and establishing a system of alerts for when anomalies are detected in the process.
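
The sketch below illustrates the first and last of those provisions under assumed rules: a batch is validated before ingestion and an alert is raised if it fails. The checks, field names and thresholds are hypothetical, not HSBC’s actual controls.

```python
import pandas as pd

# Hypothetical pre-ingestion health check: validate a batch before it
# enters downstream systems, and alert when anomalies are found.
def health_check(batch: pd.DataFrame) -> list[str]:
    issues = []
    if batch["isin"].isna().any():
        issues.append("missing identifiers")
    if (~batch["esg_score"].between(0, 100)).any():
        issues.append("scores outside the 0-100 range")
    if batch["isin"].duplicated().any():
        issues.append("duplicate identifiers")
    return issues

batch = pd.DataFrame({"isin": ["GB0001", None], "esg_score": [62.0, 140.0]})
problems = health_check(batch)
if problems:
    # In production this would trigger an alert; here we just print it.
    print("ALERT - batch rejected:", problems)
else:
    print("Batch passed health checks")
```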

It’s impossible to build models on data that’s unreliable, she said, making such data hygiene essential.
