
Data Quality and Management Crucial to Effective ESG Analysis


Analytics are helping financial institutions mine crucial insights from ESG data to guide their decision-making processes.

Analyses that can surface investment and hedging strategies, shape risk positioning and streamline regulatory compliance all work best when those datasets are properly managed. But ingesting information and making it usable remains a fundamental challenge, especially because ESG data arrives in firms’ systems in a variety of unstructured formats.

Overcoming the data quality and management issues will help ease any obstacles to analysing ESG data, experts told A-Team ESG Insight’s most recent webinar.

With the rapid increase in the volume of ESG data that companies use, financial firms are having to make some far-reaching data management decisions, especially when it comes to mastering and cleaning.

Sourcing data from third-party vendors offers solutions to many of these shortcomings. But vendors also present their own challenges, in the form of incomplete datasets and differing methodologies behind the scores and ratings they provide.

Data Shortage

To overcome these issues, data managers are putting in more work to create data that is rich and vibrant, and in optimal shape for use by the latest powerful analytical tools and software such as machine learning and natural language processing, Nirav Shah, head of ESG analytics at M&G, told the “Approaches to ESG data for analytics” webinar.

Among the benefits that enrichment brings is the ability for analysts to trace the linkages between data and financial instruments. This not only offers insight into specific investments but also helps in the creation of models that can estimate instrument- and sector-level values where data isn’t available, Shah said.
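To make that concrete, the sketch below shows one simple way such estimation could work: filling missing instrument-level ESG scores with sector-level medians, while flagging estimated values so their lineage stays traceable. The column names, sample data and median fallback are illustrative assumptions, not the approach M&G described.

```python
# A minimal sketch of sector-level estimation for missing ESG scores.
# Column names and the sector-median fallback are illustrative assumptions.
import pandas as pd

instruments = pd.DataFrame({
    "isin":      ["US001", "US002", "DE001", "DE002", "JP001"],
    "sector":    ["Energy", "Energy", "Utilities", "Utilities", "Energy"],
    "esg_score": [62.0, None, 71.5, None, 58.0],
})

# Estimate a sector-level value from the instruments that do report...
sector_median = instruments.groupby("sector")["esg_score"].transform("median")

# ...and fall back to it where instrument-level data is unavailable,
# flagging estimated rows so analysts can trace the data's provenance.
instruments["estimated"] = instruments["esg_score"].isna()
instruments["esg_score"] = instruments["esg_score"].fillna(sector_median)

print(instruments)
```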

Nevertheless, this approach falls prey to a common issue that webinar participants agreed was probably the biggest stumbling block they face – a shortage of data.

Roshini Johri, head of analytics at HSBC, said that the broadest possible datasets are needed to create analytical models. Because there is a dearth of information emanating from some parts of the world and from within some industrial sectors, the data record is skewed towards geographies and industries with good disclosure practices.

Consequently, some of the insight-driving models used by financial institutions are inherently biased towards the markets with the best reporting practices.
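A basic check for that kind of skew might look like the following sketch, which measures disclosure coverage per region before any model is fitted. The data and column names are hypothetical; this is not a description of HSBC’s tooling.

```python
# Hypothetical coverage check: measure ESG disclosure rates per region
# so a skewed training set can be spotted before modelling begins.
import pandas as pd

holdings = pd.DataFrame({
    "region":    ["Europe", "Europe", "N. America", "Asia", "Africa"],
    "esg_score": [70.1, 65.3, 72.8, None, None],
})

coverage = holdings.groupby("region")["esg_score"].agg(
    reported="count",  # non-null scores only
    total="size",      # all holdings in the region
)
coverage["coverage_pct"] = 100 * coverage["reported"] / coverage["total"]

# Regions with low coverage_pct are likely under-represented in any
# model trained on this data.
print(coverage)
```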

Shah agreed, arguing that despite the growing amount of data that companies are using, yet more is needed to fully enrich their datasets. Analysts are still a long way from having enough information to form a coherent view of the investments their companies have made or intend to make, he said.

Reliability Importance

Dataset comparability is another fundamental requirement for analytical operations that is being hobbled by data shortages, said Boyke Baboelal, strategic solutions director for the Americas at Alveo. To identify relative value in assets and companies, it’s important that analysts are using like-for-like data, but gaps in the data record and the widely differing scores and metrics offered by vendors and other sources make that difficult.
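As an illustration of the comparability problem, the sketch below standardises each vendor’s scores so instruments can be compared on a like-for-like basis despite different scales. The vendor names, scales and z-score normalisation are assumptions for demonstration only, not a method discussed on the webinar.

```python
# Illustrative sketch: put differently-scaled vendor ESG scores on a
# comparable footing by standardising each vendor's distribution.
import pandas as pd

scores = pd.DataFrame({
    "isin":   ["US001", "US002", "DE001", "US001", "US002", "DE001"],
    "vendor": ["V1", "V1", "V1", "V2", "V2", "V2"],
    "score":  [62.0, 48.0, 71.0, 3.2, 2.1, 4.5],  # V1 uses 0-100, V2 uses 0-5
})

# Z-score within each vendor so scores become like-for-like in spread,
# even though the underlying methodologies still differ.
grouped = scores.groupby("vendor")["score"]
scores["z"] = (scores["score"] - grouped.transform("mean")) / grouped.transform("std")

# One row per instrument, one comparable column per vendor.
print(scores.pivot(index="isin", columns="vendor", values="z"))
```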

Getting the data management piece right requires effective data mastering processes and the right technology, participants agreed. If the data is absorbed into the enterprise’s systems in the right shape, then it can be more easily slotted into analytical models and integrated with other, more structured, data.

The provisions HSBC has in place have helped the bank overcome many difficulties, said Johri. They include passing datasets through health checks before they are fully ingested into the lender’s systems, automating data pipelines to ingest data at regular intervals, and establishing a system of alerts for when anomalies are detected in the process.
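A bare-bones version of such pre-ingestion health checks might look like the sketch below. The thresholds, column names and alerting behaviour are illustrative assumptions rather than HSBC’s implementation.

```python
# Hypothetical pre-ingestion health check with anomaly alerts.
# Thresholds and column names are illustrative assumptions.
import pandas as pd

REQUIRED_COLUMNS = {"isin", "sector", "esg_score"}
SCORE_RANGE = (0.0, 100.0)
MAX_NULL_PCT = 20.0

def health_check(batch: pd.DataFrame) -> list[str]:
    """Return a list of alert messages; an empty list means the batch passes."""
    alerts = []
    missing = REQUIRED_COLUMNS - set(batch.columns)
    if missing:
        alerts.append(f"missing columns: {sorted(missing)}")
        return alerts  # further checks need these columns
    null_pct = 100 * batch["esg_score"].isna().mean()
    if null_pct > MAX_NULL_PCT:
        alerts.append(f"{null_pct:.1f}% of scores are null")
    out_of_range = ~batch["esg_score"].dropna().between(*SCORE_RANGE)
    if out_of_range.any():
        alerts.append(f"{int(out_of_range.sum())} scores outside {SCORE_RANGE}")
    if batch["isin"].duplicated().any():
        alerts.append("duplicate instrument identifiers in batch")
    return alerts

# A scheduled pipeline would run this on each incoming batch and raise
# alerts before the data reaches downstream models.
batch = pd.DataFrame({"isin": ["US001", "US001"],
                      "sector": ["Energy", "Energy"],
                      "esg_score": [62.0, 130.0]})
for alert in health_check(batch):
    print("ALERT:", alert)
```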

It’s impossible to build models on data that’s unreliable, she said, making such data hygiene essential.
