
Data Methodology Harmonisation May Be No Panacea, Expert Warns

Volker Lainer raised eyebrows when he told the recent A-Team ESG Data and Tech Summit London that he saw a role for multiple methodologies in calculating sustainability ratings and collating datasets.

The Head of Data Connections, ESG and Regulatory Affairs at enterprise data management firm GoldenSource even went as far as saying that a single guideline for data vendors would probably produce more greenwashing.

It’s a point of view that flies in the face of the general conversation around the challenges that financial institutions face in assessing ESG data. Critics of the ESG cause argue that the different ways in which data is presented to firms are a leading cause of confusion over what constitutes a green bond, for instance, or a sustainable investment strategy. And that’s an open invitation to overstate ESG performance, the argument continues.

In a conversation with ESG Insight after the May event, Lainer expanded on his line of thought, saying that the generally accepted view of harmonisation as a solution was flawed.

“What’s easier for somebody who intends to do greenwashing than have only one methodology to beat?” he asked. “Eliminating methodologies and standardising them is counterproductive.”

Greenwashing Threat

The basis of his argument is that different methodologies are necessary to accommodate the huge diversity of business activities and industries that need to be assessed. There can be no one-size-fits-all approach.

“You cannot compare, deliberately, apples with oranges,” Lainer said. “Let’s not just force everything into one methodology, because then we’ll be prone to greenwashing all over the place.”

Lainer explained his point of view during a conference panel that discussed how firms could establish strong foundations for ESG data strategies. Joining him on the panel were Niresh Rajah, Managing Director, Head of Data, RegTech and Digital Assurance Practice at Grant Thornton; Rathi Ravi, ESG Data Specialist at Vanguard; Lorraine Waters, Chief Data Officer at Solidatus; Julia van Huizen, Sustainability Practice Lead at Amsterdam Data Collective; and John Carroll, Head of Customer Success at Datactics.

The panellists discussed topics including how financial institutions could best prepare for their ESG data journeys as well as some best practices for sourcing ESG data and providing data lineage for better regulatory compliance.

Lainer said the debate was lively and thought-provoking. But he was surprised by panellists’ approach to data methodologies.

“There was the notion that we are still struggling with standardisation – that we’ll have to wait until this gets better or hope that it’s going to go away,” he said. “But it’s not going away.”

He offered two reasons for this. Firstly, by trying to force all vendors to prepare content in the same way, regulators and investors would miss out on a lot of information pertinent to specific companies and industries.

“You’d actually lose a lot of value that way,” he said.

And secondly, the frameworks already in existence are working well.

“There are many players out there, and many frameworks, and many standards – and they are there for a reason,” he said. “They have approaches to solve a particular area very well, that others don’t.”

Emissions Aggregation

Lainer also told ESG Insight how GoldenSource is building up its ESG Impact Plus product, adding new capabilities to help its financial institution clients get a better view of their portfolios’ sustainability records. The latest addition to the suite of tools is functionality that enables clients to aggregate the carbon emissions and calculate the carbon intensity of the assets within their portfolios.

This enables asset managers to model the carbon footprints of hypothetical portfolios and compare them with those of their actual portfolios.

“There has been very, very strong demand” for this functionality, Lainer explained. “It lets clients simulate what would happen if they moved funds to an investment alternative, and how that would affect their portfolios’ greenhouse gas footprint in its entirety.”
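
For readers curious about the arithmetic involved, a minimal sketch of this kind of portfolio-level aggregation follows. The issuer names, figures and metric definitions (financed emissions apportioned by ownership share, and weighted average carbon intensity per million dollars of revenue) are common industry conventions used here purely for illustration; they are assumptions, not GoldenSource’s actual methodology or API.

```python
# Illustrative only: hypothetical holdings and standard metric definitions,
# not GoldenSource's ESG Impact Plus implementation.
from dataclasses import dataclass

@dataclass
class Holding:
    name: str
    value: float             # market value of the position, USD
    emissions_tco2e: float   # issuer emissions (reported or proxied), tCO2e
    revenue: float           # issuer annual revenue, USD
    evic: float              # enterprise value including cash, USD

def financed_emissions(portfolio: list[Holding]) -> float:
    """Attribute each issuer's emissions by ownership share (value / EVIC)."""
    return sum(h.value / h.evic * h.emissions_tco2e for h in portfolio)

def weighted_avg_carbon_intensity(portfolio: list[Holding]) -> float:
    """Portfolio-weighted carbon intensity, in tCO2e per $1m of revenue."""
    total_value = sum(h.value for h in portfolio)
    return sum((h.value / total_value) * (h.emissions_tco2e / (h.revenue / 1e6))
               for h in portfolio)

# Simulate moving funds to an investment alternative, as described above:
actual = [
    Holding("UtilityCo", 2_000_000, 500_000, 4e9, 10e9),
    Holding("SoftwareCo", 3_000_000, 5_000, 2e9, 20e9),
]
hypothetical = [
    Holding("WindCo", 2_000_000, 20_000, 1e9, 5e9),   # replaces UtilityCo
    Holding("SoftwareCo", 3_000_000, 5_000, 2e9, 20e9),
]
for label, pf in (("actual", actual), ("hypothetical", hypothetical)):
    print(f"{label}: {financed_emissions(pf):.1f} tCO2e financed, "
          f"WACI {weighted_avg_carbon_intensity(pf):.1f} tCO2e/$1m revenue")
```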

Another upgrade offers a measure of the confidence level of the aggregated datasets, a metric that reflects how much of the data incorporated into the model has been derived from proxy calculations.

“Then you can start investigating the ones where you don’t have information,” he said. “If it’s because your data provider doesn’t have that information yet, are you still fine with it? Do you want to move them out and replace them with a vendor that does provide that information?”
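
A confidence measure of this kind might, in its simplest form, weight each position by the share of portfolio value whose emissions figures are reported rather than estimated. The sketch below is hypothetical; the field names and the value-weighted definition are assumptions, not details disclosed about the product.

```python
# Hypothetical coverage score: fraction of portfolio value whose emissions
# come from reported figures rather than proxy calculations.
def reported_data_coverage(holdings: list[dict]) -> float:
    total = sum(h["value"] for h in holdings)
    reported = sum(h["value"] for h in holdings if not h["is_proxy"])
    return reported / total

holdings = [
    {"name": "UtilityCo", "value": 2_000_000, "is_proxy": False},
    {"name": "SmallCapCo", "value": 1_000_000, "is_proxy": True},  # estimated
]
print(f"confidence: {reported_data_coverage(holdings):.0%} of value is reported")
# Positions flagged is_proxy=True are the ones to investigate or re-source.
```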
