
Data Methodology Harmonisation may be no Panacea, Expert Warns


Volker Lainer raised eyebrows when he told the recent A-Team ESG Data and Tech Summit London that he saw a role for multiple methodologies used to calculate sustainability ratings and collate datasets.

The Head of Data Connections, ESG and Regulatory Affairs at enterprise data management firm GoldenSource even went as far as saying that a single guideline for data vendors would probably produce more greenwashing.

It’s a point of view that flies in the face of the general conversation around the challenges that financial institutions face in assessing ESG data. Critics of the ESG cause argue that the different ways in which data is presented to firms are a leading cause of confusion over what constitutes a green bond, for instance, or a sustainable investment strategy. And that’s an open invitation to overstate ESG performance, the argument continues.

In a conversation with ESG Insight after the May event, Lainer expanded on his line of thought, saying that the generally accepted view of harmonisation as a solution was flawed.

“What’s easier for somebody who intends to do greenwashing than have only one methodology to beat?” he asked. “Eliminating methodologies and standardising them is counterproductive.”

Greenwashing Threat

The basis of his argument is that different methodologies are necessary to accommodate the huge diversity of business activities and industries that need to be assessed. There can be no one-size-fits-all approach.

“You cannot compare, deliberately, apples with oranges,” Lainer said. “Let’s not just force everything into one methodology, because then we’ll be prone to greenwashing all over the place.”

Lainer explained his point of view during a conference panel, which discussed how firms could establish strong foundations for ESG data strategies. Joining him on the panel were Niresh Rajah, Managing Director, Head of Data, RegTech and Digital Assurance Practice at Grant Thornton; Rathi Ravi, ESG Data Specialist at Vanguard; Lorraine Waters, Chief Data Officer at Solidatus; Julia van Huizen, Sustainability Practice Lead at Amsterdam Data Collective; and John Carroll, Head of Customer Success at Datactics.

The panellists discussed topics including how financial institutions could best prepare for their ESG data journeys as well as some best practices for sourcing ESG data and providing data lineage for better regulatory compliance.

Lainer said the debate was lively and thought-provoking. But he was surprised by panellists’ approach to data methodologies.

“There was the notion that we are still struggling with standardisation – that we’ll have to wait until this gets better or hope that it’s going to go away,” he said. “But it’s not going away.”

He offered two reasons for this. Firstly, by trying to force all vendors to prepare content in the same way, regulators and investors would miss out on a lot of information pertinent to specific companies and industries.

“You’d actually lose a lot of value that way,” he said.

And secondly, the frameworks that are already in existence are working well.

“There are many players out there, and many frameworks, and many standards – and they are there for a reason,” he said. “They have approaches to solve a particular area very well, that others don’t.”

Emissions Aggregation

Lainer also told ESG Insight how GoldenSource is building up its ESG Impact Plus product, adding new capabilities to help its financial institution clients get a better view of their portfolios’ sustainability records. The latest addition to the suite of tools is functionality that enables clients to aggregate the carbon emissions and calculate the carbon intensity of the assets within their portfolios.

This enables asset managers to model the carbon footprints of hypothetical portfolios and compare them with those of their actual portfolios.

“There has been very, very strong demand” for this functionality, Lainer explained. “It lets clients simulate what would happen if they moved funds to an investment alternative, and how that would affect their portfolios’ greenhouse gas footprint in its entirety.”
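
In broad terms, this kind of aggregation can be pictured as a weighted-average carbon intensity (WACI) calculation across a portfolio’s holdings. The sketch below, in Python, is purely illustrative: the holdings, figures and choice of formula are assumptions made for this article, not the methodology behind ESG Impact Plus.

```python
# Illustrative sketch only: a simplified weighted-average carbon intensity
# (WACI) comparison between an actual and a hypothetical portfolio.
# Holdings, figures and the formula choice are assumptions, not
# GoldenSource's ESG Impact Plus methodology.
from dataclasses import dataclass

@dataclass
class Holding:
    name: str
    market_value: float       # position value in the portfolio currency
    emissions_tco2e: float    # issuer emissions (reported or estimated), tCO2e
    revenue_millions: float   # issuer revenue, millions of the same currency

def weighted_average_carbon_intensity(holdings: list[Holding]) -> float:
    """Sum over holdings of portfolio weight * (issuer emissions / issuer revenue)."""
    total_value = sum(h.market_value for h in holdings)
    return sum(
        (h.market_value / total_value) * (h.emissions_tco2e / h.revenue_millions)
        for h in holdings
    )

actual = [
    Holding("Utility A", 4_000_000, 120_000, 900),
    Holding("Tech B", 6_000_000, 5_000, 1_200),
]
# Hypothetical switch: move the utility allocation into a lower-carbon issuer.
hypothetical = [
    Holding("Renewables C", 4_000_000, 15_000, 600),
    Holding("Tech B", 6_000_000, 5_000, 1_200),
]

print(f"Actual WACI:       {weighted_average_carbon_intensity(actual):.1f} tCO2e/$m revenue")
print(f"Hypothetical WACI: {weighted_average_carbon_intensity(hypothetical):.1f} tCO2e/$m revenue")
```

Swapping a high-intensity issuer for a lower-carbon one, as in the hypothetical book above, is the kind of what-if comparison Lainer describes.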

Another upgrade offers a measure of the confidence level of the aggregated datasets, a metric that reflects how much of the data incorporated into the model has been derived from proxy calculations.

“Then you can start investigating the ones where you don’t have information,” he said. “If it’s because your data provider doesn’t have that information yet, are you still fine with it? Do you want to move them out and replace them with a vendor that does provide that information?”
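
Conceptually, such a confidence measure can be read as a coverage ratio over the underlying data. The short sketch below illustrates the idea with invented issuer names and a hypothetical “source” field; it is not GoldenSource’s actual metric or schema.

```python
# Illustrative sketch only: a simple confidence score for an aggregated ESG
# dataset, showing what share of issuer emissions figures is reported rather
# than derived from proxy calculations. Field names are hypothetical.

def data_confidence(records: list[dict]) -> float:
    """Fraction of holdings whose emissions figure comes from reported data."""
    if not records:
        return 0.0
    reported = sum(1 for r in records if r["source"] == "reported")
    return reported / len(records)

portfolio_data = [
    {"issuer": "Utility A", "source": "reported"},
    {"issuer": "Tech B", "source": "reported"},
    {"issuer": "Small Cap C", "source": "proxy"},  # estimated from sector averages
]

print(f"Confidence: {data_confidence(portfolio_data):.0%} of figures reported directly")

# Proxy-based holdings are the ones to investigate, as Lainer describes:
# ask the vendor when reported data will arrive, or consider an alternative.
proxied = [r["issuer"] for r in portfolio_data if r["source"] == "proxy"]
print("Proxy-derived issuers:", proxied)
```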

