
Model Behaviour: Testing the Tools that Underpin ESG Decisions


The quality of data and technology used in ESG systems is of growing importance to investors and regulators as they seek to do more with the information available, to comply with regulations and to arm themselves against accusations of greenwashing.

Last week we discussed the emerging body of work that’s bringing accountancy-level assurance to ESG data, providing banks and other institutions with independent validation of the veracity, applicability and quality of the information they use and report. This week we turn to the interrogation of models into which that data is fed and on which institutions make their decisions.

India-headquartered ratings and analytics firm CRISIL has expanded its operations worldwide to include a variety of other critical financial services. One of the most prominent is model validation: testing whether the financial models banks use in their various activities are sound.

One of the fastest-growing parts of CRISIL’s validation business focuses on climate risk modelling – the creation of tools to estimate the likely impact of phenomena associated with a rapidly warming world, such as wildfires and floods, on the assets and activities of loan-seeking borrowers. Such models are particularly important when regulators and central banks require lenders to stress test the resilience of their portfolios to climate risks, but they are also central to commercial and retail lending functions.

Banks had commonly used third-party models and black-box testing services. But as the demand for, and sophistication of, modelling techniques has increased, banks have begun building their own tools. And those need to be validated.

“The outputs of models are used for internal forecasting and decision making, so it’s important that there is reliability in the numbers,” Anshuman Prasad, risk and analytics leader at CRISIL, told ESG Insight. “There are so many challenges associated with modelling that it makes sense to have them validated.”

Data Challenges

While some of those challenges are common within the ESG space – linked to the quality and availability of data – others are peculiar to the modelling profession.

On the data front, banks rely mostly on information provided by third parties because the expertise needed to process such large volumes of data is usually beyond their skillsets. However, even for specialist vendors, the shortage of suitable data means they supplement it with estimates and proxies derived from their own models.

CRISIL must validate not only the raw data but also that derived data.

“We need to check what methodologies have been followed for getting this data and the data sourcing,” said Prasad. “Where are they getting this data from and is the quality of the data consistent? These are important for models.”
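To make those sourcing and consistency questions concrete, the sketch below shows the kind of check a validator might run on vendor-supplied emissions data. It is a toy illustration in Python under assumed column names (issuer_id, year, reported_emissions, proxy_emissions) and invented thresholds, not a description of CRISIL’s methodology.

```python
import pandas as pd

# Toy sketch: flag vendor records where proxy-modelled emissions diverge
# from reported figures, or where values jump implausibly year on year.
# Column names and thresholds are illustrative assumptions only.
def flag_inconsistent_emissions(df: pd.DataFrame,
                                max_relative_gap: float = 0.25,
                                max_yoy_jump: float = 0.50) -> pd.DataFrame:
    out = df.sort_values(["issuer_id", "year"]).copy()

    # Relative gap between reported and proxy-estimated values.
    out["relative_gap"] = (
        (out["proxy_emissions"] - out["reported_emissions"]).abs()
        / out["reported_emissions"]
    )
    out["gap_flag"] = out["relative_gap"] > max_relative_gap

    # Year-on-year consistency within each issuer's history.
    out["yoy_change"] = (
        out.groupby("issuer_id")["proxy_emissions"].pct_change().abs()
    )
    out["yoy_flag"] = out["yoy_change"] > max_yoy_jump

    return out[out["gap_flag"] | out["yoy_flag"]]
```

Every flagged record would then be traced back to the vendor’s sourcing and estimation methodology, which is the line of questioning Prasad describes.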

Validation of the models themselves requires equally forensic examination. Most are based on academic literature, which must be analysed to ensure the “soundness of the theories” has been translated properly and built accurately into the model.

Nitty Gritty

Validation, which can take anywhere from six weeks to four months, also considers “the nitty gritty” that goes into the calculations on which the models are based and, again, examines whether they are consistent with the academic papers. Sensitivity analysis, or stress testing, probes the impact of the assumptions behind the models. For instance, a small change to an assumption can have a substantial impact on a simulation that’s built out to a long time horizon.
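To illustrate that amplification with made-up numbers (a generic sketch, not one of the models under review), the snippet below bumps an assumed annual growth rate in expected losses by a single percentage point and compares the 30-year totals:

```python
# Illustrative only: how a small change in one assumption compounds over a
# long simulation horizon. All figures are invented for this sketch.
def cumulative_expected_loss(annual_loss: float,
                             growth_rate: float,
                             years: int = 30) -> float:
    """Sum expected annual losses that grow at a constant annual rate."""
    return sum(annual_loss * (1 + growth_rate) ** t for t in range(years))

base = cumulative_expected_loss(annual_loss=1.0, growth_rate=0.03)
bumped = cumulative_expected_loss(annual_loss=1.0, growth_rate=0.04)

print(f"30-year loss at 3% growth: {base:.1f}")
print(f"30-year loss at 4% growth: {bumped:.1f}")
print(f"Relative impact of a 1pp assumption change: {bumped / base - 1:.1%}")
```

Under these invented figures, a one-percentage-point change in the growth assumption moves the 30-year total by roughly 18 per cent, exactly the sort of amplification sensitivity analysis is designed to surface.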

The type of scenario a model seeks to replicate also requires different mathematical approaches, each of which must be tested. Physical risk models, which assess threats to assets from the phenomena of climate change, rely on different data and are applied differently from those that calculate the risk from carbon transition processes.

“Transition risk models are also very complicated because they measure the impacts of regulations, the responses to regulations and the impacts of change to renewable energy sources,” said Prasad. “The assumptions that have to be made and the narratives that are created mean there’s a lot more reliance on expert judgement.”

Getting Started

Modelling has become essential not only from a business perspective but also from a regulatory one. Overseers are requiring the validation of data and models as part of their ESG due diligence rules, forcing financial institutions to open their tools to third-party scrutiny and approval.

The threat of regulatory censure is encouraging more institutions to take up the validation services of companies like CRISIL. Prasad said that more companies are building more models, and that the workload absorbed by his highly trained data scientists, risk modellers and mathematicians has snowballed in recent years. That growth has been hastened by the rapidity with which regulations have been created or updated.

Prasad said that other financial models have become relatively easy to validate because, over time, a level of standardisation has emerged that lends itself to the automation of testing. This isn’t the case for climate risk modelling, which still presents great variability in the simulations produced, requiring more hands-on expert testing.

“People are just right now starting to validate these models, so it’s going to take time for standards to evolve,” he said.
