About A-Team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

CESR Elaborates on New Counterparty Risk Requirements for UCITS, Data Quality Back in the Spotlight


The Committee of European Securities Regulators (CESR) has added yet another paper detailing new data requirements for UCITS as part of its ongoing mandate to provide more clarity around the requirements of the directive, this time focused on counterparty risk measurement. The “Guidelines on risk measurement and the calculation of global exposure and counterparty risk for UCITS” (to give the paper its full name) incorporate principles defining how this data should be handled, including a set of quantitative and qualitative requirements around the calculation of relative and absolute value at risk (VaR).

The counterparty risk paper follows on from last month’s guidelines for the selection and presentation of data for UCITS’ new Key Investor Information (KII) documents. CESR is seeking to iron out inconsistencies across European member states under UCITS by providing more prescriptive data and methodological requirements. This new paper is therefore an attempt to provide a sufficient level of clarity on counterparty risk measurement for UCITS in an increasingly cross-border environment.

The calculation of counterparty risk for UCITS must be conducted on “at least a daily basis” and even, in some cases, on an intraday basis. This, in turn, places a much greater strain on the counterparty data management systems that feed the risk calculation engines. In many institutions, this data is not held centrally and must be accessed from across a firm’s siloed legacy data infrastructure in a much timelier manner.
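As a rough illustration of that aggregation step, the sketch below merges per-counterparty exposures from two hypothetical source systems into a single view ahead of the daily calculation. The system names and figures are invented for the example and are not drawn from the CESR paper.

```python
# Hypothetical sketch: merging counterparty exposures pulled from
# several siloed systems into one view before the daily risk run.
# System names and exposure figures are illustrative assumptions.
from collections import defaultdict

def aggregate_exposures(silos: list[dict[str, float]]) -> dict[str, float]:
    """Sum exposure per counterparty across all source systems."""
    totals: dict[str, float] = defaultdict(float)
    for silo in silos:
        for counterparty, exposure in silo.items():
            totals[counterparty] += exposure
    return dict(totals)

# Two invented source systems holding exposures to the same names.
otc_system = {"Bank A": 1_200_000.0, "Bank B": 350_000.0}
repo_system = {"Bank A": 400_000.0, "Bank C": 90_000.0}
combined = aggregate_exposures([otc_system, repo_system])
```

In practice each silo would be a feed or database extract rather than an in-memory dict, but the consolidation logic is the same.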

Rather than just looking at one aspect of risk data in isolation, CESR is also asking firms to assess the UCITS’ exposure to “all material risks including market risks, liquidity risks, counterparty risks and operational risks”. This is a much more holistic approach to the space, but one that asks a lot of risk data architectures, which are almost always siloed by risk type.

The risk calculations themselves are much more prescriptive, and the regulator has provided a new level of detail about the assets that may be used as collateral and cover rules for transactions in financial derivative instruments, for example. This means that new data checks must be put in place and the data workflow must be altered to take these requirements into account. Moreover, CESR is looking for more transparency into these calculations, so the accompanying data must be much more granular and detailed, and precise enough to stand up to new levels of scrutiny.
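By way of illustration only, a data check of this kind might screen asset records against eligibility criteria before they feed the exposure calculation. The criteria below are invented placeholders, not the collateral rules actually set out in the guidelines.

```python
# Hypothetical sketch: screening asset records for collateral
# eligibility before they enter the exposure calculation. The
# criteria here are illustrative assumptions, not CESR's rules.

def is_eligible_collateral(asset: dict) -> bool:
    """Apply simple, invented eligibility criteria to an asset record."""
    return (
        asset.get("liquid", False)
        and asset.get("valued_daily", False)
        and asset.get("issuer_rating", "") in {"AAA", "AA", "A"}
    )

bond = {"liquid": True, "valued_daily": True, "issuer_rating": "AA"}
equity = {"liquid": True, "valued_daily": False, "issuer_rating": "A"}
eligible = [a for a in (bond, equity) if is_eligible_collateral(a)]
```

A real implementation would source the criteria from the final rules and log rejected assets for the audit trail.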

For example, for VaR calculations, a great deal of emphasis is being placed on the underlying data quality. The paper states: “The quantitative models used within the VaR framework (pricing tools, estimation of volatilities and correlations, etc) should provide for a high level of accuracy. All data used within the VaR framework should provide for consistency, timeliness and reliability.”
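To make that data dependency concrete, here is a minimal sketch of a one-day VaR calculated by historical simulation, one common approach; the quantile logic is only as good as the consistency and reliability of the return series fed into it. The return data is simulated and the confidence level is an illustrative assumption.

```python
# Minimal sketch: one-day VaR by historical simulation. The return
# series and confidence level are illustrative assumptions, not
# methodology mandated by the CESR guidelines.
import math
import random

def historical_var(returns: list[float], confidence: float = 0.99) -> float:
    """One-day VaR (as a positive loss fraction) at the given
    confidence level, taken from the empirical return distribution."""
    if not returns:
        raise ValueError("need at least one observation")
    ordered = sorted(returns)  # worst returns first
    # Index of the (1 - confidence) quantile among the sorted returns.
    idx = max(0, math.ceil((1 - confidence) * len(ordered)) - 1)
    return -ordered[idx]

# Example: 500 days of simulated daily returns.
random.seed(42)
rets = [random.gauss(0.0, 0.01) for _ in range(500)]
var_99 = historical_var(rets, confidence=0.99)
```

Stale or inconsistent observations in `returns` shift the quantile directly, which is why the guidelines lean so heavily on data quality.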

In line with other recent risk measurement requirements coming down from the European level, CESR recommends that firms pursue an active back testing and stress testing regime for these counterparty risk assessments. This places more emphasis on the storage and accessibility of historical data in order to support these new processes, which CESR recommends should be carried out on a monthly basis in the case of stress testing.
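A back test of the kind described can be as simple as counting “exceptions”: days on which the realised loss exceeded the VaR forecast for that day. The sketch below uses invented figures purely to show the mechanic; it is not a procedure taken from the CESR paper.

```python
# Hypothetical sketch: a simple VaR back test counting the days on
# which the realised loss exceeded that day's VaR forecast. The
# return and VaR series are illustrative assumptions.

def count_exceptions(daily_returns: list[float], daily_var: list[float]) -> int:
    """Count back-test exceptions: days where the loss (-return)
    exceeded the VaR forecast for that day."""
    if len(daily_returns) != len(daily_var):
        raise ValueError("series must be the same length")
    return sum(1 for r, v in zip(daily_returns, daily_var) if -r > v)

# Invented example: losses on days 1, 3 and 5 breach a flat 2% VaR.
rets = [-0.025, 0.004, -0.031, 0.010, -0.022]
vars_ = [0.020, 0.020, 0.020, 0.020, 0.020]
exceptions = count_exceptions(rets, vars_)
```

Running this over long histories is what drives the storage and retrieval requirement the article describes.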

CESR also details the documentation requirements that should be incorporated into the new system in order to provide a full audit trail with regards to these risk measurement procedures. This includes holding detailed records about: the risks covered by the model; the model’s methodology; the mathematical assumptions and foundations; the data used; the accuracy and completeness of the risk assessment; the methods used to validate the model; the back testing process; the stress testing process; the validity range of the model; and the operational implementation. A whole new set of data items to be stored away by firms for easy access at the regulator’s request.
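As an illustration of what such a record might look like in practice, the sketch below maps CESR’s list of documentation items onto a simple record type. The field names and sample values are our own assumptions: the guidelines prescribe what must be recorded, not a schema.

```python
# Hypothetical sketch: a record type covering the documentation items
# CESR lists for the risk model audit trail. Field names and sample
# values are our own assumptions, not a schema from the guidelines.
from dataclasses import dataclass, asdict, field

@dataclass
class RiskModelRecord:
    risks_covered: list[str] = field(default_factory=list)
    methodology: str = ""
    mathematical_assumptions: str = ""
    data_sources: list[str] = field(default_factory=list)
    accuracy_assessment: str = ""
    validation_methods: list[str] = field(default_factory=list)
    back_testing_process: str = ""
    stress_testing_process: str = ""
    validity_range: str = ""
    operational_implementation: str = ""

# Invented example values for one model.
record = RiskModelRecord(
    risks_covered=["market", "counterparty"],
    methodology="historical simulation VaR",
    mathematical_assumptions="i.i.d. daily returns",
    data_sources=["internal price history"],
    accuracy_assessment="quarterly review",
    validation_methods=["back testing"],
    back_testing_process="daily exception count",
    stress_testing_process="monthly scenario runs",
    validity_range="liquid listed instruments",
    operational_implementation="nightly batch",
)
```

Serialising such records (e.g. via `asdict`) gives the easy regulator-facing access the article points to.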

These level three guidelines, which are to accompany CESR’s level two implementing measures for UCITS, are therefore much more prescriptive in a data context than ever before. The regulator is asking firms to be able to produce historical data on demand and to back up their calculations of counterparty risk for these instruments with much more granular data than previously required. However, it has thus far failed to assess the cost of these new requirements for financial institutions, which, given the scale of the challenge, may be considerable for those who have not recently invested in updating their counterparty data management systems.

