CESR Elaborates on New Counterparty Risk Requirements for UCITS, Data Quality Back in the Spotlight

The Committee of European Securities Regulators (CESR) has added yet another paper detailing new data requirements for UCITS as part of its ongoing mandate to provide more clarity around the requirements of the directive, this time focused on counterparty risk measurement. The “guidelines on risk measurement and the calculation of global exposure and counterparty risk for UCITS” (to give the paper its full name) sets out principles for how this data should be handled, including a set of quantitative and qualitative requirements around the calculation of relative and absolute value at risk (VaR).
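
To make the distinction between the two measures concrete, here is a minimal sketch using a simplified one-day historical-simulation VaR on synthetic returns. The helper function and figures are illustrative, not CESR’s prescribed methodology; the 20% of net asset value (NAV) cap on absolute VaR and the 2x reference portfolio cap on relative VaR reflect the limits set out in the guidelines.

```python
import numpy as np

def historical_var(returns, confidence=0.99):
    """One-day historical-simulation VaR, as a positive fraction of NAV."""
    # VaR is the loss at the chosen percentile of the return distribution.
    return -np.percentile(returns, (1 - confidence) * 100)

# Synthetic daily return series for a hypothetical UCITS fund and its
# unleveraged reference portfolio (roughly one year of observations).
rng = np.random.default_rng(42)
fund_returns = rng.normal(0.0002, 0.012, 250)
reference_returns = rng.normal(0.0002, 0.008, 250)

# Absolute VaR is expressed against the fund's own NAV; relative VaR
# compares the fund's VaR with that of the reference portfolio.
absolute_var = historical_var(fund_returns)
relative_var = historical_var(fund_returns) / historical_var(reference_returns)

print(f"Absolute VaR: {absolute_var:.2%} of NAV (guideline limit: 20%)")
print(f"Relative VaR: {relative_var:.2f}x reference (guideline limit: 2x)")
```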

The counterparty risk paper follows on from last month’s guidelines for the selection and presentation of data for UCITS’ new Key Investor Information (KII) documents. CESR is seeking to iron out inconsistencies in how UCITS is applied across European member states by providing more prescriptive data and methodological requirements. This new paper is therefore an attempt to provide a sufficient level of clarity on counterparty risk measurement for UCITS in an increasingly cross-border environment.

The calculation of counterparty risk for UCITS must be conducted on “at least a daily basis” and even, in some cases, on an intraday basis. This, in turn, places a much greater strain on the counterparty data management systems that feed the risk calculation engines performing this function. In many institutions, this data is not held centrally and must be pulled from across a firm’s legacy siloed data infrastructure in a much timelier manner than before.
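
As a rough illustration of that aggregation problem, the sketch below nets exposures to the same counterparty pulled from two hypothetical source systems; the system names and figures are invented for the example.

```python
from collections import defaultdict

# Hypothetical exposure feeds from separate front-office silos; in practice
# these might be OTC derivatives, repo and securities lending systems.
otc_system = [("Bank A", 1_500_000), ("Bank B", 700_000)]
repo_system = [("Bank A", 400_000), ("Bank C", 900_000)]

def aggregate_exposures(*sources):
    """Total exposure per counterparty, aggregated across source systems."""
    totals = defaultdict(float)
    for source in sources:
        for counterparty, exposure in source:
            totals[counterparty] += exposure
    return dict(totals)

# To meet an at-least-daily (or intraday) calculation cycle, this kind of
# consolidation has to run on the same schedule as the risk engine itself.
print(aggregate_exposures(otc_system, repo_system))
```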

Rather than just looking at one aspect of risk data in isolation, CESR is also asking firms to assess the UCITS’ exposure to “all material risks including market risks, liquidity risks, counterparty risks and operational risks”. This is a much more holistic approach to the space, but one that asks a lot of risk assessment data architectures, which are almost universally siloed by risk type.

The requirements around the risk calculations themselves are also much more prescriptive: the regulator has provided a new level of detail about the assets that may be used as collateral and the cover rules for transactions in financial derivative instruments, for example. This means new data checks must be put in place and data workflows must be altered to take these requirements into account. Moreover, CESR is looking for more transparency into these calculations, so the accompanying data must be sufficiently granular, detailed and precise to stand up to new levels of scrutiny.
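
By way of illustration, here is a minimal sketch of the kind of eligibility gate that might sit in such a data workflow. The criteria and field names are hypothetical simplifications, not the guidelines’ actual collateral rules.

```python
# Hypothetical eligibility criteria loosely modelled on the kinds of checks
# the guidelines call for (liquidity, daily valuation, issuer credit quality);
# real rules would be encoded from the final text of the guidelines.
def collateral_is_eligible(asset):
    return (
        asset.get("liquid", False)
        and asset.get("daily_priced", False)
        and asset.get("issuer_rating", "") in {"AAA", "AA", "A"}
    )

# Invented collateral records for the example.
posted_collateral = [
    {"id": "GB-GILT-2025", "liquid": True, "daily_priced": True, "issuer_rating": "AA"},
    {"id": "XYZ-CORP-BOND", "liquid": False, "daily_priced": True, "issuer_rating": "BBB"},
]

ineligible = [a["id"] for a in posted_collateral if not collateral_is_eligible(a)]
print("Fails eligibility checks:", ineligible)
```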

For example, for VaR calculations, a great deal of emphasis is being placed on the underlying data quality. The paper states: “The quantitative models used within the VaR framework (pricing tools, estimation of volatilities and correlations, etc) should provide for a high level of accuracy. All data used within the VaR framework should provide for consistency, timeliness and reliability.”
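
A simple sketch of what such quality gates might look like in practice is shown below, flagging stale points and gaps in a price series before it reaches the VaR engine. The thresholds, field names and data are invented for the example.

```python
from datetime import date

# Hypothetical quality checks on a price series feeding the VaR framework;
# the staleness and gap thresholds are illustrative, not from the guidelines.
def check_series(prices, as_of, max_staleness_days=1, max_gap_days=5):
    """Flag stale or gappy data before it reaches the VaR calculation."""
    issues = []
    dates = sorted(prices)
    if (as_of - dates[-1]).days > max_staleness_days:
        issues.append(f"stale: last point {dates[-1]}")
    for prev, curr in zip(dates, dates[1:]):
        if (curr - prev).days > max_gap_days:
            issues.append(f"gap: {prev} -> {curr}")
    return issues

series = {date(2010, 7, 26): 101.2, date(2010, 7, 27): 101.5, date(2010, 8, 9): 99.8}
print(check_series(series, as_of=date(2010, 8, 12)))
```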

In much the same vein as other recent risk measurement requirements coming down from the European level, CESR recommends that firms pursue an active back testing and stress testing regime for these counterparty risk assessments. This places more emphasis on the storage and accessibility of historical data to support these new processes, which CESR recommends should be carried out on a monthly basis in the case of stress testing.
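
The back testing element is, at its core, a counting exercise: compare each day’s realised profit and loss with the VaR the model predicted, and tally the breaches. Below is a minimal sketch on synthetic data, assuming a one-day 99% VaR; the figures are illustrative.

```python
import numpy as np

def count_var_breaches(daily_pnl, daily_var):
    """Number of days the realised loss exceeded the model's one-day VaR."""
    return int(np.sum(-np.asarray(daily_pnl) > np.asarray(daily_var)))

# Hypothetical history: 250 days of P&L (as fractions of NAV) and the
# one-day 99% VaR the model predicted for each of those days.
rng = np.random.default_rng(1)
pnl = rng.normal(0.0, 0.01, 250)
predicted_var = np.full(250, 0.023)

breaches = count_var_breaches(pnl, predicted_var)
# At 99% confidence, roughly two to three breaches would be expected over
# 250 days; materially more suggests the model understates risk.
print(f"{breaches} breaches in 250 days")
```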

CESR also details the documentation requirements that should be incorporated into the new system in order to provide a full audit trail with regard to these risk measurement procedures. This includes holding detailed records about: the risks covered by the model; the model’s methodology; the mathematical assumptions and foundations; the data used; the accuracy and completeness of the risk assessment; the methods used to validate the model; the back testing process; the stress testing process; the validity range of the model; and the operational implementation. That is a whole new set of data items to be stored away by firms for easy access at the regulator’s request.
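
For illustration, those documentation items map naturally onto a structured record; the class and field names below are hypothetical, not taken from the guidelines.

```python
from dataclasses import dataclass

# Hypothetical audit-trail record mirroring the documentation items CESR
# lists; the field names are illustrative, not prescribed by the paper.
@dataclass
class RiskModelAuditRecord:
    risks_covered: list[str]        # e.g. market, liquidity, counterparty
    methodology: str                # description of the model's approach
    mathematical_assumptions: str   # assumptions and theoretical foundations
    data_sources: list[str]         # datasets feeding the model
    accuracy_assessment: str        # accuracy and completeness of results
    validation_methods: list[str]   # how the model was validated
    backtesting_process: str
    stress_testing_process: str
    validity_range: str             # conditions under which the model holds
    operational_implementation: str
```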

These level three guidelines, which are to accompany CESR’s level two implementing measures for UCITS, are therefore much more prescriptive in a data context than ever before. The regulator is asking firms to be able to produce historical data on demand and to back up their calculations of counterparty risk for these instruments with much more granular data than previously required. However, it has thus far failed to assess the cost of these new requirements for financial institutions, which, given the scale of the challenge, may be considerable for those that have not recently invested in updating their counterparty data management systems.
