
CESR Elaborates on New Counterparty Risk Requirements for UCITS, Data Quality Back in the Spotlight


The Committee of European Securities Regulators (CESR) has added yet another paper detailing new data requirements for UCITS as part of its ongoing mandate to provide more clarity around the requirements of the directive, this time focused on counterparty risk measurement. The “guidelines on risk measurement and the calculation of global exposure and counterparty risk for UCITS” (to give the paper its full name) set out principles defining how this data should be handled, including a set of quantitative and qualitative requirements around the calculation of relative and absolute value at risk (VaR).
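
To make the distinction between the two measures concrete, below is a minimal sketch of absolute and relative VaR under historical simulation. The 99% confidence level, the scenario counts and the figures are illustrative assumptions, not parameters taken from the guidelines.

```python
# Sketch: absolute vs relative VaR from simulated P&L scenarios.
import numpy as np

def historical_var(pnl_scenarios: np.ndarray, confidence: float = 0.99) -> float:
    """VaR as the loss at the given confidence level (reported as a positive number)."""
    return -np.percentile(pnl_scenarios, 100 * (1 - confidence))

def absolute_var(pnl_scenarios: np.ndarray, nav: float, confidence: float = 0.99) -> float:
    """Absolute VaR expressed as a fraction of the fund's NAV."""
    return historical_var(pnl_scenarios, confidence) / nav

def relative_var(fund_pnl: np.ndarray, reference_pnl: np.ndarray, confidence: float = 0.99) -> float:
    """Relative VaR: ratio of the fund's VaR to that of a reference portfolio."""
    return historical_var(fund_pnl, confidence) / historical_var(reference_pnl, confidence)

# Hypothetical P&L scenarios for a fund and its reference portfolio
rng = np.random.default_rng(0)
fund_pnl = rng.normal(0, 2_000_000, 500)
ref_pnl = rng.normal(0, 1_500_000, 500)
print(absolute_var(fund_pnl, nav=100_000_000))  # VaR as a share of NAV
print(relative_var(fund_pnl, ref_pnl))          # multiple of the reference VaR
```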

The counterparty risk paper follows on from last month’s guidelines for the selection and presentation of data for UCITS’ new Key Investor Information (KII) documents. CESR is seeking to iron out any inconsistencies across European member states under UCITS by providing more prescriptive data and methodological requirements. This new paper is therefore an attempt to provide a sufficient level of clarity on counterparty risk measurement for UCITS in an increasingly cross border environment.

The calculation of counterparty risk for UCITS must be conducted on “at least a daily basis” and even, in some cases, on an intraday basis. This, in turn, places a much greater strain on the counterparty data management systems that feed the risk calculation engines performing this function. In many institutions, this data is not held centrally and must be accessed from across a firm’s legacy, siloed data infrastructure in a much timelier manner than before.
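
As an illustration of what that consolidation step might involve, the sketch below pulls trade-level exposures from hypothetical siloed systems and aggregates them per counterparty ahead of the daily calculation; the feed structures and field names are assumptions, not any firm’s actual interfaces.

```python
# Sketch: consolidate exposures from separate source systems into a
# per-counterparty view for the daily (or intraday) risk calculation.
from collections import defaultdict
from typing import Dict, Iterable

def aggregate_exposures(trade_feeds: Iterable[Iterable[dict]]) -> Dict[str, float]:
    """Sum positive mark-to-market exposure per counterparty across all feeds."""
    exposure: Dict[str, float] = defaultdict(float)
    for feed in trade_feeds:
        for trade in feed:
            # Only positive replacement cost contributes to counterparty risk
            exposure[trade["counterparty"]] += max(trade["mark_to_market"], 0.0)
    return dict(exposure)

# Hypothetical feeds from two legacy systems
swaps_system = [{"counterparty": "BANK_A", "mark_to_market": 1_200_000}]
fx_system = [{"counterparty": "BANK_A", "mark_to_market": -300_000},
             {"counterparty": "BANK_B", "mark_to_market": 450_000}]
print(aggregate_exposures([swaps_system, fx_system]))
# {'BANK_A': 1200000.0, 'BANK_B': 450000.0}
```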

Rather than just looking at one aspect of risk data in isolation, CESR is also asking firms to assess the UCITS’ exposure to “all material risks including market risks, liquidity risks, counterparty risks and operational risks”. This is a much more holistic approach to the space, but one that asks a lot of risk data architectures, which in most firms are still siloed according to risk type.
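
A minimal sketch of what a fund-level collection of those measures could look like is shown below, assuming hypothetical per-risk-type engines behind a common interface; the engine names and the figures they return are placeholders.

```python
# Sketch: gather every material risk measure for a fund from siloed engines.
from typing import Callable, Dict

# Each siloed engine exposes its own assessment call for a fund (stand-ins here)
risk_engines: Dict[str, Callable[[str], float]] = {
    "market": lambda fund: 0.12,
    "liquidity": lambda fund: 0.05,
    "counterparty": lambda fund: 0.03,
    "operational": lambda fund: 0.01,
}

def holistic_assessment(fund_id: str) -> Dict[str, float]:
    """Collect all material risk measures for a fund in one pass."""
    return {risk_type: engine(fund_id) for risk_type, engine in risk_engines.items()}

print(holistic_assessment("UCITS_FUND_001"))
```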

The requirements around the risk calculations themselves are also much more prescriptive: the regulator has provided a new level of detail about the assets that may be used as collateral and the cover rules for transactions in financial derivative instruments, for example. This means that new data checks must be put in place and data workflows altered to take these requirements into account. Moreover, CESR is looking for more transparency into these calculations, so the accompanying data must be granular, detailed and precise enough to stand up to new levels of scrutiny.
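
As a rough illustration of the kind of data check involved, the sketch below filters collateral against basic eligibility criteria and applies haircuts before netting it off a gross counterparty exposure. The eligible asset types and haircut levels are illustrative assumptions, not the specific rules set out in the guidelines.

```python
# Sketch: net eligible, haircut-adjusted collateral off a gross exposure.
ELIGIBLE_TYPES = {"cash", "government_bond"}       # assumed eligible asset types
HAIRCUTS = {"cash": 0.0, "government_bond": 0.02}  # assumed haircuts

def netted_exposure(gross_exposure: float, collateral: list) -> float:
    """Gross exposure less the haircut value of eligible collateral, floored at zero."""
    eligible_value = sum(
        item["market_value"] * (1 - HAIRCUTS[item["asset_type"]])
        for item in collateral
        if item["asset_type"] in ELIGIBLE_TYPES and item["daily_priced"]
    )
    return max(gross_exposure - eligible_value, 0.0)

collateral = [
    {"asset_type": "government_bond", "market_value": 800_000, "daily_priced": True},
    {"asset_type": "equity", "market_value": 500_000, "daily_priced": True},  # not eligible here
]
print(netted_exposure(1_000_000, collateral))  # 216000.0
```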

For example, for VaR calculations, a great deal of emphasis is being placed on the underlying data quality. The paper states: “The quantitative models used within the VaR framework (pricing tools, estimation of volatilities and correlations, etc) should provide for a high level of accuracy. All data used within the VaR framework should provide for consistency, timeliness and reliability.”
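
What such checks might look like in practice is sketched below: a simple gate over a price series that flags stale data (timeliness), duplicate or missing observations (consistency) and implausible jumps (reliability). The thresholds are illustrative assumptions, not figures from the paper.

```python
# Sketch: basic data quality gate over a price series feeding a VaR model.
import datetime as dt
from typing import List

def check_price_series(dates: List[dt.date], prices: List[float],
                       as_of: dt.date, max_staleness_days: int = 1,
                       max_daily_move: float = 0.5) -> List[str]:
    """Return a list of data quality issues; an empty list means the series passes."""
    issues = []
    if (as_of - dates[-1]).days > max_staleness_days:
        issues.append(f"stale: last observation {dates[-1]}")
    if len(dates) != len(set(dates)) or len(dates) != len(prices):
        issues.append("inconsistent: duplicate dates or missing prices")
    for prev, curr in zip(prices, prices[1:]):
        if prev > 0 and abs(curr / prev - 1) > max_daily_move:
            issues.append(f"suspect jump: {prev} -> {curr}")
    return issues

dates = [dt.date(2010, 7, 26), dt.date(2010, 7, 27), dt.date(2010, 7, 28)]
prices = [101.2, 101.5, 155.0]  # final print looks like a bad tick
print(check_price_series(dates, prices, as_of=dt.date(2010, 7, 29)))
```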

In line with other recent risk measurement requirements coming from the European level, CESR recommends that firms pursue an active back testing and stress testing regime for these counterparty risk assessments. This places more emphasis on the storage and accessibility of historical data to support these new processes, which CESR recommends should be carried out on a monthly basis in the case of stress testing.
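
A minimal sketch of the back testing step, assuming a simple exception-counting approach: count the days on which the realised loss exceeded the previous VaR forecast. The escalation threshold used here is an illustrative assumption, not a figure from the guidelines.

```python
# Sketch: VaR back testing by counting exceptions over a window.
from typing import List

def count_exceptions(daily_pnl: List[float], daily_var: List[float]) -> int:
    """daily_var[i] is the VaR forecast for day i; a loss beyond it is an exception."""
    return sum(1 for pnl, var in zip(daily_pnl, daily_var) if -pnl > var)

pnl = [-1.2, 0.4, -3.1, 0.9, -0.2]  # realised daily P&L (millions)
var = [2.0, 2.0, 2.5, 2.5, 2.5]     # one-day VaR forecasts (millions)
exceptions = count_exceptions(pnl, var)
print(exceptions)                    # 1 (day three: loss of 3.1 vs VaR of 2.5)
if exceptions > 4:                   # assumed escalation threshold for the window
    print("model accuracy review required")
```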

CESR also details the documentation requirements that should be incorporated into the new system in order to provide a full audit trail with regards to these risk measurement procedures. This includes holding detailed records about: the risks covered by the model; the model’s methodology; the mathematical assumptions and foundations; the data used; the accuracy and completeness of the risk assessment; the methods used to validate the model; the back testing process; the stress testing process; the validity range of the model; and the operational implementation. A whole new set of data items to be stored away by firms for easy access at the regulator’s request.
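
One way to hold those items in a form that can be produced on demand is sketched below as a simple structured record; the field names follow the list above, while the example values and the storage approach are assumptions.

```python
# Sketch: a structured audit record for a risk model's documentation items.
from dataclasses import dataclass
from typing import List

@dataclass
class RiskModelRecord:
    risks_covered: List[str]
    methodology: str
    mathematical_assumptions: str
    data_sources: List[str]
    accuracy_and_completeness: str
    validation_methods: str
    back_testing_process: str
    stress_testing_process: str
    validity_range: str
    operational_implementation: str

record = RiskModelRecord(
    risks_covered=["counterparty", "market"],
    methodology="historical simulation VaR",
    mathematical_assumptions="i.i.d. returns over the observation window",
    data_sources=["vendor_prices", "internal_trade_store"],
    accuracy_and_completeness="daily reconciliation against front-office marks",
    validation_methods="independent model validation, annual review",
    back_testing_process="daily exception counting against one-day VaR",
    stress_testing_process="monthly historical and hypothetical scenarios",
    validity_range="linear and vanilla option positions only",
    operational_implementation="batch calculation in the central risk engine",
)
```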

These level three guidelines, which accompany CESR’s level two implementing measures for UCITS, are therefore much more prescriptive in a data context than ever before. The regulator is asking firms to be able to produce historical data on demand and to back up their calculations of counterparty risk for these instruments with much more granular data than previously required. However, it has thus far failed to assess the cost of these new requirements for financial institutions, which, given the scale of the challenge, may be considerable for those that have not recently invested in updating their counterparty data management systems.

Related content

WEBINAR

Recorded Webinar: Adopting Entity Data Hierarchies to Address Holistic Risk Management

Firms across the board are struggling to gain a comprehensive view of their counterparty risk. In the wake of the Credit Crisis, regulators have increased their focus on pushing firms to not only better understand risk exposure, but also be able to provide evidence of the analysis they use to create their view of risk....

BLOG

New Managed Data Service from Bloomberg Opens up Enterprise-Wide Access

Bloomberg has launched a new managed service, Data License Plus (DL+), to aggregate a client’s Bloomberg data into a single dataset, enabling users to explore and interact with it through a web-based user interface. The new service aims to make DL data easier to access throughout the enterprise, while reducing the need for data processing....

EVENT

RegTech Summit Virtual

RegTech Summit Virtual will explore how business and operating models have adapted post-COVID and how RegTech can provide agile and enhanced compliance for managing an evolving risk and compliance landscape. As the dust settles, we will look at the outlook for the global RegTech industry, where regulators are focusing as they get back to business, and deep dive into global regulatory priorities for the rest of the year and into 2021.

GUIDE

Regulatory Data Handbook – Fifth Edition

In response to the popularity of the A-Team Regulatory Data Handbook, we have published a fifth edition outlining the essentials of regulations that are likely to have an impact on data and data management at your organisation. New to this edition is a section on RegTech, covering drivers behind the development of innovative regulatory technology,...