The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

CESR Elaborates on New Counterparty Risk Requirements for UCITS, Data Quality Back in the Spotlight

The Committee of European Securities Regulators (CESR) has added yet another paper detailing new data requirements for UCITS as part of its ongoing mandate to provide more clarity around the requirements of the directive, this time focused on counterparty risk measurement. The “guidelines on risk measurement and the calculation of global exposure and counterparty risk for UCITS” (to give it its full name) incorporates principles that define how this data should be dealt with, including a set of quantitative and qualitative requirements around the calculation of relative and absolute value at risk (VaR).
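The distinction between absolute and relative VaR can be sketched in a few lines of code. The following is a minimal illustration using historical simulation; the function name, return series and figures are invented for the example and are not CESR's own parameters.

```python
import numpy as np

def historical_var(returns, confidence=0.99):
    """One-day VaR by historical simulation: the loss level exceeded
    on only (1 - confidence) of the observed days."""
    losses = -np.asarray(returns)
    return float(np.percentile(losses, confidence * 100))

# Illustrative daily return series for a UCITS fund and a reference portfolio
rng = np.random.default_rng(0)
fund_returns = rng.normal(0.0005, 0.012, 500)
reference_returns = rng.normal(0.0004, 0.010, 500)

# Absolute VaR: the fund's potential loss expressed as a fraction of NAV
absolute_var = historical_var(fund_returns)

# Relative VaR: the fund's VaR as a multiple of the reference portfolio's VaR
relative_var = absolute_var / historical_var(reference_returns)
```

In practice the quantitative limits (confidence level, holding period, caps on absolute and relative VaR) are set out in the guidelines themselves; the sketch only shows where the underlying return data enters the calculation.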

The counterparty risk paper follows on from last month’s guidelines for the selection and presentation of data for UCITS’ new Key Investor Information (KII) documents. CESR is seeking to iron out inconsistencies across European member states under UCITS by providing more prescriptive data and methodological requirements. This new paper is therefore an attempt to provide a sufficient level of clarity on counterparty risk measurement for UCITS in an increasingly cross-border environment.

The calculation of counterparty risk for UCITS must be conducted on “at least a daily basis” and even, in some cases, on an intraday basis. This, in turn, places a much greater strain on the counterparty data management systems that feed the risk calculation engines. In many institutions, this data is not held centrally and must be pulled from across a firm’s legacy siloed data infrastructure far more quickly than before.

Rather than just looking at one aspect of risk data in isolation, CESR is also asking firms to assess the UCITS’ exposure to “all material risks including market risks, liquidity risks, counterparty risks and operational risks”. This is a much more holistic approach to the space, but one that asks a lot of risk assessment data architectures, which are almost always siloed by risk type.

The risk calculations themselves are much more prescriptive: the regulator has provided a new level of detail about, for example, the assets that may be used as collateral and the cover rules for transactions in financial derivative instruments. New data checks must therefore be put in place and data workflows altered to accommodate these requirements. Moreover, CESR is looking for more transparency into these calculations, so the accompanying data must be granular, detailed and precise enough to stand up to new levels of scrutiny.

For example, for VaR calculations, a great deal of emphasis is being placed on the underlying data quality. The paper states: “The quantitative models used within the VaR framework (pricing tools, estimation of volatilities and correlations, etc) should provide for a high level of accuracy. All data used within the VaR framework should provide for consistency, timeliness and reliability.”
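What “consistency, timeliness and reliability” might mean in practice is a set of automated checks on the inputs before they reach the VaR engine. The sketch below is hypothetical; the function name and thresholds are assumptions for illustration, not anything prescribed by CESR.

```python
import pandas as pd

def validate_var_inputs(prices: pd.Series,
                        max_gap_fraction: float = 0.05,
                        max_staleness_days: int = 1) -> list:
    """Pre-calculation quality checks on a date-indexed price series:
    completeness, plausibility and timeliness of the data feeding VaR."""
    issues = []
    # Completeness: flag a series with too many missing observations
    if prices.isna().mean() > max_gap_fraction:
        issues.append("too many missing observations")
    # Plausibility: prices should never be zero or negative
    if (prices.dropna() <= 0).any():
        issues.append("non-positive prices in series")
    # Timeliness: the most recent observation should not be stale
    staleness = (pd.Timestamp.today().normalize() - prices.index[-1]).days
    if staleness > max_staleness_days:
        issues.append(f"stale data: last observation is {staleness} days old")
    return issues
```

A series that passes returns an empty list; anything else is a list of findings that can be logged for the audit trail before the calculation is allowed to run.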

Much the same as other recent risk measurement requirements coming down from the European level, CESR recommends that firms pursue an active back testing and stress testing regime for these counterparty risk assessments. This places more emphasis on the storage and accessibility of historical data in order to support these new processes, which CESR recommends should be carried out on a monthly basis in the case of stress testing.
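Back testing in this context typically means comparing realised daily losses against the model’s VaR forecasts and counting “exceptions”. A toy version, with made-up figures purely for illustration, might look like this:

```python
import numpy as np

def count_var_exceptions(returns, var_forecasts):
    """Count the days on which the realised loss exceeded the one-day
    VaR forecast. For a well-calibrated 99% VaR, roughly 1% of days
    should be exceptions."""
    losses = -np.asarray(returns)
    return int((losses > np.asarray(var_forecasts)).sum())

# Illustrative: a constant 99% VaR forecast tested against one year
# (250 trading days) of simulated returns
rng = np.random.default_rng(1)
daily_returns = rng.normal(0.0, 0.01, 250)
var_forecast = np.full(250, 0.0233)   # ~99th percentile loss for sigma = 1%
exceptions = count_var_exceptions(daily_returns, var_forecast)
```

This is exactly why historical data storage matters: the back test needs both the return history and the VaR forecasts the model produced on each of those days.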

CESR also details the documentation requirements that should be incorporated into the new system in order to provide a full audit trail with regards to these risk measurement procedures. This includes holding detailed records about: the risks covered by the model; the model’s methodology; the mathematical assumptions and foundations; the data used; the accuracy and completeness of the risk assessment; the methods used to validate the model; the back testing process; the stress testing process; the validity range of the model; and the operational implementation. A whole new set of data items to be stored away by firms for easy access at the regulator’s request.

These level three guidelines, which accompany CESR’s level two implementing measures for UCITS, are therefore more prescriptive in a data context than ever before. The regulator is asking firms to produce historical data on demand and to back up their calculations of counterparty risk for these instruments with much more granular data than previously required. However, it has thus far failed to assess the cost of these new requirements for financial institutions, which, given the scale of the challenge, may be considerable for those that have not recently invested in updating their counterparty data management systems.
