EU Capital Requirements Directive Update Specifies Minimum Data Standards for Risk Modelling

The European Parliament has finally published amendments to the Capital Requirements Directive in the Official Journal of the European Union, including an update to the internal risk modelling requirements that specifies that “minimum data standards” must be met. Highlighting the global regulatory focus on improving data quality, the update indicates that the supporting data for internal risk models will be scrutinised by regulators to ensure that the models are as accurate as possible.

The amendment states: “The institution’s internal model shall conservatively assess the risk arising from less liquid positions and positions with limited price transparency under realistic market scenarios. In addition, the internal model shall meet minimum data standards. Proxies shall be appropriately conservative and may be used only where available data is insufficient or is not reflective of the true volatility of a position or portfolio.”

Much as supporting data is currently required around evaluated prices to prove the soundness of the models used, the requirement for sound data is now being extended into the wider area of risk modelling. Accordingly, there are a number of references throughout the regulatory paper to the need for “objective and up to date” data.

The introduction of regular stress testing in particular will force firms to invest in the data architectures supporting their risk function. The CRD revisions introduce a new “stressed value at risk” calculation that is to be based on the “10 day, 99th percentile, one-tailed confidence interval value at risk measure of the current portfolio, with value at risk model inputs calibrated to historical data from a continuous 12 month period of significant financial stress relevant to the institution’s portfolio”.
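
To make the calculation concrete, the sketch below shows how a stressed VaR figure of this kind might be derived via historical simulation. It is a minimal illustration only: the portfolio returns are simulated stand-ins for a 12-month stressed period, and the square-root-of-time scaling from a one-day to a 10-day horizon is an assumption of this example rather than something the directive prescribes.

```python
# Minimal sketch: 10-day, 99th percentile, one-tailed stressed VaR via
# historical simulation. The data and the square-root-of-time scaling are
# illustrative assumptions, not requirements of the CRD text.
import numpy as np

def stressed_var(daily_returns, portfolio_value, confidence=0.99, horizon_days=10):
    """One-tailed historical-simulation VaR scaled from a 1-day to an n-day horizon."""
    losses = -np.asarray(daily_returns)               # express losses as positive numbers
    var_1d = np.percentile(losses, confidence * 100)  # 99th percentile 1-day loss
    return var_1d * np.sqrt(horizon_days) * portfolio_value

# Simulated daily returns standing in for a continuous 12-month stressed period (~250 days)
rng = np.random.default_rng(seed=1)
stress_period_returns = rng.normal(loc=-0.001, scale=0.03, size=250)

print(f"10-day 99% stressed VaR: {stressed_var(stress_period_returns, 100_000_000):,.0f}")
```

In practice, of course, the return series would come from the approved historical stress window, and the choice of data and method would be subject to the supervisory approval the directive describes.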

Hence reference data will need to be well integrated with the real-time risk modelling environment, putting pressure on the data workflows currently in place, given that these calculations must be done “at least weekly”. Moreover, the CRD update indicates that the choice of such historical data shall be subject to approval by the competent authorities and to annual review by the institution.

“The Committee of European Banking Supervisors shall monitor the range of practices in this area and draw up guidelines in order to ensure convergence,” it states. To meet these requirements, firms will need to ensure that their data is accurate and readily available for reporting purposes. Firms such as Royal Bank of Scotland (RBS) are therefore already attempting to better support their risk functions in light of this barrage of new rules.

Mark Davies, head of reference data for SSF Risk Services at RBS, explained at a recent event that the bank’s One Risk project is well underway: “A consistent view of data across the organisation is the goal of our current project: that the different silos are sharing the same data from a product and function point of view.”

One can expect that many more of these projects will be launched in the coming months, with vendors also readying their capabilities to capitalise on the tight deadlines involved and offer off-the-shelf solutions to meet specific risk data requirements. You can download the EU document here.
