A Strategic Approach to FRTB Compliance and Broader Risk Data Management

Data management for compliance with the Fundamental Review of the Trading Book (FRTB) regulation is widely acknowledged as a significant challenge, but it is not insurmountable. Banks within scope are assessing how much data they hold in-house and what must be sourced externally, building data management platforms to meet FRTB requirements for data consistency across asset classes, and weighing up which of their trading desks will use the Standardised Approach (SA) and which the Internal Model Approach (IMA) for capital calculations. In some cases, they are also withdrawing products that are too expensive to maintain in terms of capital requirements.

Beyond aiming for the January 2022 FRTB compliance deadline, banks should also seize the opportunity presented by the regulation to build broader data management platforms supporting not only FRTB, but also internal risk management and risk measures such as Value at Risk (VaR), says Eugene Stern, global risk product manager at Bloomberg. He adds: “This requires investment in data and analytics used for FRTB to be made available to other risk models.” Stern also notes that banks committed to the principles of BCBS 239, including data consistency and risk data aggregation, are likely to be a step ahead on FRTB compliance.
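
To make the reuse Stern describes concrete, here is a minimal sketch, assuming the bank already holds the daily P&L series it built for FRTB: the same data can feed a simple one-day historical VaR. The function and the simulated inputs are illustrative only, not any vendor's data or API.

```python
# Minimal sketch: one-day historical VaR computed from the same daily
# P&L time series a bank might already maintain for FRTB. The inputs
# below are simulated and illustrative.
import random

def historical_var(pnl: list[float], confidence: float = 0.99) -> float:
    """Loss at the given confidence level from an empirical P&L
    distribution (result is a positive number representing a loss)."""
    losses = sorted(-p for p in pnl)                    # losses, ascending
    index = min(int(confidence * len(losses)), len(losses) - 1)
    return max(losses[index], 0.0)

random.seed(42)
pnl_series = [random.gauss(0.0, 1_000_000.0) for _ in range(500)]
print(f"99% one-day VaR: {historical_var(pnl_series):,.0f}")
```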

Reviewing the data and data management elements of FRTB, Stern identifies two key challenges: classifying and sourcing data, and building a consistent data platform.

Sourcing data

While the standard data required for FRTB capital calculations includes market data and reference data, some banks are having difficulty with more specific data. Here, Stern cites the SA requirement to classify data into risk buckets defined by the BCBS in its final rules for FRTB, published in January 2019. Data covering index derivatives under the SA can also be difficult to source where there is little transparency and the fund managers holding the data are reluctant to share it.
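
As a rough illustration of the classification task, the sketch below drives SA-style bucket assignment for equities from reference data attributes. The bucket labels are hypothetical placeholders rather than the actual BCBS bucket table, though the USD 2 billion large-cap threshold does appear in the final rules.

```python
# Illustrative sketch: classifying equity positions into SA-style risk
# buckets from reference data. The bucketing below is a hypothetical
# placeholder; the real equity buckets in the BCBS January 2019 final
# rules are defined by market cap, economy and sector.
from dataclasses import dataclass

@dataclass
class EquityRefData:
    ticker: str
    market_cap_usd: float
    emerging_market: bool
    sector: str                       # e.g. "financials", "materials"

LARGE_CAP_USD = 2e9                   # large-cap threshold in the final rules

def sa_equity_bucket(ref: EquityRefData) -> str:
    """Return an illustrative SA bucket label for an equity position."""
    size = "large" if ref.market_cap_usd >= LARGE_CAP_USD else "small"
    economy = "emerging" if ref.emerging_market else "advanced"
    return f"{size}-cap/{economy}/{ref.sector}"

positions = [
    EquityRefData("BANKCO", 5e10, emerging_market=False, sector="financials"),
    EquityRefData("MINERX", 8e8, emerging_market=True, sector="materials"),
]
for p in positions:
    print(p.ticker, "->", sa_equity_bucket(p))
```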

On the IMA front, data for the risk factor eligibility test (RFET) remains difficult to manage, given the demand for both time series data and observed prices of trades or committed quotes. The RFET, Stern suggests, is one of the most controversial elements of the IMA, even though the required number of observed prices was reduced in the FRTB final rules.
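
For illustration, the sketch below implements the RFET counting criteria as set out in the January 2019 final rules: at least 24 real price observations over the previous 12 months with no 90-day window containing fewer than four, or at least 100 observations over the year. It is a simplification; the real test also governs what counts as a real price observation.

```python
# Sketch of the RFET counting criteria from the January 2019 final rules.
# Simplified: a production check would also de-duplicate same-day quotes
# and apply the rules on what qualifies as a real price observation.
from datetime import date, timedelta

def passes_rfet(obs_dates: list[date], as_of: date) -> bool:
    """True if the observation dates make the risk factor modellable."""
    start = as_of - timedelta(days=365)
    obs = sorted(d for d in obs_dates if start <= d <= as_of)
    if len(obs) >= 100:                       # alternative criterion
        return True
    if len(obs) < 24:
        return False
    # Slide a 90-day window across the year; each must hold >= 4 observations.
    day = start
    while day + timedelta(days=90) <= as_of:
        end = day + timedelta(days=90)
        if sum(1 for d in obs if day <= d < end) < 4:
            return False
        day += timedelta(days=1)
    return True

# Example: one observation every 10 days comfortably passes.
obs = [date(2021, 1, 1) + timedelta(days=10 * i) for i in range(37)]
print(passes_rfet(obs, as_of=date(2021, 12, 31)))
```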

He says: “In many cases, banks don’t have enough internal data to prove the risk factors they want to use in their models. They need to pool data with other banks, or source the data from vendors such as Bloomberg. Once price observations are sourced, they can be mapped to risk factors in the model.”
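Once sourced, the mapping step Stern mentions can be as simple as snapping each observation to the nearest modelled point, as in this hypothetical sketch for an interest rate curve; the tenor grid and naming scheme are illustrative assumptions, not a regulatory or vendor convention.

```python
# Illustrative sketch: mapping sourced price observations onto modelled
# curve risk factors by nearest tenor.
CURVE_TENORS = [0.25, 0.5, 1, 2, 5, 10, 30]   # modelled tenors, in years

def map_to_risk_factor(currency: str, tenor_years: float) -> str:
    """Assign an observation to the nearest modelled curve tenor."""
    nearest = min(CURVE_TENORS, key=lambda t: abs(t - tenor_years))
    return f"IR:{currency}:{nearest}Y"

for ccy, tenor in [("USD", 0.3), ("USD", 4.2), ("EUR", 12.0)]:
    print((ccy, tenor), "->", map_to_risk_factor(ccy, tenor))
```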

He suggests enthusiasm for pooling data is waning, and that getting banks to share derivatives data is even more challenging, which will leave a fair number of non-modellable risk factors (NMRFs) for exotic derivatives. This could lead banks to stick with the IMA and manage the NMRFs, drop back to the SA, or, where the capital charges associated with these derivatives are very high, pull out of trading them altogether.

Data platforms

Looking at data platforms and the need for consistency across asset classes to derive calculations such as curvature risk, Stern notes that operations running different calculations, such as risk sensitivities and Greeks, on separate systems won’t work anymore. To resolve the problem, banks are typically extending existing platforms, or outsourcing either their full data requirement or particular components. Stern says: “This will support FRTB compliance, but a longer term strategic approach that overhauls all risk platforms and creates a common platform for internal and external risk management will be more cost effective and flexible.”
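
A simplified sketch of the curvature inputs shows why siloed calculations break down: the SA nets the delta-approximated move out of full up and down shocked revaluations, so the sensitivities and the revaluations must come from the same consistent pipeline. The pricing function, shock size and single-factor aggregation below are illustrative stand-ins for the full SA formula.

```python
# Sketch of per-risk-factor curvature risk: the worse of the up/down
# shocked P&Ls after stripping out the first-order (delta) move. The
# toy price function and simple max aggregation are simplifications.

def curvature_risk(price, x: float, rw: float, delta: float) -> float:
    """Worse of the up/down shocked P&Ls net of the delta move."""
    v0 = price(x)
    cvr_up = -(price(x * (1 + rw)) - v0 - rw * x * delta)
    cvr_down = -(price(x * (1 - rw)) - v0 + rw * x * delta)
    return max(cvr_up, cvr_down)

def toy_price(x: float) -> float:
    """Toy convex position value, quadratic in the risk factor level."""
    return 100.0 - 0.5 * (x - 3.0) ** 2

x0, rw = 3.5, 0.6                 # risk factor level, curvature risk weight
delta = -(x0 - 3.0)               # analytic dV/dx at x0
print(f"CVR = {curvature_risk(toy_price, x0, rw, delta):.4f}")
```

If the revaluations in price() and the delta fed into curvature_risk() come from different systems with inconsistent market data, the netted curvature number is silently wrong, which is the consistency problem Stern describes.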

A common risk platform, developed in-house or provided by a data vendor, is just one of the benefits that could result from FRTB implementation. Others identified during a recent A-Team webinar, which included Stern and other FRTB experts, were a better understanding of risk, increased competitive advantage, reduced capital requirements, and a potentially more profitable product portfolio.

