Data management for compliance with the Fundamental Review of the Trading Book (FRTB) regulation is widely acknowledged as a significant challenge, but it is not an intractable one. Banks within scope are considering how much data they hold in-house and what must be sourced externally, building data management platforms that meet FRTB requirements for data consistency across asset classes, and weighing up which of their trading desks will use the Standardised Approach (SA) or the Internal Model Approach (IMA) for capital calculations. In some cases, they are withdrawing products that are too expensive to maintain in terms of capital requirements.
Beyond aiming for the January 2022 FRTB compliance deadline, banks should also seize the opportunity presented by the regulation to build broader data management platforms supporting not only FRTB, but also internal risk management and risk elements such as Value at Risk (VaR), says Eugene Stern, global risk product manager at Bloomberg. He adds: “This requires investment in data and analytics used for FRTB to be made available to other risk models.” Stern also notes that banks committed to the principles of regulation BCBS 239, including data consistency and risk data aggregation, are likely to be a step ahead on FRTB compliance.
Reviewing the data and data management elements of FRTB, Stern notes two key challenges: classifying and sourcing data, and building a consistent data platform.
While the standard data required for FRTB capital calculations includes market data and reference data, some banks are having difficulty with more specific datasets. Here, Stern cites the SA requirement to classify data into risk buckets defined by the Basel Committee on Banking Supervision (BCBS) in its final rules for FRTB, published in January 2019. Data covering index derivatives under the SA can also be difficult to source where there is little transparency and the fund managers that hold the data are reluctant to share it.
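The bucketing exercise Stern describes amounts to a mapping from instrument attributes to the BCBS bucket table. The Python sketch below is illustrative only: the USD 2 billion large-cap threshold follows the final rules, but the bucket numbers and the sector split are simplified stand-ins for the full equity bucket table, not the actual rule set.

```python
# Illustrative sketch of SA equity delta bucketing. Simplified: the full
# bucket table is set out in the BCBS final FRTB rules; the sector
# grouping and bucket numbers below are abbreviated placeholders.

LARGE_CAP_THRESHOLD_USD = 2_000_000_000  # FRTB large-cap definition

def classify_equity_bucket(market_cap_usd: float, economy: str, sector: str) -> int:
    """Map an equity risk factor to a simplified SA risk bucket."""
    large = market_cap_usd >= LARGE_CAP_THRESHOLD_USD
    emerging = economy == "emerging"
    cyclical = sector in {"consumer_discretionary", "industrials", "financials"}
    if large and emerging:
        return 1 if cyclical else 2   # illustrative split only
    if large and not emerging:
        return 5 if cyclical else 6
    return 9 if emerging else 10      # small-cap buckets

# Example: a large-cap, advanced-economy financial
print(classify_equity_bucket(5e9, "advanced", "financials"))  # -> 5
```

In practice the difficulty Stern points to is not the lookup itself but sourcing the reference data (market cap, economy, sector) consistently enough to drive it.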
On the IMA front, data for the risk factor eligibility test (RFET) remains difficult to manage due to the demand for both time series data and observed prices of trades or committed quotes. The RFET, Stern suggests, is one of the most controversial elements of the IMA, even though the required number of observed prices was reduced in the final FRTB rules.
He says: “In many cases, banks don’t have enough internal data to prove the risk factors they want to use in their models. They need to pool data with other banks, or source the data from vendors such as Bloomberg. Once price observations are sourced, they can be mapped to risk factors in the model.”
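The RFET itself can be paraphrased as a counting test: under the January 2019 final rules, a risk factor is modellable if it has at least 24 real price observations over the preceding 12 months with no 90-day period containing fewer than four, or at least 100 observations in total. The Python sketch below illustrates the check; it is a paraphrase for illustration, not a compliance implementation, and the function name is our own.

```python
# Sketch of the RFET modellability check as paraphrased from the
# January 2019 final rules. Illustrative only, not a compliance tool.

from datetime import date, timedelta

def is_modellable(observation_dates: list[date], as_of: date) -> bool:
    window_start = as_of - timedelta(days=365)
    obs = sorted(d for d in observation_dates if window_start <= d <= as_of)
    if len(obs) >= 100:                      # alternative criterion: 100 in 12 months
        return True
    if len(obs) < 24:
        return False
    # main criterion: every 90-day period must contain at least 4 observations
    day = window_start
    while day + timedelta(days=90) <= as_of:
        in_window = sum(1 for d in obs if day <= d < day + timedelta(days=90))
        if in_window < 4:
            return False
        day += timedelta(days=1)
    return True
```

The sketch makes Stern's point concrete: a bank can hold plenty of observations in aggregate and still fail the test if they are clustered, which is why pooled or vendor-sourced observations matter.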
He suggests enthusiasm for pooling data is waning, that getting banks to share derivatives data is even more challenging, and that the result will be a fair number of non-modellable risk factors (NMRFs) for exotic derivatives. This could lead banks to stick with the IMA and manage the NMRFs, drop back to the SA, or, if the capital charges associated with these derivatives are very high, pull out of trading particular derivatives altogether.
Looking at data platforms and the need for consistency across asset classes to calculate curvature risk, Stern notes that operations running different calculations, such as risk sensitivities and greeks, in separate silos won't work anymore. To resolve the problem, banks are typically extending existing platforms, or outsourcing either their full data requirement or particular components. Stern says: “This will support FRTB compliance, but a longer term strategic approach that overhauls all risk platforms and creates a common platform for internal and external risk management will be more cost effective and flexible.”
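The curvature calculation shows why that consistency matters: it combines full revaluations under shocked risk factor levels with the same delta sensitivities used elsewhere in the SA, so prices and greeks must come from one coherent source. A simplified single-risk-factor sketch in Python, with illustrative names and with the cross-factor aggregation of the final rules omitted:

```python
# Simplified sketch of the SA curvature calculation for one risk factor:
# full revaluation under up and down shocks, with the delta effect
# stripped out. Names are illustrative; aggregation across risk factors
# and correlation scenarios in the final rules is omitted.

def curvature_risk(price, spot: float, delta: float, rw: float):
    """Return (CVR+, CVR-) for a single risk factor.

    price : callable mapping the risk factor level to a portfolio value
    spot  : current risk factor level
    delta : delta sensitivity of the portfolio to the risk factor
    rw    : curvature risk weight, applied as a relative shock
    """
    v0 = price(spot)
    shock = rw * spot
    cvr_up = -((price(spot + shock) - v0) - delta * shock)
    cvr_down = -((price(spot - shock) - v0) + delta * shock)
    return cvr_up, cvr_down
```

If the revaluations and the delta come from different systems with inconsistent market data, the subtraction above mixes incompatible numbers, which is exactly the siloed-calculation problem Stern describes.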
If a common risk platform, developed in-house or provided by a data vendor, is one of the benefits that could result from FRTB implementation, others identified during a recent A-Team webinar, which included Stern and other FRTB experts, included a better understanding of risk, increased competitive advantage, reduced capital requirements, and a potentially more profitable product portfolio.