
A-Team Insight Blogs

A Strategic Approach to FRTB Compliance and Broader Risk Data Management


Data management for compliance with Fundamental Review of the Trading Book (FRTB) regulation has been acknowledged as a significant challenge, but it is not without hope. Banks within scope are assessing how much data they hold in-house and what must be sourced externally, building data management platforms that meet FRTB requirements for data consistency across asset classes, and weighing up which of their trading desks will use the Standardised Approach (SA) and which the Internal Model Approach (IMA) to make capital calculations. In some cases, they are withdrawing products that are too expensive to maintain in terms of capital requirements.

Beyond aiming for the January 2022 FRTB compliance deadline, banks should also seize the opportunity presented by the regulation to build broader data management platforms supporting not only FRTB, but also internal risk management and risk elements such as Value at Risk (VaR), says Eugene Stern, global risk product manager at Bloomberg. He adds: “This requires investment in data and analytics used for FRTB to be made available to other risk models.” Stern also notes that banks committed to the principles of regulation BCBS 239, including data consistency and risk data aggregation, are likely to be a step ahead on FRTB compliance.
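As a rough illustration of the kind of reuse Stern describes, the sketch below computes a one-day historical-simulation VaR from a vector of portfolio P&L scenarios of the sort a bank could derive from the same market data time series it assembles for FRTB. The data, confidence level and function names are illustrative assumptions, not Bloomberg's or any bank's actual implementation.

```python
import numpy as np

def historical_var(pnl_scenarios: np.ndarray, confidence: float = 0.99) -> float:
    """One-day historical-simulation VaR from a vector of P&L scenarios.

    The same time series used to build FRTB scenarios can feed this
    calculation, which is the kind of reuse described above.
    """
    # VaR is the loss threshold exceeded with probability (1 - confidence).
    return -np.percentile(pnl_scenarios, 100 * (1 - confidence))

# Illustrative data only: 500 days of portfolio P&L built from shared market data.
rng = np.random.default_rng(0)
pnl = rng.normal(loc=0.0, scale=1_000_000.0, size=500)
print(f"99% 1-day VaR: {historical_var(pnl):,.0f}")
```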

Reviewing the data and data management elements of FRTB, Stern notes two key challenges: classifying and sourcing data, and building a consistent data platform.

Sourcing data

While the standard data required for FRTB capital calculations includes market data and reference data, some banks are having difficulty with more specific data. Here, Stern cites the SA requirement to classify data into risk buckets defined by the BCBS in its final rules for FRTB, published in January 2019. Data covering index derivatives under the SA can also be difficult to source where there is little transparency and the fund managers holding the data are reluctant to share it.
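As a simplified picture of the classification task, the sketch below assigns equity positions to SA-style risk buckets keyed on market capitalisation, economy and sector, the dimensions the BCBS bucket table uses for equities. The bucket labels, data layout and helper names are hypothetical simplifications rather than the actual BCBS bucket definitions.

```python
# Illustrative sketch only: the bucket labels below are simplified
# stand-ins for the BCBS equity risk bucket table, not the FRTB SA
# definitions themselves.
from dataclasses import dataclass

LARGE_CAP_THRESHOLD = 2_000_000_000  # USD 2bn large-cap cut-off (assumption for this sketch)

@dataclass
class EquityInstrument:
    name: str
    market_cap: float   # in USD
    economy: str        # e.g. "advanced economy" or "emerging market"
    sector: str         # e.g. "financials", "technology"

def assign_equity_bucket(inst: EquityInstrument) -> str:
    """Return a hypothetical SA-style risk bucket label for an equity exposure."""
    size = "large cap" if inst.market_cap >= LARGE_CAP_THRESHOLD else "small cap"
    return f"{size} / {inst.economy} / {inst.sector}"

positions = [
    EquityInstrument("ACME Corp", 5.0e9, "advanced economy", "technology"),
    EquityInstrument("Beta Ltd", 4.0e8, "emerging market", "financials"),
]
for p in positions:
    print(p.name, "->", assign_equity_bucket(p))
```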

On the IMA front, data for the risk factor eligibility test (RFET) remains difficult to manage due to the demand for both time series data and observed prices of trades or committed quotes. The RFET, Stern suggests, is one of the most controversial elements of the IMA, even though the required number of observed prices was reduced in the FRTB final rules.

He says: “In many cases, banks don’t have enough internal data to prove the risk factors they want to use in their models. They need to pool data with other banks, or source the data from vendors such as Bloomberg. Once price observations are sourced, they can be mapped to risk factors in the model.”
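The observation counts behind the RFET can be stated compactly: under the January 2019 final rules, a risk factor is modellable if it has at least 24 real price observations over the previous 12 months with no 90-day period containing fewer than four observations, or at least 100 observations over the same period. The sketch below, which assumes observations have already been mapped to risk factors and deduplicated, is a minimal check of those counts and ignores the detailed rules on what qualifies as a real price observation.

```python
from datetime import date, timedelta

def passes_rfet(obs_dates: list[date], as_of: date) -> bool:
    """Minimal check of the FRTB risk factor eligibility test for one risk factor.

    Modellable if, over the previous 12 months, there are either
    (a) at least 24 real price observations with no 90-day period
    containing fewer than four observations, or (b) at least 100
    observations.  Simplified sketch: what counts as a 'real' price
    observation is out of scope here.
    """
    window_start = as_of - timedelta(days=365)
    obs = sorted(d for d in obs_dates if window_start <= d <= as_of)

    if len(obs) >= 100:          # criterion (b)
        return True
    if len(obs) < 24:            # fails both criteria
        return False

    # Criterion (a): every 90-day period in the window must hold >= 4 observations.
    day = window_start
    while day + timedelta(days=90) <= as_of:
        period_end = day + timedelta(days=90)
        if sum(1 for d in obs if day <= d < period_end) < 4:
            return False
        day += timedelta(days=1)
    return True
```

Once price observations are pooled or sourced externally and mapped to risk factors, a check along these lines determines which factors fall into the non-modellable category.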

He suggests enthusiasm for pooling data is waning, and that getting banks to share derivatives data is even more challenging and will lead to a fair number of non-modellable risk factors (NMRFs) for exotic derivatives. This could lead banks to stick with the IMA and manage the NMRFs, drop back to the SA, or, if the capital charges associated with these derivatives are very high, pull out of trading particular derivatives altogether.

Data platforms

Looking at data platforms and the need for consistency across asset classes to derive calculations of curvature risk, Stern notes that operations running different calculations, such as risk sensitivities and greeks, in separate silos will no longer work. To resolve the problem, banks are typically extending existing platforms, or outsourcing either their full data requirement or particular components. Stern says: “This will support FRTB compliance, but a longer term strategic approach that overhauls all risk platforms and creates a common platform for internal and external risk management will be more cost effective and flexible.”
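To make the consistency point concrete, the sketch below drives both delta sensitivities and curvature shocks off a single pricing function and a single set of market inputs, rather than computing them in separate silos. The pricing function, shock size and risk weight are placeholders, and the formulas are a simplified rendering of the SA curvature numbers rather than the full BCBS specification.

```python
# Simplified sketch: one pricing function feeds both delta and curvature,
# keeping the two measures consistent.  price_fn and risk_weight are
# illustrative placeholders, not FRTB SA parameters.
from typing import Callable

def delta_sensitivity(price_fn: Callable[[float], float],
                      x: float, bump: float = 1e-4) -> float:
    """Central finite-difference delta with respect to risk factor x."""
    return (price_fn(x + bump) - price_fn(x - bump)) / (2 * bump)

def curvature_shocks(price_fn: Callable[[float], float],
                     x: float, risk_weight: float) -> tuple[float, float]:
    """Per-risk-factor curvature numbers in the spirit of the SA:
    CVR+ = -(V(x shocked up)   - V(x) - RW * x * delta)
    CVR- = -(V(x shocked down) - V(x) + RW * x * delta)
    """
    v0 = price_fn(x)
    delta = delta_sensitivity(price_fn, x)
    up, down = x * (1 + risk_weight), x * (1 - risk_weight)
    cvr_up = -(price_fn(up) - v0 - risk_weight * x * delta)
    cvr_down = -(price_fn(down) - v0 + risk_weight * x * delta)
    return cvr_up, cvr_down

# Example: a toy convex payoff priced off a single equity risk factor.
payoff = lambda s: max(s - 100.0, 0.0) + 0.5 * (s / 100.0) ** 2
print(curvature_shocks(payoff, x=105.0, risk_weight=0.3))
```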

A common risk platform, developed in-house or provided by a data vendor, is one of the benefits that could result from FRTB implementation. Others identified during a recent A-Team webinar featuring Stern and other FRTB experts include a better understanding of risk, increased competitive advantage, reduced capital requirements, and a potentially more profitable product portfolio.

