The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

A Dive Into the Detail of Sourcing and Managing Data for FRTB

There is no silver bullet for compliance with the Fundamental Review of the Trading Book (FRTB). The data sourcing and management requirements of the regulation are among the most difficult that capital markets participants have ever faced, the cost of implementation can be crippling, and the January 2022 compliance deadline has been finalised.

So, how best can banks approach the data management challenges of the regulation, decide which trading desks should run the regulation’s Internal Model Approach (IMA) or Standardised Approach (SA) to calculate market risk capital requirements, and will any good come from compliance?

A recent A-Team Group webinar discussed these issues and more, initially noting that the time to start work on FRTB compliance is now, if you haven't started already, and that many firms need to pick up the pace of understanding and sourcing data that has never previously been required, such as the extensive historical data and observable pricing points needed to meet the Risk Factor Eligibility Test (RFET) and Non-Modellable Risk Factor (NMRF) elements of the regulation.
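To make the RFET data challenge concrete, the sketch below checks whether a risk factor has enough real price observations to count as modellable. The thresholds follow the criteria published in the BCBS final rules of January 2019 (at least 24 real price observations over the preceding 12 months with no 90-day window containing fewer than four, or at least 100 observations over the period); this is a simplified illustration, not a compliant implementation, and production tests involve further detail such as how observations are counted and bucketed.

```python
from datetime import date, timedelta

def is_modellable(observation_dates, as_of):
    """Simplified RFET check: does this risk factor have enough
    real price observations over the trailing 12 months?

    Passes if there are at least 100 observations, or at least 24
    with no rolling 90-day window containing fewer than four.
    """
    start = as_of - timedelta(days=365)
    obs = sorted(d for d in observation_dates if start <= d <= as_of)
    if len(obs) >= 100:
        return True
    if len(obs) < 24:
        return False
    # Every rolling 90-day window must contain at least 4 observations
    window = timedelta(days=90)
    t = start
    while t + window <= as_of:
        if sum(1 for d in obs if t <= d < t + window) < 4:
            return False
        t += timedelta(days=1)
    return True
```

A risk factor with weekly price observations passes comfortably; one with only a handful of observations in the year fails and falls into the NMRF category, attracting a punitive capital add-on — which is why sourcing pooled observation data from external providers matters so much.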

Looking at whether to run trading desks under IMA or SA, an early poll of the webinar audience asked what considerations banks take into account when deciding whether or not to run the IMA. Some 67% said cost of implementation, 61% complexity of implementation, 44% resulting capital charges, 33% sourcing data required solely for the IMA, and a further 33% real-time reporting of intra-day risk.

Considering the challenges raised by selecting the IMA, webinar speaker Bradley Foster, global head of content at Bloomberg, said: “One of the biggest challenges is making sure all required data is brought together, normalised and stored in a data warehouse or data lake. The need then is to understand whether you have the right data, particularly pricing data, and work with external data providers to source any additional data that is needed.” Satinder Jandu, director at Viewset, added a comment on the increased granularity of data required for the IMA.

The SA is less demanding than the IMA, but comes with its own problems: sourcing data such as pricing data for risk sensitivities, managing large volumes of data (as with the IMA), and classifying data into the risk buckets defined by the BCBS in its final rules for FRTB, published in January 2019.
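The risk-bucketing step can be sketched in a few lines. Under the SA's sensitivities-based method, general interest rate risk (GIRR) deltas are grouped by currency and slotted onto prescribed tenor vertices; the vertex list below matches the structure in the January 2019 rules, but the snapping logic (nearest vertex, with no interpolation or risk weighting) is a deliberate simplification for illustration only.

```python
from collections import defaultdict

# Prescribed GIRR delta tenor vertices (in years) from the BCBS final rules
GIRR_VERTICES = [0.25, 0.5, 1, 2, 3, 5, 10, 15, 20, 30]

def nearest_vertex(tenor_years):
    """Map a raw tenor to the nearest regulatory GIRR vertex."""
    return min(GIRR_VERTICES, key=lambda v: abs(v - tenor_years))

def bucket_girr_deltas(sensitivities):
    """Aggregate raw delta sensitivities into (currency, vertex) buckets.

    `sensitivities` is an iterable of (currency, tenor_years, delta).
    Each currency forms its own bucket; deltas landing on the same
    vertex are summed. Real implementations apportion sensitivities
    between adjacent vertices rather than snapping to the nearest one.
    """
    buckets = defaultdict(float)
    for ccy, tenor, delta in sensitivities:
        buckets[(ccy, nearest_vertex(tenor))] += delta
    return dict(buckets)
```

Even this toy version shows why data classification is a sourcing problem: every sensitivity must arrive tagged with the attributes (currency, tenor, risk class) needed to place it in the right bucket before any capital calculation can run.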

Best practice approaches to solving the problems presented by the SA include taking a holistic view of data, simplifying processes where possible and leveraging existing work around the European Central Bank’s Targeted Review of Internal Models (TRIM) and BCBS 239. Jandu said: “You must take a strategic approach; the time for working regulation by regulation is over. Leverage existing work to fix problems, or if you have money, build a Rolls Royce cloud-based solution.”

With problems ironed out, the potential gains of FRTB compliance can be significant. At base level they include the results of a systems review required to implement FRTB. Moving up the value chain, firms can gain greater transparency of risk, create more detailed analytical views, and better manage, or reduce, capital requirements.

Concluding the webinar with some guidance for practitioners working on FRTB, Jerry Goddard, former director of traded risk at Santander UK, echoed Jandu, saying: “Don’t do this in isolation to other regulations. Focus on data that is difficult to source, make decisions on key desks and move forward.” Foster added: “The time is now to get your data right, achieve data consistency across the front, middle and back office, and find a vendor to help with data on the FRTB journey.”
