
A-Team Insight Blogs

A Dive Into the Detail of Sourcing and Managing Data for FRTB


There is no silver bullet for compliance with the Fundamental Review of the Trading Book (FRTB). The data sourcing and management requirements of the regulation are among the most difficult that capital markets participants have ever faced, the cost of implementation can be crippling, and the January 2022 compliance deadline has been finalised.

So, how can banks best approach the data management challenges of the regulation? How should they decide which trading desks run the regulation’s Internal Model Approach (IMA) and which its Standardised Approach (SA) to calculate market risk capital requirements? And will any good come of compliance?

A recent A-Team Group webinar discussed these issues and more. It opened by noting that the time to start work on FRTB compliance is now, if you haven’t started already, and that many firms need to pick up the pace of understanding and sourcing data that has never previously been required, such as the extensive historical data and observable pricing points needed to meet the Risk Factor Eligibility Test (RFET) and Non-Modellable Risk Factor (NMRF) elements of the regulation.
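To see why the RFET drives such heavy data sourcing, consider a simplified sketch of its observability criteria: under the BCBS final standard, a risk factor is broadly modellable if, over the preceding 12 months, it has at least 24 real price observations with no 90-day window containing fewer than four of them, or at least 100 observations in total. The function below is an illustrative paraphrase of that test, not a production implementation; the 12-month window handling and date conventions are assumptions.

```python
from datetime import date, timedelta

def passes_rfet(obs_dates, as_of):
    """Simplified RFET observability check (illustrative only).

    Modellable if, over the preceding 12 months, there are >= 100 real
    price observations, or >= 24 observations with no 90-day window
    containing fewer than 4 of them.
    """
    window_start = as_of - timedelta(days=365)
    obs = sorted(d for d in obs_dates if window_start <= d <= as_of)
    if len(obs) >= 100:
        return True
    if len(obs) < 24:
        return False
    # Slide a 90-day window across the 12-month period day by day,
    # failing the test if any window holds fewer than 4 observations.
    day = window_start
    while day + timedelta(days=90) <= as_of:
        in_window = sum(1 for d in obs if day <= d < day + timedelta(days=90))
        if in_window < 4:
            return False
        day += timedelta(days=1)
    return True
```

Even this toy version makes the data problem visible: a bank needs a full year of dated, verifiable price observations per risk factor just to know which side of the modellable/non-modellable line each factor falls on.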

Looking at whether to run trading desks under IMA or SA, an early poll of the webinar audience asked what considerations banks take into account when deciding whether or not to run the IMA. Some 67% said cost of implementation, 61% complexity of implementation, 44% resulting capital charges, 33% sourcing data required solely for the IMA, and a further 33% real-time reporting of intra-day risk.

Considering the challenges raised by selecting the IMA, webinar speaker Bradley Foster, global head of content at Bloomberg, said: “One of the biggest challenges is making sure all required data is brought together, normalised and stored in a data warehouse or data lake. The need then is to understand whether you have the right data, particularly pricing data, and work with external data providers to source any additional data that is needed.” Satinder Jandu, director at Viewset, added a comment on the increased granularity of data required for the IMA.

The SA is less demanding than the IMA, but it comes with its own problems: sourcing data, such as pricing data for risk sensitivities; managing large volumes of data (as with the IMA); and classifying data into the risk buckets defined by the BCBS in its final FRTB rules, published in January 2019.
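The bucketing and sensitivities work behind the SA can be sketched in a few lines. In the sensitivities-based method, each sensitivity is risk-weighted and then aggregated within its bucket using prescribed correlations. The snippet below is a minimal illustration of that shape only; the bucket names, risk weights and the flat correlation are made-up placeholders, not the actual BCBS parameters.

```python
import math

# Hypothetical risk weights per bucket (NOT the actual BCBS values).
RISK_WEIGHTS = {"GIRR_1": 0.017, "CSR_IG": 0.005}
INTRA_BUCKET_RHO = 0.35  # flat illustrative intra-bucket correlation

def bucket_capital(bucket, sensitivities):
    """Within-bucket aggregation of weighted sensitivities.

    K_b = sqrt(max(0, sum_k sum_l rho_kl * WS_k * WS_l)),
    where WS_k = RW_b * s_k and rho_kl = 1 when k == l.
    """
    ws = [RISK_WEIGHTS[bucket] * s for s in sensitivities]
    total = 0.0
    for i, wi in enumerate(ws):
        for j, wj in enumerate(ws):
            rho = 1.0 if i == j else INTRA_BUCKET_RHO
            total += rho * wi * wj
    return math.sqrt(max(0.0, total))
```

The point of the sketch is the data dependency it exposes: every sensitivity must first be classified into the right bucket before any risk weight or correlation can be applied, which is exactly the classification challenge the speakers highlighted.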

Best practice approaches to solving the problems presented by the SA include taking a holistic view of data, simplifying processes where possible and leveraging existing work around the European Central Bank’s Targeted Review of Internal Models (TRIM) and BCBS 239. Jandu said: “You must take a strategic approach, the time for working regulation by regulation is over. Leverage existing work to fix problems, or if you have money, build a Rolls Royce cloud-based solution.”

With problems ironed out, the potential gains of FRTB compliance can be significant. At base level they include the results of a systems review required to implement FRTB. Moving up the value chain, firms can gain greater transparency of risk, create more detailed analytical views, and better manage, or reduce, capital requirements.

Concluding the webinar with some guidance for practitioners working on FRTB, Jerry Goddard, former director of traded risk at Santander UK, echoed Jandu, saying: “Don’t do this in isolation to other regulations. Focus on data that is difficult to source, make decisions on key desks and move forward.” Foster added: “The time is now to get your data right, achieve data consistency across the front, middle and back office, and find a vendor to help with data on the FRTB journey.”

