How to Meet the Data Rules and Requirements of Fundamental Review of the Trading Book Regulation

Implementation of Fundamental Review of the Trading Book (FRTB) regulation is complex, demanding and time consuming, but can be achieved by setting priorities, working with external data vendors, and taking best practice approaches to the data management challenges presented by the regulation.

A well-attended A-Team Group briefing hosted at the Glaziers Hall in London, Meeting the Data Requirements of FRTB, discussed these issues, set out priorities and timelines, offered solutions and provided a forum to answer financial institutions’ outstanding questions on the regulation.

An opening poll asking audience members where they are in meeting FRTB compliance showed 46% making good progress, 38% having made a start, 8% in the planning stage, and 8% yet to start.

Bearing these results in mind, Neels Vosloo, head of regulatory risk EMEA at Bank of America Merrill Lynch, presented a comprehensive keynote covering the rules and requirements of FRTB and the priorities to meet them. He highlighted the progression of FRTB from an initial version in 2010, to publication of revised standards for minimum capital requirements for market risk in 2016, a consultation on these standards in 2018, and final rules based on the consultation announced in January 2019. The deadline for FRTB implementation is January 2022.

Considering the main changes to the standards published in January 2019, Vosloo noted changes to the FRTB Standardised Approach (SA) covering the calibration of risk weights, risk buckets, FX triangulation and the curvature measure. Amendments to the Internal Model Approach (IMA) include changes in the Profit & Loss Attribution Test and tweaks to the observable pricing requirements associated with Non-Modellable Risk Factors (NMRFs).
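The recalibrated risk weights and correlations feed directly into the SA's sensitivities-based aggregation. As an illustration only, the within-bucket delta charge can be sketched as below; the constant in-bucket correlation is a simplification, since the standard prescribes pairwise correlations per risk class and bucket:

```python
import math

def bucket_delta_charge(weighted_sensitivities, rho):
    """Illustrative FRTB SA within-bucket delta aggregation:
    K_b = sqrt(max(0, sum_k WS_k^2 + sum_{k != l} rho * WS_k * WS_l)),
    where each WS_k is a sensitivity already multiplied by its
    prescribed risk weight. rho is assumed constant here for brevity."""
    ws = weighted_sensitivities
    total = sum(w * w for w in ws)
    total += sum(rho * ws[k] * ws[l]
                 for k in range(len(ws))
                 for l in range(len(ws)) if k != l)
    return math.sqrt(max(0.0, total))
```

With two weighted sensitivities of 3 and 4, zero correlation gives a charge of 5, while perfect correlation gives 7, showing how the recalibrated correlations directly move the capital number.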

Based on these changes he suggested firms should prioritise implementation of the SA, which must be in place by 2021 at firms within the EU; achieving a uniform approach to data across the front office, risk and finance; working with external data suppliers on observable pricing for NMRFs; implementing desk structure in terms of IMA versus SA approaches; winning senior management buy-in to make necessary changes; and discussing approval of approaches with regulators.

The elephant in the room, he concluded, is the US, where regulators have published nothing but have said they intend to implement FRTB, which would be a significant challenge for US banks.

Data sourcing and management challenges

Expert panels following the keynote dived into the details of data sourcing for FRTB compliance and best practice approaches to the regulation’s data management challenges.

The data sourcing panel kicked off with an audience poll asking participants which model approach they are taking to assess trading book capital usage. The IMA won out, with 28% saying they will use IMA on all desks and 39% on selected desks. A further 33% remain undecided and none expect to use only the SA.

With the audience setting a high bar on data sourcing through the use of the IMA, the panel – moderated by independent financial regulatory specialist Selwyn Blair-Ford, and joined by Suman Datta, head, portfolio quantitative research at Lloyds Banking Group; Jerry Goddard, former director of traded risk at Santander UK; Bradley Foster, global head of content, Enterprise Data at Bloomberg; and Martijn Groot, vice president, product management at Asset Control – discussed the complexity of running the IMA and the simpler SA. Groot noted that the SA is rules-based and prescriptive, allowing firms to use a data services solution. The IMA requires a more bespoke response, with firms needing to source data both internally and externally.

Goddard commented: “Making the choice of which model to use is difficult, but whichever you choose take FRTB as an opportunity to understand your business model and create a good data model for the business. If you do this, you will be able to make better choices. Just running the numbers will lead to a mess.” Foster added: “If you are a large bank, the IMA is a huge data management burden, but the ability to normalise and link data together will extend way beyond FRTB compliance.”

Considering whether FRTB requires new golden source data, Datta said: “There is advantage in using FRTB as a means to look at how data flows and is updated and then do more to make golden source data by reorganising data, using what you have and looking to vendors to fill gaps.” Groot noted: “The price of inconsistent data is going up dramatically. Providing golden source data from the back-office through to the front-office is a pragmatic approach.”

The panel noted the data challenges posed by elements of the regulation such as the Risk Factor Eligibility Test (RFET) and NMRFs, both of which require historical data that is difficult to source, and concluded that internal data needs to be supplemented with external data. As Foster put it: “The amount of data needed to resolve the problem is huge.”
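The RFET's data burden comes from its observation-count criteria. A minimal sketch, assuming the January 2019 criteria (at least 24 real price observations over the preceding 12 months with no 90-day window containing fewer than four, or at least 100 observations in total), might look like:

```python
from datetime import date, timedelta

def passes_rfet(observation_dates, as_of):
    """Sketch of the RFET observation-count logic. A risk factor is
    treated as modellable if, over the preceding 12 months, it has
    either (a) at least 24 real price observations with every 90-day
    window containing at least 4 of them, or (b) at least 100
    observations in total. Real implementations must also handle
    what counts as a 'real' price observation."""
    window_start = as_of - timedelta(days=365)
    obs = sorted(d for d in observation_dates if window_start <= d <= as_of)
    if len(obs) >= 100:
        return True                      # criterion (b)
    if len(obs) < 24:
        return False
    # criterion (a): slide a 90-day window across the 12-month period
    day = window_start
    while day + timedelta(days=90) <= as_of:
        in_window = sum(1 for d in obs if day <= d < day + timedelta(days=90))
        if in_window < 4:
            return False
        day += timedelta(days=1)
    return True
```

A factor with 26 observations spread evenly through the year passes, while the same count clustered into one month fails, which is why sparse but regular external pricing data is so valuable for reducing NMRFs.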

Segueing into the second panel, an early poll considering the toughest data management challenges of FRTB showed 50% of respondents noting the Profit & Loss Attribution Test, 33% NMRFs, 8% RFET, and 8% the large volumes of data that must be managed to achieve compliance.

This panel was moderated by David Kelly, co-founder and managing director at Quant Foundry; and joined by Adolfo Montoro, director, global head of market data strategy and analytics at Deutsche Bank; Jacob Rank-Broadley, director, regulatory and market structure propositions, at Refinitiv; Jerry Goddard, former director of traded risk at Santander UK; and Satinder Jandu, director at Viewset.

Discussing how to build a framework for FRTB compliance, Goddard noted initial difficulty in winning senior management buy-in, as the immediacy of the regulation has waned since the implementation date was pushed back to 2022, with reporting starting in 2023. That said, he promoted using the SA deadline, which falls before the IMA deadline, to get as many changes as possible in place ahead of the introduction of the IMA.

There is also a need to manage proxies, which will be a large part of determining capital requirements, and map instruments to risk factors. Jandu said: “FRTB pushes risk to the front. You need to build a risk factor framework and work out the taxonomy of risk factors.”
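As an illustration of the kind of taxonomy Jandu describes, the sketch below (names and buckets entirely hypothetical, and far simpler than the prescribed FRTB bucket set) keys each risk factor by risk class, underlying and tenor, so that an instrument maps to the same identifiers across front office, risk and finance:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class RiskFactor:
    """Hypothetical risk factor key for a simplified taxonomy."""
    risk_class: str   # e.g. "GIRR", "FX", "CSR"
    underlying: str   # e.g. "EUR-SWAP", "EURUSD"
    bucket: str       # e.g. a tenor or maturity bucket

def map_swap_to_risk_factors(currency, tenors):
    """Map a vanilla interest-rate swap to one GIRR curve risk
    factor per tenor point (illustrative only)."""
    return [RiskFactor("GIRR", f"{currency}-SWAP", t) for t in tenors]

factors = map_swap_to_risk_factors("EUR", ["1Y", "5Y", "10Y"])
```

Because the keys are shared, sensitivities produced in the front office and capital calculations run in risk aggregate over the same risk factor identifiers, which is the consistency the panel argued FRTB demands.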

Rank-Broadley observed that while financial institutions are setting conservative thresholds for capital, operations teams are looking at how to source the right data to reduce NMRFs and taking a tactical approach to what can be achieved. On this point, Goddard commented: “You need to understand if NMRFs are important to you and then get the data. ‘Everything in the model’ is not the best approach as more NMRFs mean a larger capital requirement.”

Looking ahead, and noting the expected implementation of FRTB in the US, Montoro noted the need to develop building blocks for the different model approaches that will serve both EU and US requirements. Jandu concluded: “We need to make FRTB a strategic enabler.”
