A-Team Insight Blogs

How to Meet the Data Rules and Requirements of Fundamental Review of the Trading Book Regulation

Implementation of Fundamental Review of the Trading Book (FRTB) regulation is complex, demanding and time-consuming, but can be achieved by setting priorities, working with external data vendors, and taking best-practice approaches to the data management challenges presented by the regulation.

A well-attended A-Team Group briefing hosted at the Glaziers Hall in London, Meeting the Data Requirements of FRTB, discussed these issues, set out priorities and timelines, offered solutions and provided a forum to answer financial institutions’ outstanding questions on the regulation.

An opening poll asking audience members where they are in meeting FRTB compliance showed 46% making good progress, 38% having made a start, 8% in the planning stage, and 8% not yet started.

Bearing these results in mind, Neels Vosloo, head of regulatory risk EMEA at Bank of America Merrill Lynch, presented a comprehensive keynote covering the rules and requirements of FRTB and the priorities to meet them. He highlighted the progression of FRTB from an initial version in 2010, to publication of revised standards for minimum capital requirements for market risk in 2016, a consultation on these standards in 2018, and final rules based on the consultation announced in January 2019. The deadline for FRTB implementation is January 2022.

Considering the main changes to the standards published in January 2019, Vosloo noted changes to the FRTB Standardised Approach (SA) covering the calibration of risk weights, risk buckets, FX triangulation and the curvature measure. Amendments to the Internal Model Approach (IMA) include changes in the Profit & Loss Attribution Test and tweaks to the observable pricing requirements associated with Non-Modellable Risk Factors (NMRFs).

Based on these changes he suggested firms should prioritise implementation of the SA, which must be in place by 2021 at firms within the EU; achieving a uniform approach to data across the front office, risk and finance; working with external data suppliers on observable pricing for NMRFs; implementing desk structure in terms of IMA versus SA approaches; winning senior management buy-in to make necessary changes; and discussing approval of approaches with regulators.

The elephant in the room, he concluded, is the US, where regulators have published nothing but have said they intend to implement FRTB, which would present a significant challenge for US banks.

Data sourcing and management challenges

Expert panels following the keynote dived into the details of data sourcing for FRTB compliance and best-practice approaches to the regulation’s data management challenges.

The data sourcing panel kicked off with an audience poll asking participants which model approach they are taking to assess trading book capital usage. The IMA won out, with 28% saying they will use IMA on all desks and 39% on selected desks. A further 33% remain undecided and none expect to use only the SA.

With the audience setting a high bar on data sourcing through the use of the IMA, the panel – moderated by independent financial regulatory specialist Selwyn Blair-Ford, and joined by Suman Datta, head, portfolio quantitative research at Lloyds Banking Group; Jerry Goddard, former director of traded risk at Santander UK; Bradley Foster, global head of content, Enterprise Data at Bloomberg; and Martijn Groot, vice president, product management at Asset Control – discussed the complexity of running the IMA and the simpler SA. Groot noted that the SA is rules-based and prescriptive, allowing firms to use a data services solution. The IMA requires a more bespoke response, with clients needing to source data both internally and externally.

Goddard commented: “Making the choice of which model to use is difficult, but whichever you choose take FRTB as an opportunity to understand your business model and create a good data model for the business. If you do this, you will be able to make better choices. Just running the numbers will lead to a mess.” Foster added: “If you are a large bank, the IMA is a huge data management burden, but the ability to normalise and link data together will extend way beyond FRTB compliance.”

Considering whether FRTB requires new golden source data, Datta said: “There is advantage in using FRTB as a means to look at how data flows and is updated and then do more to make golden source data by reorganising data, using what you have and looking to vendors to fill gaps.” Groot noted: “The price of inconsistent data is going up dramatically. Providing golden source data from the back-office through to the front-office is a pragmatic approach.”

The panel noted the data challenges of elements of the regulation such as the Risk Factor Eligibility Test (RFET) and NMRFs, both of which require historical data that is difficult to source, and concluded that internal data needs to be supplemented with external data. As Foster put it: “The amount of data needed to resolve the problem is huge.”
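To illustrate why the RFET drives such heavy data sourcing demands, the check below is a simplified, hypothetical sketch of an RFET-style observability test, assuming the commonly cited criterion of at least 24 real price observations over the previous 12 months with no gap of 90 days or more between them. The function name and the weekly-observation example are illustrative only; the final January 2019 rules state the criteria differently (including an alternative 100-observation route), so this should not be read as a compliant implementation.

```python
from datetime import date, timedelta

def passes_rfet(observation_dates, as_of):
    """Return True if a risk factor looks modellable under a simplified
    RFET-style test: >= 24 real price observations in the trailing 12
    months, with no gap of 90 days or more between observations."""
    window_start = as_of - timedelta(days=365)
    obs = sorted(d for d in observation_dates if window_start <= d <= as_of)
    if len(obs) < 24:
        return False
    # Check gaps between consecutive observations, including the window edges
    checkpoints = [window_start] + obs + [as_of]
    for prev, nxt in zip(checkpoints, checkpoints[1:]):
        if (nxt - prev).days >= 90:
            return False
    return True

# Weekly observations comfortably satisfy both the count and gap criteria
weekly = [date(2019, 1, 7) + timedelta(weeks=i) for i in range(52)]
print(passes_rfet(weekly, date(2019, 12, 31)))  # True
```

Even this toy version makes the panel’s point concrete: a risk factor with plenty of observations clustered in one period still fails on the gap criterion, which is why firms look to external vendors to fill observation gaps.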

Segueing into the second panel, an early poll considering the toughest data management challenges of FRTB showed 50% of respondents noting the Profit & Loss Attribution Test, 33% NMRFs, 8% RFET, and 8% the large volumes of data that must be managed to achieve compliance.

This panel was moderated by David Kelly, co-founder and managing director at Quant Foundry, and joined by Adolfo Montoro, director, global head of market data strategy and analytics at Deutsche Bank; Jacob Rank-Broadley, director, regulatory and market structure propositions at Refinitiv; Jerry Goddard, former director of traded risk at Santander UK; and Satinder Jandu, director at Viewset.

Discussing how to build a framework for FRTB compliance, Goddard noted initial difficulty in winning senior management buy-in, as the immediacy of the regulation has waned with the implementation date being pushed back to 2022 and reporting starting in 2023. That said, he advocated using the SA deadline, which kicks in before the IMA deadline, to get as many changes as possible in place and ready for the introduction of the IMA.

There is also a need to manage proxies, which will be a large part of determining capital requirements, and map instruments to risk factors. Jandu said: “FRTB pushes risk to the front. You need to build a risk factor framework and work out the taxonomy of risk factors.”

Rank-Broadley observed that financial institutions are setting conservative capital thresholds, while operations teams look at how to source the right data to reduce NMRFs and take a tactical approach to what can be achieved. On this point, Goddard commented: “You need to understand if NMRFs are important to you and then get the data. ‘Everything in the model’ is not the best approach as more NMRFs mean a larger capital requirement.”

Looking ahead, and noting the expected implementation of FRTB in the US, Montoro noted the need to develop building blocks for the different model approaches that will serve both EU and US requirements. Jandu concluded: “We need to make FRTB a strategic enabler.”
