The knowledge platform for the financial technology industry

A-Team Insight Blogs

FRTB Calculation Debates Centre on Style, When They Should be Focusing on Substance


By Tim Versteeg, Managing Director of APAC at NeoXam.

It’s been a turbulent few years for financial institutions when it comes to regulations, and 2020 is no different. With FRTB set to come in at the end of the year, financial institutions are devoting significant time and resources to comply with the regulation. For the vast majority of institutions, the bulk of their time is spent balancing their capital between the Standardised Approach (SA) and the Internal Models Approach (IMA).

Which approach is the better fit depends largely on the size of the bank. For smaller players, the lower capital requirements they could achieve with IMA do not justify the high infrastructure costs in terms of systems, data and processes. The SA approach also has synergies with risk calculations for other regulations, such as the upcoming Initial Margin (IM) requirements. For institutions that will be pulled into the next tranche in September, it makes sense to include SA calculations in their FRTB preparations.

However, banks with larger trading books that include instruments or assets with significant price variation will benefit from the IMA approach.

While more costly in terms of infrastructure, an approved IMA model lowers the capital adequacy reserves that need to be set aside. For larger players, therefore, the long-term benefits outweigh the high initial cost.
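As a toy illustration of that trade-off (all figures are hypothetical, not regulatory numbers), the decision can be framed as a simple break-even between IMA's upfront infrastructure cost and the annual saving from holding less capital:

```python
# Hypothetical break-even sketch: IMA vs SA.
# Both input figures below are illustrative assumptions.

def breakeven_years(ima_infra_cost, annual_capital_saving):
    """Years until IMA's lower capital charge repays its
    higher infrastructure build cost."""
    if annual_capital_saving <= 0:
        return float("inf")  # IMA never pays off
    return ima_infra_cost / annual_capital_saving

# Assumed inputs: a $20m build cost against a $5m/year
# funding-cost saving from the lower reserves an approved
# IMA model allows.
years = breakeven_years(20_000_000, 5_000_000)
print(years)  # 4.0 -> viable on a large bank's horizon
```

For a smaller bank the same arithmetic, with a smaller capital saving against a similar build cost, pushes the break-even point out far enough that SA is the rational choice.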

What companies preoccupied with this decision seem to be forgetting is a key piece of the puzzle: their data. A company’s decisions are only as good as the data they are based on.

If the appropriate data processes are not put in place, moving from managing one year of historical data to ten will be not only an extremely time-intensive process but also an incredibly expensive one. Firms need to act now to pre-empt this problem and get their data in order.

The first step is to ensure that everything is bucketed correctly. Whether a firm chooses SA or IMA, what is crucial is ensuring the data forming the foundation of these calculations is reliable and can be accessed quickly.
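In code terms, that first step amounts to mapping every position's risk factor to its bucket and catching anything that falls through. A minimal sketch (the bucket names and field layout below are assumed for illustration; real FRTB buckets are defined in the rules text):

```python
# Minimal bucketing sketch with a hypothetical
# risk-factor -> bucket mapping.

BUCKET_MAP = {
    "EQ_LARGE_CAP": "equity_bucket_1",
    "EQ_SMALL_CAP": "equity_bucket_2",
    "IR_SWAP_USD": "rates_bucket_1",
}

def bucket_positions(positions):
    """Group positions by bucket; unmapped risk factors are
    collected separately so bad data is caught before any
    capital calculation runs on it."""
    bucketed, unmapped = {}, []
    for pos in positions:
        bucket = BUCKET_MAP.get(pos["risk_factor"])
        if bucket is None:
            unmapped.append(pos)
        else:
            bucketed.setdefault(bucket, []).append(pos)
    return bucketed, unmapped

positions = [
    {"id": "T1", "risk_factor": "EQ_LARGE_CAP", "notional": 1e6},
    {"id": "T2", "risk_factor": "FX_UNKNOWN", "notional": 2e6},
]
bucketed, unmapped = bucket_positions(positions)
print(sorted(bucketed))              # ['equity_bucket_1']
print([p["id"] for p in unmapped])   # ['T2']
```

The point of the `unmapped` list is exactly the reliability the article calls for: exceptions surface immediately rather than silently distorting a capital number.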

Automation will also be essential. Manually handling data processes not only takes up a huge amount of time but is practically impossible if this has to be done for 10 years’ worth of historical data.
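To make the scale concrete, consider one of the simplest automated checks: scanning a daily price history for missing business days. Done by hand across ten years and thousands of series it is infeasible; as a script it is trivial (a sketch, with holiday calendars omitted and the data layout assumed):

```python
# Hedged sketch: an automated completeness check over a
# daily price history -- the kind of task that cannot be
# done manually across 10 years of data.

from datetime import date, timedelta

def find_gaps(series, start, end):
    """Return business days in [start, end] missing from
    series (a dict of date -> price). Weekends are skipped;
    holiday calendars are omitted for brevity."""
    gaps, day = [], start
    while day <= end:
        if day.weekday() < 5 and day not in series:
            gaps.append(day)
        day += timedelta(days=1)
    return gaps

series = {date(2020, 1, 2): 101.5, date(2020, 1, 6): 102.0}
print(find_gaps(series, date(2020, 1, 2), date(2020, 1, 6)))
# [datetime.date(2020, 1, 3)]  (Jan 4-5 are a weekend)
```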

The ramifications of keeping processes manual go beyond wasted time. Under BCBS 239, we have seen regulators coming down more frequently on firms that leave their compliance open to human error. With the much larger data sets in scope for FRTB, even semi-automated processes will have to be fully automated.

Having a robust data foundation and automated data processes won’t only help firms with FRTB, but also with other regulations such as IM and ad-hoc regulatory stress testing. As a result, putting in the time and effort now to get your data in order will deliver multiple compliance benefits and prepare companies well for future regulatory changes.

Banks that get their house in order now will benefit both from avoiding compliance problems and from added advantages to their businesses across the board. Automating the data will allow employees to focus their time on getting the most out of the huge amounts of data now available to them, rather than continuing to waste valuable time waiting for batch processes to finish.

The true issue that banks must solve is not whether they choose SA or IMA but how they can best automate their processes. This will allow them to not only efficiently comply with FRTB, but truly add value to their businesses with the extra data required by the regulation.
