

Embracing the Known in FRTB: Why Banks Need to Step Away from the Data Pool and Start with the Familiar


By: Charlie Browne, Head of Market & Risk Data Solutions, GoldenSource.

The Fundamental Review of the Trading Book (FRTB) is coming, and it has sent firms into a spin over how to source the data required to prove risk factor modellability. It is the first time banks will be obligated to do this, a mammoth undertaking that has led many to look at data pooling, where banks, data vendors, exchanges and trade repositories combine their data to demonstrate that a sufficient number of real transactions has previously taken place in each risk factor.
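To make that hurdle concrete, here is a minimal sketch of the kind of check a pooled dataset would need to support. It is illustrative only: the 24-observation count and one-month maximum gap used below are the commonly cited eligibility thresholds and are assumptions for demonstration, not a restatement of the final rule text.

```python
from datetime import date, timedelta

# Illustrative thresholds only: roughly the commonly cited eligibility test
# (at least 24 real price observations over the previous 12 months, with no
# more than about one month between consecutive observations).
MIN_OBSERVATIONS = 24
MAX_GAP_DAYS = 30  # assumed "one month" gap, for demonstration

def is_modellable(observation_dates, as_of):
    """Return True if a risk factor passes this simplified eligibility test."""
    window_start = as_of - timedelta(days=365)
    dates = sorted(d for d in observation_dates if window_start <= d <= as_of)

    if len(dates) < MIN_OBSERVATIONS:
        return False

    # No excessive gap between consecutive real price observations.
    return all((curr - prev).days <= MAX_GAP_DAYS
               for prev, curr in zip(dates, dates[1:]))

# Example: observation dates pooled from several contributors for one risk factor.
pooled = [date(2018, 1, 2) + timedelta(days=14 * i) for i in range(26)]
print(is_modellable(pooled, as_of=date(2018, 12, 31)))  # True
```

Run across thousands of risk factors, even this simplified test makes clear why individual banks doubt they have enough observations of their own.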

It’s a convincing proposition; banks simply do not have enough data of their own. Add to this the fact that market data is expensive, and that the majority of firms are keen to contain costs after several heavy years of regulatory demands, and the initial attraction is clear.

The problem is that firms, at such an early stage of preparations, are getting bogged down in the many intricacies and unknowns of the data pool concept. Would a single vendor become a one-stop shop, or would banks be reluctant to rely on a single source and instead spread the risk by enlisting multiple vendors? Then there is the question of who will be responsible for determining whether a risk factor is modellable, and whether the data pool itself is prepared to face potential questioning from regulators down the line.

Instead of getting stuck on the unknowns of risk factor modellability and data pooling, firms need to take a step back and see the bigger picture of a much wider-reaching set of rules. FRTB was broadly designed to address the shortcomings of Basel 2.5, which failed to resolve many key structural deficiencies in the market risk framework. Ultimately, the intention behind this regulation is much bigger than risk factor modellability: firms need to undertake a fundamental review of their data strategy.

They can begin to approach this task by getting the right data processes in place at the outset of FRTB preparations. This means having accurate and accessible market and risk datasets, and the right systems in place to run and interpret all of the calculations. By beginning with such a data-centric approach, firms can ensure that they are ready to meet massive potential challenges around aspects such as time-series cleansing, instrument lineage and single identifiers, to name but a few. And they might be pleasantly surprised by the benefits that fall out of the right FRTB strategy.
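As an illustration of what that groundwork looks like in practice, the sketch below flags gaps and stale values in a price history, a basic building block of time-series cleansing. The function name and thresholds are hypothetical, chosen for the example rather than drawn from any particular vendor system or rule.

```python
from datetime import date, timedelta

def flag_time_series_issues(series, max_gap_days=5, stale_run=5):
    """Flag gaps and stale (unchanged) prices in a date-sorted (date, price) series.

    The thresholds are hypothetical defaults for illustration, not values
    prescribed by FRTB or any vendor.
    """
    issues = []
    run = 1
    for (d_prev, p_prev), (d_curr, p_curr) in zip(series, series[1:]):
        # Gap check: too many calendar days without an observation.
        if (d_curr - d_prev).days > max_gap_days:
            issues.append((d_curr, "gap", (d_curr - d_prev).days))
        # Staleness check: a long run of identical prices often means a dead feed.
        run = run + 1 if p_curr == p_prev else 1
        if run == stale_run:
            issues.append((d_curr, "stale", run))
    return issues

# Example: a week of unchanged prices is flagged as stale.
history = [(date(2018, 6, 1) + timedelta(days=i), 101.5) for i in range(7)]
print(flag_time_series_issues(history))  # [(datetime.date(2018, 6, 5), 'stale', 5)]
```

Flags like these feed the modellability test sketched earlier, as well as valuation and reporting processes demanded elsewhere, which is why fixing the data layer once pays off across regulations.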

That is, if you get your data strategy right for FRTB, you will automatically address the data requirements of many other regulations, such as BCBS 239, Prudential Valuations and CCAR. This is a massive opportunity for firms to evaluate their entire data infrastructure and take a broader approach to regulation, rather than addressing different directives in silos.

As with any new regulation, the temptation with FRTB is for banks to focus largely on the aspects that are completely new and unknown. This is why the conversation around data pools as a solution to non-modellable risk factors has become so prominent. Firms that put too much time and resource into addressing this one single aspect could be missing a trick. In many ways, FRTB is a catalyst for compliance teams to take a step back, take stock, and put together a comprehensive data strategy that protects them against multiple regulatory requirements, and future-proofs them for years to come.

