
VIRTUAL ROUNDTABLE: Fundamental Review of the Trading Book (FRTB) – The when, the why and the how…?

With implementation just around the corner, FRTB should be at the top of every bank’s agenda. In advance of A-Team Group’s hotly anticipated Breakfast Briefing on May 14, we gathered together a group of experts to discuss the essential issues and top priorities that firms should be focusing on in the run-up to D-Day…

FRTB is one of the most complicated regime changes to hit financial regulation in decades, and putting in place the right strategy, the right systems and the right software will be crucial for institutions to avoid punitive capital charges and retain competitive advantage. But with huge volumes of transaction and pricing data to process, new methods of calculation to integrate and far greater levels of granularity to achieve, time is running out – and firms that have not yet started may already be behind.

So what are the secrets to successful strategic implementation? A-Team Group is delighted to present a selection of unique insights from five leading FRTB technology solutions providers, discussing the pitfalls, problems, pressures and priorities firms face in the run-up to January 2022.

Tim Lind, Managing Director, DTCC Data Services

Ian Kessell, Managing Director (Australia), KRM22

Martijn Groot, Vice-President of Marketing & Strategy, Asset Control

Vuk Magdelinic, CEO, Overbond

Chris Mureen, Chief Operating Officer, McObject

What are the top priorities to hit the FRTB implementation timeframe? Where should firms be by now and at what stage should they start to worry?

Martijn: The top priorities to make the January 2022 deadline (with enough time to parallel-run before it) are to have the data collection, aggregation, mastering and risk calculation capabilities in place. This covers data availability aspects such as sufficient historical data and real-price observations, process aspects such as the ability to show data lineage and consistent market data preparation processes, and the computational power to run scenarios and calculate the different Expected Shortfall metrics. By now firms should have a clear picture of the target organisational structure, the data governance aspects and the target data flows. Firms should worry if they have not completed this picture or if they lack fundamental infrastructure capabilities to deal with FRTB data volumes and back-testing requirements – for example, if they have multiple market data sourcing and calculation streams that effectively preclude a consistent comparison between front and middle office.

Tim: The immediate priorities the industry should focus on are: 1) completion of banks’ risk management infrastructure and implementation of their targeted approach, including the tactical tools to address business case questions related to investments in the Internal Model Approach (IMA) and to participate in ongoing industry Quantitative Impact Study (QIS) activities; 2) finalization of banks’ portfolio analysis, determining which trading desks will go with the Standard Approach (SA) and which with IMA; 3) completion by Market Risk and Data Management teams within the banks of a data sourcing strategy covering data sourced internally as well as externally. Any material delays in completing the above actions could create cascading impacts. What is not abundantly clear from our view, as we approach the go-live compliance date, is whether regulators will have sufficient time to review all the banks’ applications for IMA. Will banks submit for approval well in advance, or will it be an “FRTB Black Friday” with a last-minute rush? If banks are not making aggressive attempts to progress their programs today, with the goal of being first in the queue for approvals, they could be at risk later if their initial IMA applications are rejected and they need to re-apply.

Ian: All banks need to have the Standardised Approach (SA) for calculating risk-weighted assets for market risk, as outlined by the revised FRTB rules, ready by 1 January 2022. Banks intending to use only the SA should already be looking at software options to ensure they are prepared to comply with the new requirements. Firms intending to use the FRTB Internal Model Approach (IMA) need to have their projects running by now. At the very least they should have already undertaken a review of each trading desk to understand the financial impact of FRTB and the considerations for any associated restructure. Although the deadline is almost three years away, banks are required to demonstrate a full year of system performance and its results in order to be accredited. This means that their respective FRTB solutions need to be implemented by mid-2020 at the very latest in order to allow sufficient time for this mandatory approval process. FRTB is a significant burden for most banks, spanning business, operations and technology functions, so critically assessing resourcing needs is vital. Banks will be seeking resources from a limited talent pool and in competition with other firms. Designated project managers should be in place by now to ensure the internal infrastructure, data and processes are being evaluated. In addition, ensuring that banks have selected and secured a commitment from their preferred third-party provider is also key, bearing in mind that vendors will face pressure to deliver solutions to multiple banks in parallel and may therefore need to be selective with regards to their partnerships. At this stage banks should have a preferred vendor, a resource team and the allocated project budget in order to be in a position to quickly progress the implementation.

Chris: Clean, consistent data – prices, bids, asks and so on – across the entire trading book is vitally important, and firms should be worried now if they have not yet put this in place.

Vuk: Smaller buy-side firms need to worry immediately, as regulators are not looking to extend the deadline again. The fixed income side is also much harder to report on, as liquidity flags are problematic for the illiquid corporate bond marketplace.

What should firms be doing to address data management requirements around elements such as Non-Modellable Risk Factors, Profit and Loss Attribution Tests, Risk Factor Eligibility Tests and Expected Shortfall?

Tim: A successful data acquisition and management strategy needs to be a primary focus for senior executives in order to achieve the best optimization of capital. Banks must be able to source reliable, granular and high-quality data. Without such a focus on data management, banks’ ability to understand their capital requirements will become blurry, resulting in less than optimal balance sheet performance. In addition, regulators will want to observe good controls and management of bank data governance, and will require banks to provide traceability between their models and the data inputs into them. Banks should consider their external data strategy as well. DTCC has been advising our FRTB program pilot members that their third-party data sourcing strategy should be complete by the end of Q4 2019, allowing them to leverage comprehensive real price observations beyond their own data to understand the IMA impacts.

Martijn: Firms should prioritize taking stock of the data they currently have available, both externally sourced and their own trade data, to ascertain where the gaps are. For the market data RFET framework this means extracting risk factors, linking real prices to risk factors across sources and periodically evaluating modellability. This requires a clear process for data mapping between instrument level and risk factor level. Integration capabilities to aggregate the different trade-level data sets that are available on the market will also be very helpful. Cross-referencing to standard taxonomies and analytical capabilities such as risk factor decomposition are also required. For PLA, the focus is not so much having enough granular data as consistency in the data management process when there are multiple streams of capture (snap times), sources (market makers, brokers, carriers) and calculation methods (curve, surface and other risk factor calculation parameters). Discrepancies can easily lead to issues in reconciling hypothetical P&L with risk-theoretical P&L.
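
To make the modellability evaluation Martijn describes more concrete, the sketch below counts real-price observations per risk factor and applies the commonly cited RFET thresholds (at least 24 observations over the past year, with no 90-day window containing fewer than four). The thresholds, function names and data layout are illustrative assumptions, not a description of any vendor’s implementation or of the final regulatory text.

```python
from datetime import date, timedelta

# Illustrative RFET-style modellability check: a risk factor is treated as
# modellable if it has at least 24 real-price observations in the past year
# and every 90-day window contains at least 4 observations. Thresholds and
# data layout are assumptions for illustration only.
def is_modellable(obs_dates, as_of, min_obs=24, window_days=90, min_in_window=4):
    year_ago = as_of - timedelta(days=365)
    dates = sorted(d for d in set(obs_dates) if year_ago <= d <= as_of)
    if len(dates) < min_obs:
        return False
    # Slide a 90-day window across the year and count observations in each.
    start = year_ago
    while start + timedelta(days=window_days) <= as_of:
        end = start + timedelta(days=window_days)
        if sum(1 for d in dates if start <= d < end) < min_in_window:
            return False
        start += timedelta(days=1)
    return True

# Example: roughly weekly observations pass, quarterly observations fail.
as_of = date(2021, 12, 31)
weekly = [as_of - timedelta(days=7 * i) for i in range(52)]
quarterly = [as_of - timedelta(days=90 * i) for i in range(4)]
print(is_modellable(weekly, as_of))     # True
print(is_modellable(quarterly, as_of))  # False
```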

Ian: Banks should be analysing their existing market data quality and ensuring there is a golden source for input into their FRTB solution. Where there are identifiable gaps, firms should be actively seeking market data vendors’ assistance to deliver the data required for NMRFs and the RFET. For the P&L attribution tests, the requirements have been modified and the impact reduced in the final draft of the requirements. Nevertheless, firms must have an extensive understanding of their P&L processes and the data input needed to generate hypothetical P&L for feeding into their FRTB IMA system. Expected shortfall is a variation of Value-at-Risk (VaR), which should be familiar to firms. The IMA requires multiple time horizons, multiple attributions across asset classes and the inclusion of stress periods for these simulations. Together these multiply typical current system runs by up to 50 times, meaning that legacy VaR systems are unlikely to meet the operational and reporting requirements in a timely manner.
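
As a rough illustration of why the run counts multiply, the sketch below computes a 97.5% expected shortfall from a vector of simulated P&L and applies the square-root scaling across liquidity horizon buckets used under the IMA. The horizons (10, 20, 40, 60 and 120 days) follow the published FRTB text, but the functions and bucket inputs are simplified assumptions rather than a full implementation.

```python
import numpy as np

def expected_shortfall(pnl, alpha=0.975):
    """Average loss in the worst (1 - alpha) tail of a simulated P&L vector."""
    losses = -np.sort(np.asarray(pnl))            # losses, largest first
    n_tail = max(1, int(np.ceil((1 - alpha) * len(losses))))
    return losses[:n_tail].mean()

def liquidity_adjusted_es(es_base, es_buckets, horizons=(10, 20, 40, 60, 120)):
    """Combine the base 10-day ES with longer-horizon bucket ES figures,
    each scaled by sqrt((LH_j - LH_{j-1}) / 10), per the IMA aggregation."""
    total = es_base ** 2
    for j, es_j in enumerate(es_buckets, start=1):
        total += (es_j * np.sqrt((horizons[j] - horizons[j - 1]) / 10.0)) ** 2
    return np.sqrt(total)

# Example with simulated P&L and placeholder bucket ES numbers.
rng = np.random.default_rng(0)
pnl = rng.normal(0.0, 1_000_000, size=500)        # 500 scenario P&Ls
es_10d = expected_shortfall(pnl)
print(liquidity_adjusted_es(es_10d, es_buckets=[0.8 * es_10d, 0.5 * es_10d,
                                                0.3 * es_10d, 0.2 * es_10d]))
```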

Vuk: We have worked on solutions where several financial institutions pooled data to arrive at an external proxy, which eases the requirement for comprehensive non-modellable risk factor analysis in-house. We are a big proponent of a collaborative approach among peer-group financial institutions struggling with FRTB requirements.

How realistic (and how costly) will it be for companies to attempt the Internal Model Approach (IMA) and what should they be doing by now in order to achieve this?

Chris: Only the largest banks have the resources to achieve this, and if they are attempting it, they should have resolved the data issues by now and be testing the models.

Ian: For medium to large firms there are FRTB cloud solutions that can help accelerate the implementation and meet the performance requirements for an FRTB IMA model, even at this stage. Firms using internal systems need to have already selected, and be in the process of implementing, their chosen solution to support IMA. Alternatively, firms can adopt new technology and leverage cloud-based software and market data services, in combination with trade and reference data extraction from bank data warehouses, to create an IMA solution. Cloud solutions allow for easy scaling of hardware and much better transparency and accessibility of results. Bypassing internal IT hardware and administration is likely to reduce the time required to implement an FRTB solution, helping firms comply with the rules in a timely manner.

Martijn: Adopting the IMA should be a business decision weighing the capital and possibly reputational benefits against the overhead of producing the internal model numbers. Note, though, that adopting the Internal Model Approach is not a binary decision; it can be made on a desk-by-desk basis. Banks can opt for IMA for their focal asset classes while using the standardised approach where they have less differentiation. Firms should have done capital impact analysis and made a choice about desk organization and where (if anywhere) to opt for IMA. By now, banks should have a clear picture of the suitability of their current market data management and risk calculation infrastructure. We often see gaps in capabilities when it comes to data availability, ease of data access, performance, and streamlined and consistent market data and risk factor preparation processes. Tied to that, firms often need to improve their capabilities in data lineage, audit, and managing and tracking change in business logic such as proxies. Often firms have a scattered infrastructure with different sourcing and data derivation points, leading to a brittle process that is costly to maintain and satisfies neither regulators nor the risk and reporting function.

Tim: The capital impact of operationalizing the FRTB IMA will vary significantly from bank to bank and even at the internal desk level based on trading strategies and portfolio composition. We expect all large global banks to pursue IMA and most of the large regional players as well, especially for asset classes where they have a big presence. Given the estimates of how FRTB will increase capital allocations and the impact on the P&L for individual trading desks, most banks are concluding that IMA will be less costly than the capital outcomes if they only pursue SA.

What are the key non-data requirements for FRTB (e.g., infrastructure, personnel) and what should firms be doing to tackle them?

Martijn: Key non-data requirements include a flexible calculation infrastructure, often supplied through cloud-based computing capabilities. On top of that, firms should have well-trained staff and an effective procurement process that engages multiple stakeholders. With a growing range of sourcing options in data sets, data management technology and risk engines, and with banks looking to buy rather than build, good procurement that can create a best-of-breed infrastructure for data, market data management and exploration, risk engines and business user enablement will be a key differentiator.

Tim: We see an increase in activity across banks as they ramp up their internal program teams. This is a positive sign that they are taking implementation seriously. At this point, most banks have at least a targeted infrastructure plan. Additional complexities are introduced with banks that have gone through multiple mergers and/or trade exotic products that need to be managed by multiple proprietary risk management tools. Banks that have mature FRTB programs are making strategic decisions about how to better integrate front office, back office, and risk management infrastructure to ensure proper data lineage and controls are in place for consistent risk modelling. Given all the fragmentation in core infrastructure across trading desks and regions, FRTB has been a catalyst to modernize and integrate core systems that will have benefits beyond improved capital calibration under FRTB.

Ian: Banks will need significant personnel resources to undertake the analysis and testing, especially when opting to implement an IMA solution. However, even for SA there will be a need for a solid dedicated team comprising business and risk analysts, IT data management specialists and a knowledgeable FRTB team lead. For infrastructure which is located within the organisation’s own data centres, hardware needs to be specified and purchased. Alternatively, firms can utilise a cloud-based solution with the appropriate security and data integration.

Chris: Firms need solid, reliable systems with “instant” failover capabilities, and similarly reliable, high-performance networks across the entire organisation. They obviously need strong quant teams in the compliance organisation as well as in the trading organisation.

Where do the biggest uncertainties lie around FRTB, and what fundamental changes will firms have to make to ensure they have the firepower and infrastructure capabilities to meet these concerns (e.g. processing power, storage, risk architecture)?

Ian: FRTB requirements are a big challenge for firms, as they will need to change their mind-set around how they measure risk and manage risk infrastructure. At the mandatory level, the SA requires significant calculations of sensitivities, a complex aggregation process and a significant amount of supplementary reference data for the Default Risk Charge to ensure correct risk weights are applied. To meet the calculation processing requirements, firms will need to ensure that hardware is sized correctly, including processing power, storage and disaster recovery support. FRTB requires a solid risk architecture to ensure the appropriate data is supplied into the system and that the results of the calculations can be disseminated to internal reporting systems and into the appropriate external regulatory reporting process. These problems can be better managed by moving the FRTB solution to the cloud, where hardware issues are easier to manage and can be supported through on-demand infrastructure. It is critically important to identify and select a cloud solution that has been architected to leverage this on-demand infrastructure and deliver the system performance required by the desired operational model.
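
For readers unfamiliar with the aggregation Ian refers to, the sketch below shows the general shape of the sensitivities-based delta aggregation under the standardised approach: weighted sensitivities are netted within a bucket using intra-bucket correlations, and bucket-level charges are then combined using inter-bucket correlations. The risk weights, correlation values and bucket inputs here are placeholders for illustration, not the prescribed regulatory parameters.

```python
import numpy as np

def bucket_charge(weighted_sens, rho):
    """K_b = sqrt(max(0, WS' * rho * WS)) for one bucket, where rho is the
    intra-bucket correlation matrix (ones on the diagonal)."""
    ws = np.asarray(weighted_sens, dtype=float)
    return np.sqrt(max(0.0, ws @ rho @ ws))

def delta_risk_charge(bucket_charges, bucket_sums, gamma):
    """Combine bucket charges K_b using a flat inter-bucket correlation gamma."""
    k = np.asarray(bucket_charges, dtype=float)
    s = np.asarray(bucket_sums, dtype=float)
    cross = sum(gamma * s[b] * s[c]
                for b in range(len(s)) for c in range(len(s)) if b != c)
    return np.sqrt(max(0.0, np.sum(k ** 2) + cross))

# Placeholder example: two buckets of weighted sensitivities.
rho = np.array([[1.0, 0.5], [0.5, 1.0]])          # illustrative intra-bucket correlation
buckets = [np.array([120.0, -40.0]), np.array([80.0, 30.0])]
k_b = [bucket_charge(ws, rho) for ws in buckets]
s_b = [ws.sum() for ws in buckets]
print(delta_risk_charge(k_b, s_b, gamma=0.25))    # illustrative inter-bucket correlation
```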

Martijn: There are some uncertainties around the pace of the legislative processes in the different jurisdictions following the final Basel text earlier this year. The fundamental changes firms have to make centre around infrastructural capabilities in sourcing, aggregating and preparing market data and position data in a consistent, performant and traceable manner. Initiatives running in parallel, such as the ECB’s TRIM programme, together with improvements around making the most of external market data and internal alignment, make for a full change agenda. Firms that look at the overlap and plan for scale and easy access to consistent data for traded risk, stress testing, modelling and IPV are best prepared for the future.

Tim: The biggest uncertainties within the operationalization of banks’ FRTB programs are concentrated around cohesion of risk management tools and using common and consistent data sources. Firms will also need to look at their data storage capabilities. Fortunately, with the emergence of cloud storage and infrastructure, the industry is better positioned to leverage such technologies while benefiting from economy of scale pricing. Trying to implement such an initiative even 5 years ago would have been a great challenge without the maturity of cloud technologies.

Vuk: The biggest concern is the manual reconciliation required to arrive at the liquidity and risk factor decomposition when no external proxy exists. It is, however, possible to solve this problem by applying robust risk-decomposition algorithms.

Chris: Firms will need much more processing power, and high-performance databases across the board, i.e., trading and compliance/risk functions. They will need to be able to process the huge increases in data volumes which will need to be stored and analysed. While risk is currently calculated at entity level, one of the stipulations of FRTB is for it to be calculated at desk level, resulting in an unprecedented amount of data, analysis and reporting, estimated to be up to 10x higher than it is at present.

How can software help with the above needs (both data and non-data based), and where will software have the biggest impact in helping firms meet FRTB requirements?

Martijn: FRTB can serve as a catalyst for more streamlined data collection and integration processes. The main point of any regulation should not necessarily be the creation of a specific report or the adoption of any specific scenario or metric. Instead, firms should have the capabilities to assess their risk in a granular way from different perspectives: liquidity horizon, concentration, counterparty (type), asset class and so on. This will not only reassure investors and regulators, it should also enable the business itself. Software can help firms with data sourcing and mastering, and with providing the risk system, as well as other users, with the required market and reference data sets in a cost-effective and timely manner. NoSQL database technologies can scale to handle large market data, risk factor and scenario sets. Cloud-based data integration and computation can be effective for dealing with peak capacity and can shield firms from a lot of small change in onboarding new instruments or risk factors, tracking vendor data feeds and implementing new business rules.

Tim: Depending on the specific IT needs of each bank, the ROI of implementing IMA and the complexity of their portfolios, banks’ need for third-party software-as-a-service (SaaS) to manage SA or IMA will vary. Banks that are going forward with IMA will need to source third-party data to ensure they have a comprehensive view of modellability. This will require banks to consolidate their own market data along with third-party data to create a centralized data lake of information.

Ian: The software will need to be designed to meet the demanding operational and reporting requirements. It should be a priority for banks to explore software, whether from a vendor or built internally, that can be easily scaled through hardware or, better still, through on-demand cloud infrastructure. Some legacy vendors are simply “lifting and shifting” their software onto the cloud, which leaves the same old performance issues and the ongoing difficulty of managing upgrades and new regulatory requirements. Software that has been written to be cloud-native and optimised for performance using high-performance computing (HPC) is a much safer investment.

Chris: High-performance time series database systems will make a significant impact and will provide the processing power required for the FRTB-related increase in data volumes. Good, consistent, reliable data management across the organisation will be vital.

What are the pitfalls of introducing new solutions in such a short timeframe?

Martijn: Merely throwing tech at a problem is not going to do any good. As always, it pays to think carefully through the requirements, not only from the regulatory methodology side (this should be in the standard offerings from solution providers) but also from the perspective of how new products and services are going to fit into a bank’s existing infrastructure. It is best to look for the right mix of enabling technologies and people with a track record of successfully implementing and delivering market and risk data management solutions.

Tim: The industry as a whole needs to work in an environment that is complex and multi-faceted. The FRTB framework has been extremely complex and has involved a number of fundamental changes in approach to risk factor modeling, among other areas. There are many different, parallel work-streams, and banks’ program leads will need to manage each of them concurrently; if not, banks will not be ready for application approval. The main pitfall has been managing many unknowns in terms of how the rules will translate into capital outcomes, and uncertainty about how FRTB will be adopted in national capital regimes.

Ian: Any project, including a regulatory project with a short time-frame, carries a level of risk. There is nothing fundamentally different about FRTB: it is a business and IT project, although it has prescriptive regulatory calculations requiring a high level of computational complexity and scalable high-performance software, especially for IMA. In addition, the data requirements and inputs into these calculations are diverse, and sourcing data and meeting the RFET requirements could be challenging to do in-house.

Chris: The pitfalls are the same as they have been for the past 50 years: badly (hastily) written internal systems, or poorly understood purchased systems. Hastily implemented systems always hit operational problems.
