About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

FSA’s Lawrence Warns Firms of the Importance of Accurate Funds Transfer Pricing for Risk Management


Enterprise risk management is at the heart of the funds transfer pricing (FTP) challenge, explained Colin Lawrence, director of the Prudential Risk Division of the UK Financial Services Authority (FSA), to attendees at the JWG FTP event in London earlier this week. All risk types are closely linked and must be treated as such in order to properly evaluate FTP, which is the mechanism by which a firm attributes the costs, benefits and risks of liquidity to a particular business line.

FTP is essentially an internal measurement and allocation process via which a firm assigns a profit contribution value to funds it has gathered and lent or invested. It is also a key focus of the FSA’s liquidity risk regime, under which firms must hold liquidity buffers and regularly stress test these buffers. The FTP measurement is also used to determine an instrument’s valuation, so that liquidity costs are quantified for individual securities. As noted by Lawrence: “Firms need to be able to price liquidity risk correctly in order to stay in business.”
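The allocation logic described above can be sketched in a few lines of code. This is a minimal, purely illustrative example, not the FSA's or any firm's actual methodology; the function names and rates are invented for the purpose of showing how an internal transfer rate turns a gross asset yield into a business line's profit contribution.

```python
# Illustrative sketch of a funds transfer pricing (FTP) charge.
# All rates and names are hypothetical, not FSA methodology.

def ftp_rate(matched_funding_rate: float, liquidity_premium: float) -> float:
    """Internal transfer rate charged to a business line for the funds it uses."""
    return matched_funding_rate + liquidity_premium

def business_line_margin(asset_yield: float, transfer_rate: float) -> float:
    """Profit contribution left after the internal cost of funds is applied."""
    return asset_yield - transfer_rate

# A loan yielding 6.0%, funded internally at a 4.0% matched rate
# plus a 0.5% liquidity premium, leaves a 1.5% net margin.
rate = ftp_rate(0.040, 0.005)
margin = business_line_margin(0.060, rate)
print(round(rate, 4), round(margin, 4))
```

The point of the sketch is that the liquidity premium is an explicit input: if it is priced at zero, the business line's apparent margin is overstated by exactly the cost of the liquidity it consumes.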

He also indicated that the FSA has recently brought on board a host of new, industry-sourced hires (bringing the total to around 250) in order to be able to accurately evaluate firms’ risk management practices. Lawrence himself was appointed back in April 2008, before which he worked for IBM in its Risk Enterprise business. He has also previously worked on the banking side of the fence at Republic National Bank, Barclays and UBS, including roles in the risk management function.

At the moment, the industry has a long way to go to get liquidity risk management and FTP right, according to Lawrence. “If capital is the heart of your business, then liquidity is the supply of blood,” he said, indicating the seriousness of the issue. The benefit of including FTP is that it aligns business line behaviour with a firm’s risk appetite and strategic objectives, he added.

Lawrence’s comments come as no surprise, given that earlier this year a survey by the Professional Risk Managers’ International Association (PRMIA) indicated that many firms have yet to put in place adequate procedures and systems to cope with liquidity risk stress testing. The FSA has set a 14 December deadline by which firms must have made the necessary changes to their quantitative and qualitative risk management systems in order to be able to conduct stress testing.

The correct calculation of FTP could also potentially reduce the liquid asset buffer that financial institutions are required to hold. The consequence of poor FTP, by contrast, is the misallocation of liquidity resources, potentially resulting in lopsided balance sheet growth, excess off-balance-sheet exposures, the misalignment of risk and reward, and poor interest rate and FX hedging, warned Lawrence. He also cautioned that FTP should not be calculated as an average; rather, it should be computed for each individual security.
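Lawrence's caution against averaging can be made concrete with a small, hypothetical example. The security names and liquidity spreads below are invented for illustration; the point is simply that a portfolio-average charge undercharges illiquid holdings and overcharges liquid ones.

```python
# Hypothetical sketch: averaged liquidity charge vs per-security charges.
# Names, notionals and basis-point spreads are invented for illustration.

securities = {
    "liquid_gilt":  {"notional": 100.0, "liquidity_bps": 5},
    "illiquid_abs": {"notional": 100.0, "liquidity_bps": 80},
}

# An averaged charge hides the spread between liquid and illiquid holdings.
avg_bps = sum(s["liquidity_bps"] for s in securities.values()) / len(securities)

for name, s in securities.items():
    per_security_cost = s["notional"] * s["liquidity_bps"] / 10_000
    averaged_cost = s["notional"] * avg_bps / 10_000
    print(name, per_security_cost, averaged_cost)
```

Under the averaged charge both positions pay the same liquidity cost, so the desk holding the illiquid security is subsidised by the desk holding the liquid one, which is exactly the risk/reward misalignment the article describes.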

Lawrence stressed the need for counterparty risk measurements to feed into the FTP process, along with credit and market risk calculations. “All risk types are inextricably linked and therefore should not be considered in isolation,” he said. “All of this also obviously needs proper methodology, systems and governance.”

So, it seems that the FSA is aware of the silo problem within most financial institutions but is very much looking for the industry to respond with a more holistic approach to risk management. All of this is underpinned with the need for data to be dealt with in an integrated fashion to support the risk calculation methodologies, which must, in turn, feed into the valuation of individual instruments.

Lawrence concluded by speaking about the need for an “optimal granularity of data”, which involves firms being able to aggregate the relevant data from across their businesses but also to drill down into that data in order to provide transparency to regulators and the business. Endless data warehouses stacked on top of each other are not enough to solve the problem; intelligent use of data flows and architecture is needed to meet this challenge.
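The "aggregate up, drill down" idea can be sketched as follows. This is a toy illustration with invented business lines and exposures, not a description of any firm's data architecture: the key property is that records are kept at security level, so firm-wide totals and security-level detail both come from the same data.

```python
# Sketch of "optimal granularity": keep records at security level so
# totals can be rolled up and then drilled back down. Business lines
# and figures are invented for illustration.
from collections import defaultdict

records = [
    {"business": "rates",  "security": "gilt_2030",   "exposure": 120.0},
    {"business": "rates",  "security": "bund_2028",   "exposure": 80.0},
    {"business": "credit", "security": "corp_bond_a", "exposure": 50.0},
]

def aggregate(records, key):
    """Roll security-level records up to totals by the given key."""
    totals = defaultdict(float)
    for r in records:
        totals[r[key]] += r["exposure"]
    return dict(totals)

firm_view = aggregate(records, "business")                      # roll up
drill_down = [r for r in records if r["business"] == "rates"]   # drill in
print(firm_view, len(drill_down))
```

Because the granular records survive aggregation, a regulator's question about a firm-wide number can be answered by drilling back to the individual securities behind it, rather than by reconciling separate warehouses.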
