
FSA’s Lawrence Warns Firms of the Importance of Accurate Funds Transfer Pricing for Risk Management

Enterprise risk management is at the heart of the funds transfer pricing (FTP) challenge, explained Colin Lawrence, director of the Prudential Risk Division of the UK Financial Services Authority (FSA), to attendees at the JWG FTP event in London earlier this week. All risk types are closely linked and must be treated as such in order to properly evaluate FTP, which is the mechanism by which a firm signals the costs, benefits and risks of liquidity to a particular business line.

FTP is essentially an internal measurement and allocation process via which a firm assigns a profit contribution value to funds it has gathered and lent or invested. It is also a key focus of the FSA’s liquidity risk regime, under which firms must hold liquidity buffers and regularly stress test these buffers. The FTP measurement is also used to determine an instrument’s valuation, so that liquidity costs are quantified for individual securities. As noted by Lawrence: “Firms need to be able to price liquidity risk correctly in order to stay in business.”
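
To make the mechanism concrete, here is a minimal sketch of a matched-maturity transfer pricing calculation in Python. The funding curve, liquidity premiums and positions are illustrative assumptions rather than any FSA-prescribed figures: desks that use funds are charged a tenor-specific rate plus a liquidity premium, while desks that gather funds are credited.

```python
# Minimal sketch of a matched-maturity FTP charge/credit calculation.
# Curve values, premiums and positions are illustrative assumptions, not FSA figures.

FUNDING_CURVE = {0.25: 0.010, 1: 0.015, 3: 0.022, 5: 0.028}      # tenor (years) -> base rate
LIQUIDITY_PREMIUM = {0.25: 0.001, 1: 0.002, 3: 0.004, 5: 0.006}  # tenor (years) -> premium

def transfer_rate(tenor_years: float) -> float:
    """Transfer price = matched-maturity base rate + liquidity premium."""
    nearest = min(FUNDING_CURVE, key=lambda t: abs(t - tenor_years))
    return FUNDING_CURVE[nearest] + LIQUIDITY_PREMIUM[nearest]

def ftp_contribution(notional: float, tenor_years: float, side: str) -> float:
    """Annual FTP credit (positive) for fund gatherers, charge (negative) for fund users."""
    rate = transfer_rate(tenor_years)
    return notional * rate if side == "liability" else -notional * rate

# A lending desk funding a 3-year asset is charged; a deposit desk raising 1-year funds is credited.
print(ftp_contribution(10_000_000, 3, "asset"))      # -260000.0
print(ftp_contribution(10_000_000, 1, "liability"))  # 170000.0
```

In practice the curve would be derived from the firm's own marginal cost of funding, but the charge/credit symmetry is the essence of the allocation process.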

He also indicated that the FSA has recently brought on board a host of new, industry-sourced hires (bringing the total to around 250) in order to be able to accurately evaluate firms’ risk management practices. Lawrence himself was appointed back in April 2008, before which he worked for IBM in its Risk Enterprise business. He has also previously worked on the banking side of the fence at Republic National Bank, Barclays and UBS, in roles that included the risk management function.

At the moment, the industry has a long way to go to get liquidity risk management and FTP right, according to Lawrence. “If capital is the heart of your business, then liquidity is the supply of blood,” he said, indicating the seriousness of the issue. The benefit of including FTP is that it aligns business line behaviour with a firm’s risk appetite and strategic objectives, he added.

Lawrence’s comments come as no surprise, given that earlier this year a survey by the Professional Risk Managers’ International Association (PRMIA) indicated that many firms have yet to put in place adequate procedures and systems to cope with liquidity risk stress testing. The FSA has set a 14 December deadline by which firms must have made the necessary changes to their quantitative and qualitative risk management systems in order to be able to conduct stress testing.
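
As a rough illustration of what such stress testing involves, the sketch below compares assumed stressed outflows against a liquid asset buffer over a short horizon. All balances and outflow rates are hypothetical, and far simpler than the scenarios the FSA regime actually requires.

```python
# Illustrative liquidity stress test sketch; balances, outflow rates and the buffer
# are hypothetical assumptions, not the FSA's prescribed scenarios.

funding_sources = {                      # balance, assumed stressed outflow rate
    "retail_deposits":     (500e6, 0.05),
    "corporate_deposits":  (300e6, 0.20),
    "wholesale_unsecured": (200e6, 1.00),
}
liquid_asset_buffer = 250e6              # assumed holdings of buffer-eligible assets

stressed_outflow = sum(balance * rate for balance, rate in funding_sources.values())
shortfall = max(stressed_outflow - liquid_asset_buffer, 0)

print(f"Stressed outflow: {stressed_outflow:,.0f}")  # 285,000,000
print(f"Buffer shortfall: {shortfall:,.0f}")         # 35,000,000
```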

The correct calculation of FTP could also potentially result in a reduction of the liquid asset buffer that financial institutions are required to hold. The consequence of poor FTP, by contrast, is the misallocation of liquidity resources, potentially resulting in lopsided balance sheet growth, excess off balance sheet exposures, the misalignment of risk and reward, and poor interest rate and FX hedging, warned Lawrence. He also cautioned that FTP should not be calculated as an average; rather, it should be calculated for each individual security.
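
The sketch below illustrates why averaging matters, using an assumed liquidity-adjusted curve and two hypothetical positions: pricing both at a pooled average overcharges the short-dated, liquid security and subsidises the long-dated, less liquid one.

```python
# Sketch contrasting a pooled-average FTP rate with per-security pricing.
# The liquidity-adjusted curve and positions are illustrative assumptions only.

securities = [
    {"id": "3M_CP",   "notional": 100e6, "tenor": 0.25},  # short-dated, liquid
    {"id": "5Y_CORP", "notional": 100e6, "tenor": 5.0},   # long-dated, less liquid
]

def per_security_rate(tenor: float) -> float:
    # Hypothetical curve: longer, less liquid positions cost more to fund.
    return 0.01 + 0.004 * tenor

average_rate = sum(per_security_rate(s["tenor"]) for s in securities) / len(securities)

for s in securities:
    true_cost = s["notional"] * per_security_rate(s["tenor"])
    averaged_cost = s["notional"] * average_rate
    print(s["id"], f"mispricing vs true cost: {averaged_cost - true_cost:+,.0f}")
# 3M_CP   +950,000  (overcharged by the average)
# 5Y_CORP -950,000  (subsidised by the average)
```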

Lawrence stressed the need for counterparty risk measurements to feed into the FTP process, along with credit and market risk calculations. “All risk types are inextricably linked and therefore should not be considered in isolation,” he said. “All of this also obviously needs proper methodology, systems and governance.”
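
One way to picture that linkage is a transfer rate built up from several risk components rather than funding cost alone. The decomposition below is purely illustrative, with assumed spread values, and is not a prescribed methodology.

```python
# Purely illustrative decomposition of an all-in transfer rate; the spread
# values are assumptions and the function is not a prescribed methodology.

def all_in_transfer_rate(base_rate: float,
                         liquidity_premium: float,
                         credit_spread: float,
                         counterparty_charge: float) -> float:
    """Combine funding, liquidity, credit and counterparty components so that
    no single risk type is priced in isolation."""
    return base_rate + liquidity_premium + credit_spread + counterparty_charge

# 2.0% base funding + 40bp liquidity + 60bp credit + 15bp counterparty
print(all_in_transfer_rate(0.020, 0.004, 0.006, 0.0015))  # ~0.0315
```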

So it seems that the FSA is aware of the silo problem within most financial institutions but is very much looking for the industry to respond with a more holistic approach to risk management. All of this is underpinned by the need for data to be dealt with in an integrated fashion to support the risk calculation methodologies, which must, in turn, feed into the valuation of individual instruments.

Lawrence concluded by speaking about the need for an “optimal granularity of data”, which involves firms being able to aggregate the relevant data from across their businesses but also to drill down into that data in order to provide transparency to regulators and the business. Layering endless data warehouses on top of each other is not enough to solve the problem; intelligent use of data flows and architecture is needed to meet this challenge.
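
A simple way to picture that “optimal granularity” is to keep instrument-level records that can be rolled up for reporting and drilled back down on demand. The sketch below uses made-up divisions, desks, instruments and FTP costs to show both directions.

```python
# Sketch of "optimal granularity": instrument-level records that can be aggregated
# upwards for reporting and drilled back down for transparency.
# Divisions, desks, instruments and FTP costs are made-up illustrations.
from collections import defaultdict

positions = [
    {"division": "Markets", "desk": "Rates",   "instrument": "GILT_2030", "ftp_cost": 1.2e6},
    {"division": "Markets", "desk": "Credit",  "instrument": "CORP_5Y",   "ftp_cost": 2.3e6},
    {"division": "Banking", "desk": "Lending", "instrument": "LOAN_0042", "ftp_cost": 0.8e6},
]

# Aggregate upwards: a division-level view for the regulator or the board.
by_division = defaultdict(float)
for p in positions:
    by_division[p["division"]] += p["ftp_cost"]
print(dict(by_division))  # {'Markets': 3500000.0, 'Banking': 800000.0}

# Drill back down: which instruments make up the Markets figure.
print([p["instrument"] for p in positions if p["division"] == "Markets"])  # ['GILT_2030', 'CORP_5Y']
```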
