FSA’s Lawrence Warns Firms of the Importance of Accurate Funds Transfer Pricing for Risk Management

Enterprise risk management is at the heart of the funds transfer pricing (FTP) challenge, explained Colin Lawrence, director of the Prudential Risk Division of the UK Financial Services Authority (FSA), to attendees at the JWG FTP event in London earlier this week. All risk types are closely linked and must be treated as such in order to evaluate FTP properly. FTP is the mechanism by which a firm attributes the costs, benefits and risks of liquidity to a particular business line.

FTP is essentially an internal measurement and allocation process via which a firm assigns a profit contribution value to funds it has gathered and lent or invested. It is also a key focus of the FSA’s liquidity risk regime, under which firms must hold liquidity buffers and regularly stress test these buffers. The FTP measurement is also used to determine an instrument’s valuation, so that liquidity costs are quantified for individual securities. As noted by Lawrence: “Firms need to be able to price liquidity risk correctly in order to stay in business.”
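
As a rough illustration of the mechanics described above, the sketch below shows how a central treasury might charge a funds-using business and credit a funds-providing one at a matched-maturity rate plus a tenor-specific liquidity premium. The rates, notionals and function names are hypothetical, not the FSA's or any firm's actual methodology.

# Illustrative FTP charge/credit sketch. All figures and names are hypothetical.

def annual_ftp(notional, matched_maturity_rate, liquidity_premium):
    # Annual transfer price on a position: base funding cost plus the
    # liquidity premium appropriate to the position's tenor.
    return notional * (matched_maturity_rate + liquidity_premium)

# A lending desk drawing five-year funding from treasury is charged the
# matched-maturity rate plus the five-year liquidity premium...
loan_charge = annual_ftp(100_000_000, matched_maturity_rate=0.031, liquidity_premium=0.006)

# ...while a deposit-gathering business supplying stable funding is credited
# at the matched-maturity rate for the behavioural life of its deposits.
deposit_credit = annual_ftp(100_000_000, matched_maturity_rate=0.028, liquidity_premium=0.0)

print(f"Lending desk FTP charge:  {loan_charge:,.0f}")    # 3,700,000
print(f"Deposit book FTP credit:  {deposit_credit:,.0f}")  # 2,800,000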

He also indicated that the FSA has recently brought on board a whole host of new, industry-sourced hires (bringing the total to around 250) in order to be able to accurately evaluate firms’ risk management practices. Lawrence himself was appointed back in April 2008, before which he worked for IBM in its Risk Enterprise business. He has also worked on the banking side of the fence at Republic National Bank, Barclays and UBS, in roles including risk management.

At the moment, the industry has a long way to go to get liquidity risk management and FTP right, according to Lawrence. “If capital is the heart of your business, then liquidity is the supply of blood,” he said, indicating the seriousness of the issue. The benefit of a well-functioning FTP process is that it aligns business line behaviour with a firm’s risk appetite and strategic objectives, he added.

Lawrence’s comments come as no surprise, given that earlier this year a survey by the Professional Risk Managers’ International Association (PRMIA) indicated that many firms have yet to put in place adequate procedures and systems to cope with liquidity risk stress testing. The FSA has set a 14 December deadline by which firms must have made the necessary changes to their quantitative and qualitative risk management systems in order to be able to conduct stress testing.

The correct calculation of FTP could also potentially result in a reduction of the liquid asset buffer that financial institutions are required to hold. The consequence of poor FTP, on the other hand, is the misallocation of liquidity resources, potentially resulting in lopsided balance sheet growth, excess off-balance-sheet exposures, the misalignment of risk and reward, and poor interest rate and FX hedging, warned Lawrence. He also cautioned that FTP should not be calculated as an average; rather, it should be calculated for each individual security.
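
To make the averaging point concrete, the hypothetical sketch below compares a per-security liquidity charge with a single blended premium applied across a small book; the instruments and basis-point premia are invented purely for illustration.

# Hypothetical positions with tenor-specific liquidity premia in basis points.
positions = [
    {"name": "3m commercial paper", "notional": 50_000_000, "premium_bps": 10},
    {"name": "2y corporate bond",   "notional": 30_000_000, "premium_bps": 45},
    {"name": "10y illiquid asset",  "notional": 20_000_000, "premium_bps": 140},
]

# Per-security charge: each instrument bears its own premium.
per_security = {p["name"]: p["notional"] * p["premium_bps"] / 10_000 for p in positions}

# Averaged charge: one blended, notional-weighted premium applied to everything.
total = sum(p["notional"] for p in positions)
blended_bps = sum(p["notional"] * p["premium_bps"] for p in positions) / total
averaged = {p["name"]: p["notional"] * blended_bps / 10_000 for p in positions}

for name in per_security:
    print(f"{name:22s} per-security: {per_security[name]:>9,.0f}   averaged: {averaged[name]:>9,.0f}")

# The illiquid 10-year asset is undercharged under the average while the short
# commercial paper is overcharged: the misallocation of liquidity resources Lawrence warns about.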

Lawrence stressed the need for counterparty risk measurements to feed into the FTP process, along with credit and market risk calculations. “All risk types are inextricably linked and therefore should not be considered in isolation,” he said. “All of this also obviously needs proper methodology, systems and governance.”

So, it seems that the FSA is aware of the silo problem within most financial institutions but is very much looking to the industry to respond with a more holistic approach to risk management. All of this is underpinned by the need for data to be dealt with in an integrated fashion to support the risk calculation methodologies, which must, in turn, feed into the valuation of individual instruments.

Lawrence concluded by speaking about the need for an “optimal granularity of data”, which involves firms being able to aggregate the relevant data from across their businesses but also to drill down into that data in order to provide transparency to regulators and the business. Stacking endless data warehouses on top of each other is not enough to solve the problem; intelligent use of data flows and architecture is needed to meet this challenge.
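
The toy sketch below illustrates the aggregate-and-drill-down idea with a handful of made-up position records; the field names and figures are hypothetical and stand in for whatever position-level detail a firm actually holds.

from collections import defaultdict

# Made-up position-level records: the lowest level of granularity is retained
# so that any aggregated figure can be traced back to its constituents.
positions = [
    {"business": "Rates",  "desk": "Swaps", "instrument": "IRS-001", "liquidity_cost": 120_000},
    {"business": "Rates",  "desk": "Bonds", "instrument": "GILT-07", "liquidity_cost": 45_000},
    {"business": "Credit", "desk": "Flow",  "instrument": "CDS-993", "liquidity_cost": 310_000},
]

def aggregate(records, level):
    # Roll position-level liquidity costs up to the requested level ("business" or "desk").
    totals = defaultdict(float)
    for r in records:
        totals[r[level]] += r["liquidity_cost"]
    return dict(totals)

def drill_down(records, level, value):
    # Return the underlying positions behind an aggregated figure.
    return [r for r in records if r[level] == value]

print(aggregate(positions, "business"))             # firm-wide view for the regulator
print(drill_down(positions, "business", "Rates"))   # transparency into what drives the number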
