About a-team Marketing Services
The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

FSA’s Lawrence Warns Firms of the Importance of Accurate Funds Transfer Pricing for Risk Management


Enterprise risk management is at the heart of the funds transfer pricing (FTP) challenge, explained Colin Lawrence, director of the Prudential Risk Division of the UK Financial Services Authority (FSA), to attendees at the JWG FTP event in London earlier this week. All risk types are closely linked and must be treated as such in order to properly evaluate FTP, which is the mechanism by which a firm indicates the costs, benefits and risks of liquidity to a particular business line.

FTP is essentially an internal measurement and allocation process via which a firm assigns a profit contribution value to funds it has gathered and lent or invested. It is also a key focus of the FSA’s liquidity risk regime, under which firms must hold liquidity buffers and regularly stress test these buffers. The FTP measurement is also used to determine an instrument’s valuation, so that liquidity costs are quantified for individual securities. As noted by Lawrence: “Firms need to be able to price liquidity risk correctly in order to stay in business.”
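To make the mechanism concrete, here is a minimal sketch of how a matched-maturity FTP scheme assigns a profit contribution to lending and deposit-gathering units. All names, tenors and rates below are hypothetical illustrations, not figures from the article or from any FSA guidance.

```python
def ftp_rate(tenor_years, transfer_curve):
    """Look up the matched-maturity transfer rate for a given tenor
    on a hypothetical internal transfer-pricing curve (nearest tenor)."""
    tenor = min(transfer_curve, key=lambda t: abs(t - tenor_years))
    return transfer_curve[tenor]

# Hypothetical internal transfer-pricing curve: tenor (years) -> rate.
curve = {0.25: 0.010, 1.0: 0.015, 5.0: 0.025}

# A lending unit books a 5-year loan at 4%: its profit contribution is
# the loan rate minus the 5-year transfer rate it is "charged" for funds.
loan_rate = 0.04
lending_margin = loan_rate - ftp_rate(5.0, curve)

# A deposit-gathering unit raises 1-year funds at 1%: its contribution is
# the 1-year transfer rate it is "paid" minus the deposit rate.
deposit_rate = 0.01
funding_margin = ftp_rate(1.0, curve) - deposit_rate
```

The point of the decomposition is that each business line is credited or charged an internal rate for the funds it uses or gathers, so that liquidity costs and benefits show up in its measured profitability rather than being netted away at group level.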

He also indicated that the FSA has recently brought on board a whole host of new, industry-sourced hires (bringing the total to around 250) in order to be able to accurately evaluate firms’ risk management practices. Lawrence himself was appointed back in April 2008, before which he worked for IBM in its Risk Enterprise business. He has also previously worked on the banking side of the fence at Republic National Bank, Barclays and UBS, in roles including the risk management function.

At the moment, the industry has a long way to go to get liquidity risk management and FTP right, according to Lawrence. “If capital is the heart of your business, then liquidity is the supply of blood,” he said, indicating the seriousness of the issue. The benefit of getting FTP right is that it aligns business line behaviour with a firm’s risk appetite and strategic objectives, he added.

Lawrence’s comments come as no surprise, given that earlier this year, a survey by the Professional Risk Managers’ International Association (PRMIA) indicated that many firms have yet to put in place adequate procedures and systems to cope with liquidity risk stress testing. The FSA has set a 14 December deadline by which firms must have made the necessary changes to their quantitative and qualitative risk management systems in order to be able to conduct stress testing.

The correct calculation of FTP could also potentially result in a reduction of the liquid asset buffer that financial institutions are required to hold. The consequence of poor FTP, by contrast, is the misallocation of liquidity resources, potentially resulting in lopsided balance sheet growth, excess off balance sheet exposures, the misalignment of risk and reward, and poor interest rate and FX hedging, warned Lawrence. He also cautioned that FTP should not be calculated as an average; rather, it should be determined for each individual security.
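The averaging problem Lawrence cautions against can be illustrated with a toy calculation. The securities and liquidity costs below are hypothetical figures invented for the example, not data from the article.

```python
# Hypothetical liquidity costs (in basis points) for three securities
# with very different liquidity profiles.
securities = {
    "liquid_gilt": 5,
    "corporate_bond": 40,
    "illiquid_abs": 120,
}

# Charging every security the average cost...
average_charge = sum(securities.values()) / len(securities)  # 55 bps

# ...overcharges the liquid asset and subsidises the illiquid one,
# which is exactly the misallocation of liquidity resources described.
mispricing = {name: cost - average_charge for name, cost in securities.items()}
```

Here the liquid gilt would be overcharged by 50 basis points while the illiquid asset-backed security would be undercharged by 65, giving business lines an incentive to load up on the very assets that are hardest to fund in a stress.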

Lawrence stressed the need for counterparty risk measurements to feed into the FTP process, along with credit and market risk calculations. “All risk types are inextricably linked and therefore should not be considered in isolation,” he said. “All of this also obviously needs proper methodology, systems and governance.”

So, it seems that the FSA is aware of the silo problem within most financial institutions but is very much looking for the industry to respond with a more holistic approach to risk management. All of this is underpinned by the need for data to be dealt with in an integrated fashion to support the risk calculation methodologies, which must, in turn, feed into the valuation of individual instruments.

Lawrence concluded by speaking about the need for an “optimal granularity of data”, which involves firms being able to aggregate the relevant data from across their businesses but also to be able to drill down into that data in order to provide transparency to regulators and the business. Stacking endless data warehouses on top of each other is not enough; intelligent use of data flows and architecture is needed to solve this challenge.
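The aggregate-then-drill-down pattern Lawrence describes can be sketched in a few lines: positions are held at instrument level, rolled up for a firm-wide view, and expanded back out on demand. The business lines, instruments and costs below are hypothetical.

```python
from collections import defaultdict

# Hypothetical instrument-level records: the "optimal granularity"
# is kept at the bottom, never pre-summarised away.
positions = [
    {"business": "rates",  "instrument": "gilt_5y",  "liquidity_cost": 0.5},
    {"business": "rates",  "instrument": "swap_10y", "liquidity_cost": 1.2},
    {"business": "credit", "instrument": "corp_abc", "liquidity_cost": 3.0},
]

# Aggregate: liquidity cost per business line for the firm-wide view.
by_business = defaultdict(float)
for p in positions:
    by_business[p["business"]] += p["liquidity_cost"]

# Drill down: the individual instruments behind the "rates" figure,
# available when a regulator or the business asks for transparency.
rates_detail = [p for p in positions if p["business"] == "rates"]
```

The design choice being illustrated is that aggregation is computed from granular data on demand, rather than storing only pre-aggregated totals in yet another warehouse, which is what makes both the roll-up and the drill-down possible from the same source.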

