
FSA’s Lawrence Warns Firms of the Importance of Accurate Funds Transfer Pricing for Risk Management

Enterprise risk management is at the heart of the funds transfer pricing (FTP) challenge, explained Colin Lawrence, director of the Prudential Risk Division of the UK Financial Services Authority (FSA), to attendees at the JWG FTP event in London earlier this week. All risk types are closely linked and must be treated as such in order to properly evaluate FTP, the mechanism by which a firm attributes the costs, benefits and risks of liquidity to a particular business line.

FTP is essentially an internal measurement and allocation process via which a firm assigns a profit contribution value to funds it has gathered and lent or invested. It is also a key focus of the FSA’s liquidity risk regime, under which firms must hold liquidity buffers and regularly stress test these buffers. The FTP measurement is also used to determine an instrument’s valuation, so that liquidity costs are quantified for individual securities. As noted by Lawrence: “Firms need to be able to price liquidity risk correctly in order to stay in business.”
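
To make the mechanism concrete, the snippet below is a minimal sketch of a matched-maturity FTP charge. The funding curve, liquidity premium and figures are illustrative assumptions only, not the FSA’s requirements or any firm’s actual methodology.

```python
# Minimal sketch of a matched-maturity funds transfer pricing (FTP) charge.
# The curve and premia are hypothetical, for illustration only.

# Hypothetical internal funding curve: tenor (years) -> annual funding rate
FUNDING_CURVE = {0.25: 0.012, 1: 0.018, 3: 0.025, 5: 0.031}

def interpolate_rate(tenor_years: float) -> float:
    """Linearly interpolate the internal funding rate for a given tenor."""
    tenors = sorted(FUNDING_CURVE)
    if tenor_years <= tenors[0]:
        return FUNDING_CURVE[tenors[0]]
    if tenor_years >= tenors[-1]:
        return FUNDING_CURVE[tenors[-1]]
    for lo, hi in zip(tenors, tenors[1:]):
        if lo <= tenor_years <= hi:
            weight = (tenor_years - lo) / (hi - lo)
            return FUNDING_CURVE[lo] + weight * (FUNDING_CURVE[hi] - FUNDING_CURVE[lo])

def ftp_charge(notional: float, tenor_years: float, liquidity_premium: float = 0.0) -> float:
    """Annual FTP charge for an asset funded at its matched tenor,
    plus an explicit liquidity premium for less liquid instruments."""
    return notional * (interpolate_rate(tenor_years) + liquidity_premium)

# A 5m, 3-year loan booked by a business line, carrying a 40bp liquidity premium
print(ftp_charge(5_000_000, 3, liquidity_premium=0.004))  # ~145,000 per year
```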

He also indicated that the FSA has recently brought on board a host of new, industry-sourced hires (bringing the total to around 250) in order to be able to accurately evaluate firms’ risk management practices. Lawrence himself was appointed back in April 2008, before which he worked for IBM in its Risk Enterprise business. He has also previously worked on the banking side of the fence at Republic National Bank, Barclays and UBS, in roles including risk management.

At the moment, the industry has a long way to go to get liquidity risk management and FTP right, according to Lawrence. “If capital is the heart of your business, then liquidity is the supply of blood,” he said, indicating the seriousness of the issue. The benefit of a properly implemented FTP process is that it aligns business line behaviour with a firm’s risk appetite and strategic objectives, he added.

Lawrence’s comments come as no surprise, given that earlier this year a survey by the Professional Risk Managers’ International Association (PRMIA) indicated that many firms have yet to put in place adequate procedures and systems to cope with liquidity risk stress testing. The FSA has set a deadline of 14 December, by which point firms must have made the necessary changes to their quantitative and qualitative risk management systems in order to be able to conduct such stress tests.
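
As a rough illustration of what such a stress test involves, the sketch below projects stressed outflows against a liquid asset buffer. The run-off rates, balances and function names are hypothetical assumptions, not the FSA’s calibration.

```python
# Illustrative liquidity buffer stress test under assumed run-off rates.
# All balances are hypothetical and in millions.

def stressed_net_outflow(deposits: float, wholesale_funding: float,
                         committed_facilities: float,
                         retail_runoff: float = 0.10,
                         wholesale_runoff: float = 0.75,
                         facility_drawdown: float = 0.30) -> float:
    """Projected outflows over the stress horizon under assumed run-off rates."""
    return (deposits * retail_runoff
            + wholesale_funding * wholesale_runoff
            + committed_facilities * facility_drawdown)

def buffer_survives(liquid_asset_buffer: float, outflow: float) -> bool:
    """True if the liquid asset buffer covers the stressed outflow."""
    return liquid_asset_buffer >= outflow

outflow = stressed_net_outflow(deposits=2_000, wholesale_funding=500,
                               committed_facilities=300)
print(outflow, buffer_survives(liquid_asset_buffer=650, outflow=outflow))
# 665.0 False -> under these assumptions the buffer would need topping up
```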

The correct calculation of FTP could also potentially result in a reduction of the liquid asset buffer that financial institutions are required to hold. The consequence of poor FTP, on the other hand, is the misallocation of liquidity resources, potentially resulting in lopsided balance sheet growth, excess off balance sheet exposures, the misalignment of risk and reward, and poor interest rate and FX hedging, warned Lawrence. He also cautioned that FTP should not be calculated as an average; rather, it should be worked out for each individual security.
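
A simple way to see Lawrence’s point about averaging is the hypothetical book below: charging every position the book-wide average premium lets liquid positions subsidise illiquid ones, which is exactly the misallocation he warned against. The instruments and premia are invented for illustration.

```python
# Per-security FTP versus a book-wide average, on a hypothetical book.

instruments = [
    {"name": "gilt",            "notional": 100, "liquidity_premium": 0.0005},
    {"name": "corporate bond",  "notional": 100, "liquidity_premium": 0.0040},
    {"name": "structured note", "notional": 100, "liquidity_premium": 0.0150},
]

# Per-security charge: each instrument bears its own liquidity cost
per_security = {i["name"]: i["notional"] * i["liquidity_premium"] for i in instruments}

# Averaged charge: every instrument bears the book-wide average cost
avg_premium = sum(i["liquidity_premium"] for i in instruments) / len(instruments)
averaged = {i["name"]: i["notional"] * avg_premium for i in instruments}

print(per_security)  # {'gilt': 0.05, 'corporate bond': 0.4, 'structured note': 1.5}
print(averaged)      # every position charged 0.65 -- the gilt subsidises the note
```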

Lawrence stressed the need for counterparty risk measurements to feed into the FTP process, along with credit and market risk calculations. “All risk types are inextricably linked and therefore should not be considered in isolation,” he said. “All of this also obviously needs proper methodology, systems and governance.”

So, it seems that the FSA is aware of the silo problem within most financial institutions but is very much looking for the industry to respond with a more holistic approach to risk management. All of this is underpinned by the need for data to be dealt with in an integrated fashion to support the risk calculation methodologies, which must, in turn, feed into the valuation of individual instruments.

Lawrence concluded by speaking about the need for an “optimal granularity of data”, which involves firms being able to aggregate the relevant data from across their businesses but also to drill down into that data in order to provide transparency to regulators and the business. Piling endless data warehouses on top of each other is not enough; intelligent use of data flows and architecture is needed to meet this challenge.
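
As a sketch of what that granularity might look like in practice, the example below holds instrument-level records once and derives both the aggregate view and the drill-down view from the same store. The columns, figures and business lines are illustrative assumptions only.

```python
# Sketch of "optimal granularity": keep instrument-level records and derive
# aggregates on demand, rather than stacking pre-aggregated warehouses.
import pandas as pd

positions = pd.DataFrame([
    {"business_line": "rates",  "instrument": "gilt_2025",   "liquidity_cost": 0.05},
    {"business_line": "rates",  "instrument": "swap_10y",    "liquidity_cost": 0.12},
    {"business_line": "credit", "instrument": "corp_bond_a", "liquidity_cost": 0.40},
])

# Aggregate view for the regulator or the board...
print(positions.groupby("business_line")["liquidity_cost"].sum())

# ...while the same store supports drill-down to individual instruments
print(positions[positions["business_line"] == "rates"])
```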
