The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

Liquidity Risk Regulation: Data Quality Under the Microscope

The UK Financial Services Authority’s (FSA) liquidity risk regime has set a precedent for tackling this particularly thorny area of risk management, ahead of the wider implementation of Basel III globally. Reference Data Review speaks to a number of risk technology vendors about the data quality issues underlying the liquidity risk management challenge in light of the UK experience.

Some industry participants are keen to provide their input on the data management challenges to regulators, and vendors within the risk management solution community are also well aware of these challenges, given their position as outsourced service providers between the regulator and the industry in the reporting space. They themselves have been required to get ready for the regime by, among other things, converting to XML submissions for data transmission to the FSA for reporting purposes.
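To make the XML submission requirement concrete, here is a minimal sketch of serialising liquidity figures into an XML document of the kind a vendor might generate on a firm's behalf. The element names, attributes and figures are purely illustrative assumptions; the FSA's actual reporting schemas are different and more detailed.

```python
# Illustrative sketch only: element and field names are invented for this
# example and do not reflect the FSA's real submission schemas.
import xml.etree.ElementTree as ET

def build_submission(firm_id: str, figures: dict) -> str:
    """Serialise a dict of line-item figures into a minimal XML return."""
    root = ET.Element("LiquidityReturn", attrib={"firmId": firm_id})
    for item, value in figures.items():
        line = ET.SubElement(root, "LineItem", attrib={"name": item})
        line.text = f"{value:.2f}"
    return ET.tostring(root, encoding="unicode")

xml_doc = build_submission("123456", {
    "liquid_assets": 1500000.0,
    "wholesale_funding": 900000.0,
})
```

The point of the sketch is simply that reporting figures, once aggregated internally, must be mapped onto a machine-readable structure dictated by the regulator rather than by the firm's own systems.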

Compliance with the liquidity risk requirements has not been a simple process thus far for anyone. As Mario Onorato, senior director and head of balance sheet and capital management at risk technology vendor Algorithmics, explains, institutions have taken a wide array of actions over the last two years in response to the liquidity environment.

“Institutions have strengthened their liquidity risk management function, enhanced liquidity stress testing, maintained liquid asset portfolios, improved liquidity management policy, increased coordination between treasury and risk management, revised contingency funding strategy and diversified funding sources. In addition, they have increased coordination between liquidity and capital planning and improved their analysis of contingent and off balance sheet positions,” he elaborates.

“In addition, we observe that banks are trying to break all silos, offer a holistic view of the company’s objectives, integrate all types of risk (credit, market, operational, asset liability management and liquidity) and create an overall picture of the organisation for both internal and external reporting,” he continues.

It is this holistic view of data across silos that has brought data quality and data management under the regulatory microscope. The FSA’s focus on the “consistency” and “reliability” of data inputs into risk calculations and scenario testing has compelled many to kick off projects in this space, such as the Royal Bank of Scotland’s One Risk programme. Nor is supporting liquidity risk management the sole focus of such projects, given the frequent references to data quality within much of the regulation issued over the last couple of years.

James Babicz, head of risk at SAS UK, reckons that the holistic and granular management of risk data is perhaps one of the most important lessons learned within the risk management space over the last couple of years. “Banks have moved to data quality and granularity as a critical step towards moving to a holistic view of all risks, not just liquidity. Without this first step, the necessary data aggregations cannot be done, regulators cannot be satisfied and accurate liquidity risk assessments cannot be made,” he warns.

Selwyn Blair-Ford, head of global regulatory policy at solution vendor FRSGlobal, agrees that a number of data management lessons have been learned. “One of the main lessons firms have learned is that getting clean and consistent data that meets the requirements of the liquidity reporting regime proved much more difficult than initially expected,” he explains. “One initially expects that the data is there, but then on investigation discovers that 95% of business activities are covered, and the remaining 5% is the most expensive to collect. A firm might also find that there is a piece of information not requested before, which can mean whole processes need to be adapted in order to comply.”

Lessons may have been learned thus far but a lot of work is still to be done, says Babicz: “The bigger organisations are having the hardest job in meeting the requirements. In order to make accurate assessments on liquidity, a lot of groundwork needs to be done in aggregating data. The siloed nature of many large organisations makes this extremely challenging, so we are also seeing many of the larger organisations investing both time and money getting their data in order, just so that they can begin to tackle the new requirements.”

Data aggregation is also more than just bringing all data together from different parts of the business; it also requires getting that data to work far harder, and to a standard able to meet the increasingly complex reporting requirements, Babicz contends.

Both quantity and quality are important here because firms now need to be able to report in far more detail: by business line, product type or currency, for example. Stress testing practices have been a particular game changer in terms of data quality issues, according to Babicz. Firms may have been using stress tests for years, but the market has recently witnessed many investing in new reporting systems that help organisations better aggregate their data in order to get a far more granular understanding of risk across the business, he explains.
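The kind of dimensional breakdown described above can be sketched very simply: positions gathered from siloed systems are rolled up along whichever reporting dimension the regulator asks for. The field names and figures below are illustrative assumptions, not any firm's actual data model.

```python
# Minimal sketch: aggregating siloed position records along a single
# reporting dimension (business line, product type or currency).
# All field names and figures are hypothetical.
from collections import defaultdict

positions = [
    {"business_line": "rates",  "product": "swap", "currency": "GBP", "notional": 250.0},
    {"business_line": "rates",  "product": "bond", "currency": "USD", "notional": 100.0},
    {"business_line": "credit", "product": "cds",  "currency": "USD", "notional": 75.0},
]

def aggregate(rows: list, dimension: str) -> dict:
    """Sum notionals grouped by one reporting dimension."""
    totals = defaultdict(float)
    for row in rows:
        totals[row[dimension]] += row["notional"]
    return dict(totals)

by_currency = aggregate(positions, "currency")
```

The hard part in practice is not the roll-up itself but ensuring, as Babicz notes, that the records feeding it are consistent and granular enough for every dimension a regulator might request.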

To support this, there is a much greater emphasis on getting quality data right down to the source record.

“Here we see firms looking at a top down approach rather than the traditional bottom up approaches, which have caused the majority of headaches for large banks in the past. In other words, banks first start with the metrics they wish to generate to run their business and work out where to source the data needed to support them, rather than trying to gather all of the data into a massive enterprise wide data warehouse,” says Babicz.
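The top-down approach Babicz describes can be sketched as a mapping exercise: declare the target metrics first, then work back to the fields, and hence the source systems, required to support them, rather than warehousing everything up front. The metric, field and system names below are hypothetical, chosen only to illustrate the shape of the idea.

```python
# Sketch of a top-down, metric-first approach. Names are hypothetical.
# Each metric declares the fields it needs; each system declares the
# fields it can provide. We then work back from metrics to systems.
REQUIRED_METRICS = {
    "liquidity_coverage":    ["liquid_assets", "net_outflows_30d"],
    "funding_concentration": ["wholesale_funding", "total_funding"],
}

SOURCE_SYSTEMS = {
    "treasury_db":     ["liquid_assets", "wholesale_funding", "total_funding"],
    "cashflow_engine": ["net_outflows_30d"],
}

def systems_needed(metrics: list) -> list:
    """Resolve the set of source systems needed for the chosen metrics."""
    fields = {f for m in metrics for f in REQUIRED_METRICS[m]}
    return sorted(s for s, provided in SOURCE_SYSTEMS.items()
                  if fields & set(provided))

# Only the systems actually required for the chosen metrics get tapped,
# rather than every system feeding a single enterprise-wide warehouse.
needed = systems_needed(["liquidity_coverage"])
```

The design choice is the inversion: data sourcing is driven by reporting requirements, so a new metric only pulls in the additional systems it actually depends on.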

This is potentially why some firms are declaring the ‘death’ of the single golden copy approach. FRSGlobal’s Blair-Ford reckons that the UK’s experiences and the learning curve thus far will feed into the global debate.

“Globally, Basel III and the other liquidity regimes have benefitted from the experiences of the UK regulator and UK-based firms. A careful reading of the Basel III requirements shows plenty of cases where the FSA’s influence is evident – one cannot help but note that if a solution fits the UK regime it will, in the main, fit the requirements of regulators as far flung as Australia and Canada,” he says.

In terms of the future, Algorithmics’ Onorato adds: “Finally, we should note that it is very unlikely that Basel III will be the answer to all previous problems, and therefore institutions must retain flexibility to accommodate years of fine tuning and future reforms.”
