The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

Liquidity Risk Regulation: Data Quality Under the Microscope

The UK Financial Services Authority’s (FSA) liquidity risk regime has set a precedent for tackling this particularly thorny area of risk management, ahead of the wider implementation of Basel III globally. Reference Data Review speaks to a number of risk technology vendors about the data quality issues underlying the liquidity risk management challenge in light of the UK experience.

Some industry participants are keen to provide regulators with their input on the data management challenges, and vendors within the risk management solution community are also well aware of these challenges, given their position as outsourced service providers between the regulator and the industry in the reporting space. They themselves have been required to prepare for the regime by, among other things, converting to XML submissions for transmitting reporting data to the FSA.
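The kind of XML conversion the vendors describe can be sketched in a few lines. This is a minimal illustration only: the element names, attributes and structure below are hypothetical, not the FSA's actual submission schema.

```python
import xml.etree.ElementTree as ET

def build_submission(firm_id, positions):
    """Assemble a minimal XML return from a list of position dicts.
    Element names are illustrative, not the real FSA schema."""
    root = ET.Element("LiquidityReturn", attrib={"firmId": firm_id})
    for p in positions:
        item = ET.SubElement(root, "Position")
        ET.SubElement(item, "Instrument").text = p["instrument"]
        ET.SubElement(item, "Currency").text = p["currency"]
        ET.SubElement(item, "Value").text = f'{p["value"]:.2f}'
    return ET.tostring(root, encoding="unicode")

xml_doc = build_submission("FIRM001", [
    {"instrument": "UK Gilt 2025", "currency": "GBP", "value": 1500000.0},
])
```

In practice the output would also be validated against the regulator's published XSD before transmission, which is where many of the data quality gaps first surface.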

Compliance with the liquidity risk requirements has not been a simple process thus far for anyone. As Mario Onorato, senior director and head of balance sheet and capital management at risk technology vendor Algorithmics, explains, institutions have taken a wide array of actions over the last two years in response to the liquidity environment.

“Institutions have strengthened their liquidity risk management function, enhanced liquidity stress testing, maintained liquid asset portfolios, improved liquidity management policy, increased coordination between treasury and risk management, revised contingency funding strategy and diversified funding sources. In addition, they have increased coordination between liquidity and capital planning and improved their analysis of contingent and off balance sheet positions,” he elaborates.

“In addition, we observe that banks are trying to break all silos, offer a holistic view of the company’s objectives, integrate all types of risk (credit, market, operational, asset liability management and liquidity) and create an overall picture of the organisation for both internal and external reporting,” he continues.

It is this holistic view of data across silos that has brought data quality and data management under the regulatory microscope. The FSA’s focus on the “consistency” and “reliability” of data inputs into risk calculations and scenario testing has compelled many to kick off projects in this space, such as the Royal Bank of Scotland’s One Risk programme. These projects are not focused solely on supporting liquidity risk management, however, given the frequent references to data quality within much of the regulation issued over the last couple of years.

James Babicz, head of risk at SAS UK, reckons that the holistic and granular management of risk data is perhaps one of the most important lessons learned within the risk management space over the last couple of years. “Banks have moved to data quality and granularity as a critical step towards moving to a holistic view of all risks, not just liquidity. Without this first step, the necessary data aggregations cannot be done, regulators cannot be satisfied and accurate liquidity risk assessments cannot be made,” he warns.

Selwyn Blair-Ford, head of global regulatory policy at solution vendor FRSGlobal, agrees that a number of data management lessons have been learned. “One of the main lessons firms have learned is that getting clean and consistent data that meets the requirements of the liquidity reporting regime was much more difficult than initially expected,” he explains. “One initially expects that the data is there, but then on investigation discovers that 95% of business activities are covered, and the remaining 5% is the most expensive to collect. Then a firm might also find that there is a piece of information not requested before, which can mean whole processes need to be adapted to be able to comply.”

Lessons may have been learned thus far but a lot of work is still to be done, says Babicz: “The bigger organisations are having the hardest job in meeting the requirements. In order to make accurate assessments on liquidity, a lot of groundwork needs to be done in aggregating data. The siloed nature of many large organisations makes this extremely challenging, so we are also seeing many of the larger organisations investing both time and money getting their data in order, just so that they can begin to tackle the new requirements.”

Data aggregation is also more than just bringing all data together from different parts of the business; it also requires getting that data to work far harder, and to a standard able to meet the increasingly complex reporting requirements, Babicz contends.

Both quantity and quality are important here, because firms now need to be able to report in far more detail: by business line, product type or currency, for example. Stress testing practices have been a particular game changer in terms of data quality issues, according to Babicz. Firms may have been using stress tests for years, but what the market has witnessed recently is many investing in new reporting systems that help organisations better aggregate their data in order to gain a far more granular understanding of risk across the business, he explains.
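The aggregation-by-dimension reporting Babicz describes can be sketched as a simple group-and-sum with a basic data quality gate. The field names and quality rule below are assumptions for illustration, not any vendor's actual implementation.

```python
from collections import defaultdict

def aggregate(records, dimensions):
    """Sum exposure values grouped by the requested reporting dimensions
    (e.g. business line, product type, currency), rejecting records with
    missing fields -- a rudimentary data quality gate."""
    totals = defaultdict(float)
    rejected = []
    for rec in records:
        if any(rec.get(d) in (None, "") for d in dimensions) or rec.get("value") is None:
            rejected.append(rec)  # incomplete record: fails the quality check
            continue
        key = tuple(rec[d] for d in dimensions)
        totals[key] += rec["value"]
    return dict(totals), rejected

records = [
    {"business_line": "Treasury", "product": "Repo", "currency": "GBP", "value": 100.0},
    {"business_line": "Treasury", "product": "Repo", "currency": "GBP", "value": 50.0},
    {"business_line": "Markets", "product": "Bond", "currency": None, "value": 25.0},
]
totals, rejected = aggregate(records, ["business_line", "product", "currency"])
# the record with a missing currency is rejected rather than silently summed
```

The point of the rejected list is the one the regulators keep making: incomplete records must be surfaced and remediated, not silently dropped from the aggregate.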

To support this, there is now a much greater emphasis on ensuring data quality all the way down to the source record.

“Here we see firms looking at a top-down approach rather than the traditional bottom-up approaches that have caused the majority of headaches for large banks in the past. In other words, banks first start with the metrics they wish to generate to run their business and work out where to source the data needed to support them, rather than trying to gather all of the data into a massive enterprise wide data warehouse,” says Babicz.

This is potentially why some firms are declaring the ‘death’ of the single golden copy approach. FRSGlobal’s Blair-Ford reckons that the UK’s experiences and the learning curve thus far will feed into the global debate.

“Globally, Basel III and the other liquidity regimes have benefitted from the experiences of the UK regulator and UK-based firms. A careful reading of the Basel III requirements shows plenty of cases where the FSA’s influence is evident – one cannot help but note that if a solution fits the UK regime it will, in the main, fit the requirements of regulators as far flung as Australia and Canada,” he says.

In terms of the future, Algorithmics’ Onorato adds: “Finally, we should note that it is very unlikely that Basel III will be the answer to all previous problems, and therefore institutions must retain flexibility to accommodate years of fine tuning and future reforms.”
