
Opinion – Tougher Stress Tests Bring Banks to an Impasse: Revolutionise or Fail

By Peter Ku, Senior Director of Global Financial Services, Informatica

Just before Christmas, the results of the first stress testing exercise for the UK banking system were revealed. The test scenarios modelled a financial doomsday, in which the housing market crashes, unemployment spikes and inflation rises. The aim of the exercise was to determine which banks had the capital adequacy to survive extreme fluctuations in market conditions.

Despite the Co-operative Bank failing – and both Lloyds and RBS narrowly escaping a similar black mark – the supervising bodies are confident that the UK banking system has become significantly more resilient. However, with fears around the Eurozone rising, fluctuating oil prices and uncertainty over interest rates, financial institutions cannot rest on their laurels and assume that they’ll pass the same tests again next year.

Regulators have already set out their intention to widen the remit of stress testing and will use the next year to probe even deeper for cracks in the defences of the UK banks. For those banks that girded themselves strongly enough to withstand the first round, the threat of tougher test conditions and scenarios is just around the corner.

Searching for Stress Fractures

UK banks are already gearing up for these tougher tests. Santander passed this year’s tests with flying colours, but the possibility of stress testing against the leverage ratio, the ratio of a bank’s capital to its total exposures, could cause concern. As a result, Santander UK has been handed £300 million by its Spanish parent to bolster its capital position further.

However, the advent of more robust tests will heap pressure on those banks whose approach to compliance is now more puncture-patch than tyre. In the last banking crisis, many banks were unaware of their exposure to risk. It is no surprise, then, that Mark Carney, the governor of the Bank of England, has already indicated that overseas risk could be a factor in the next test. This scenario could prove much more challenging for banks that do a large share of their business in Asia.

Today’s financial institutions must be able to determine exposures across the business in real time. To do this, they must have the right information when lending to new customers and understand how much risk they are exposed to across different divisions. The latter can only be achieved by bringing together data from multiple sources.

For example, a mortgage loan packaged into a mortgage-backed security is traded on the public market. Banks need the ability to understand to whom that mortgage belongs and their creditworthiness, so that they have an accurate view of counterparty risk exposure. In turn, this means that banks need to answer: what are the credit conditions locally? What are the interest rate fluctuations in other countries? Who is the customer?
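
To make the requirement concrete, here is a minimal sketch of the aggregation step: rolling positions from several divisional feeds up into a single per-counterparty exposure view. The feed names, fields and figures are hypothetical, not any bank’s actual data or a specific vendor’s API.

```python
# Minimal sketch: consolidating counterparty exposure held in separate
# divisional silos. All names and figures are illustrative only.
from collections import defaultdict

# Each division reports its positions in its own feed.
trading_desk = [
    {"counterparty": "ACME Corp", "exposure_gbp": 12_500_000},
    {"counterparty": "Globex", "exposure_gbp": 4_200_000},
]
mortgage_book = [
    {"counterparty": "ACME Corp", "exposure_gbp": 7_800_000},
]

def aggregate_exposure(*feeds):
    """Roll divisional positions up into one per-counterparty view."""
    totals = defaultdict(float)
    for feed in feeds:
        for position in feed:
            totals[position["counterparty"]] += position["exposure_gbp"]
    return dict(totals)

print(aggregate_exposure(trading_desk, mortgage_book))
# {'ACME Corp': 20300000.0, 'Globex': 4200000.0}
```

The hard part in practice is not the arithmetic but getting every silo’s feed into a shape where a roll-up like this is possible at all.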

Reaching an Impasse

Bank stress testing is a fact of life for the global financial industry, helping regulators monitor and measure systemic risk across markets. Satisfying these demands requires banks to have adequate means of accessing and utilising the right data for ongoing risk management and compliance. More importantly, it requires firms to take advantage of best practices and technology that allow for efficient and effective execution. However, the gap between the amount of data being produced and the amount that can be captured, consumed, analysed and understood remains a challenge.

Banks can no longer simply appoint more people or build a specific system for each regulation. The increased pressure from regulators is already beginning to expose the gaps between people, technology and architectures across traditional business silos – gaps that have prevented an enterprise-wide view of risk.

With each round of regulation and its impending deadlines has come another tacked-on solution and another patched-up process. This approach has built a creaking, siloed and unwieldy infrastructure for managing data, reporting and compliance – a system in which stress tests are as much a test of the reporting process as of the actual ability to withstand tough economic conditions. This chronic underinvestment in people and reliance on rudimentary tools has brought banks to an impasse: revolutionise or fail, extremely publicly.

The Metadata Trail

In light of this, managing data must be taken seriously, and regulators are forcing the board’s hand in the matter. An organisation cannot continue to blame its risk and compliance applications for failing to meet stress-testing demands. The root of the pain is often the underlying foundation for managing the data that feeds those systems. Regulators require detailed explanations of the lineage of data and how information is captured, transformed and calculated, so that they can understand the amount of capital set aside to cover market, credit, operational and liquidity volatility and risk.
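
As an illustration of what a lineage trail might capture, the sketch below attaches the source datasets and the transformation applied to a derived capital figure. The record structure and field names are assumptions made for the example, not a regulator’s or vendor’s schema.

```python
# Minimal sketch of a lineage record for a derived risk figure.
# The structure and field names are illustrative only.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class LineageRecord:
    output_field: str      # the reported figure the regulator asks about
    sources: list          # upstream datasets it was derived from
    transformation: str    # how the figure was calculated
    captured_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))

record = LineageRecord(
    output_field="tier1_capital_ratio",
    sources=["ledger.positions", "reference.risk_weights"],
    transformation="tier1_capital / risk_weighted_assets",
)
print(record)
```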

Increasingly, regulators are defining how data should be managed and governed, as evidenced by BCBS 239. This could mark the beginning of more stringent audits and requirements for firms globally. If banks are to satisfy regulators’ demands, a comprehensive, scalable, enterprise-wide data management platform will be a crucial investment to create the transparency, trust, speed and data audit trail required. The success of such a platform has a number of technological and process-driven dependencies: to deliver on its promise, it must be supported by access to all enterprise data, apply rigorous data quality management enabled by data governance and master data management, and integrate with business and technical metadata.
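
On the data quality dependency specifically, a hypothetical example: the rule-based checks below are the sort a governance layer might run on position records before they feed a risk engine. The rules and record layout are illustrative assumptions.

```python
# Minimal sketch of rule-based data quality checks run before position
# records feed a risk engine. Rules and layout are hypothetical.

def check_record(record):
    """Return a list of quality failures for one position record."""
    failures = []
    if not record.get("counterparty_id"):
        failures.append("missing counterparty identifier")
    if record.get("notional", 0) <= 0:
        failures.append("non-positive notional")
    if record.get("currency") not in {"GBP", "EUR", "USD"}:
        failures.append("unrecognised currency code")
    return failures

positions = [
    {"counterparty_id": "CP-001", "notional": 5_000_000, "currency": "GBP"},
    {"counterparty_id": "", "notional": -100, "currency": "XXX"},
]
for p in positions:
    print(p.get("counterparty_id") or "<unknown>", check_record(p))
```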

The difference in approach comes down to not treating each regulation as yet another separate IT project or system. Instead, banks must establish a common framework for risk, compliance, sales, marketing, and customer-facing systems and operations. Adopting this approach will help provide insight into where risk exposure lies and ensure transparency in how data is handled from the source all the way to the reporting function. It will also help reduce the costs and risks of managing a ‘hairball’ of separate systems and integrations.
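
As a toy illustration of that common framework, the sketch below keeps one mastered customer record and derives both a risk view and a customer-facing view from it, rather than letting each silo hold its own divergent copy. All names and fields are hypothetical.

```python
# Minimal sketch: one mastered record serving several functions,
# instead of each silo keeping its own copy. Illustrative only.

master_customer = {
    "customer_id": "CUST-42",
    "legal_name": "ACME Corp",
    "country": "GB",
    "credit_rating": "BBB",
    "total_exposure_gbp": 20_300_000,
}

def risk_view(record):
    """The fields the risk function draws from the shared record."""
    keys = ("customer_id", "credit_rating", "total_exposure_gbp")
    return {k: record[k] for k in keys}

def marketing_view(record):
    """The fields a customer-facing function draws from the same record."""
    keys = ("customer_id", "legal_name", "country")
    return {k: record[k] for k in keys}

print(risk_view(master_customer))
print(marketing_view(master_customer))
```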

Overhauling a data infrastructure may seem a daunting prospect to banks built on a creaking tower of legacy technology. However, the persistence and growing power of the regulators mean that the current patchwork, siloed approach cannot continue. Financial institutions need to look at the underlying systems to ensure that they can give regulators the information they need on an ongoing basis. To achieve this, banks can no longer manage risk behind the Chinese walls of business unit silos.

The good news is that support for data management projects is gathering pace at board level. Even better, such a project delivers above and beyond the short-term need for better access to data. Investment in a common data management platform provides a means of avoiding the patchwork systems and one-off processes of the past. Through this, banks can establish a foundation to serve the data needs of the enterprise for compliance, growth and cost reduction.

2015 will be a year of change for banks and their data strategies, but one thing’s for certain: the puncture-patch approach to compliance has reached the end of the road.
