About A-Team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Basel II and the Law of Unintended Consequence: Leading to Direct Financial Benefit


By Rick Wilson, vice president of product strategy for Trillium Software

Not every result of the ‘law of unintended consequences’ is a negative outcome. For example, it’s unlikely that the Basel regulations, set up to monitor the accuracy of financial performance reporting, were implemented with the deliberate intention of improving banks’ bottom lines. But that is exactly what is happening. The positive impact on institutions has been clear and compelling wherever the right business teams have been assembled to establish and measure the accuracy of business data and credit risk data.

The immediate result of this new level of engagement by business teams has been the discovery of numerous opportunities to improve institutions’ operational performance by quantifying the financial impact of data gaps and errors. Given the proper environment for discovery, focused teams are able to spot spurious values, incorrect patterns and gaps in data processes that lead to punitive increases in risk reserve and capital adequacy calculations. New tools created to evaluate the accuracy of data used in risk and capital reserve calculations have led to the discovery of large pools of ‘captive capital’ in financial institutions, tying up dollars that could otherwise be used for investment and business needs.
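The kind of data profiling described above can be sketched in a few lines. The following is a minimal, illustrative example only: the field names, the placeholder set and the LEI-style identifier pattern are assumptions, not a description of any particular vendor tool.

```python
import re

# Illustrative counterparty extract: flag gaps, spurious placeholder
# values, and identifiers that break an expected pattern.
# Field names and the 20-character LEI-style pattern are assumptions.
records = [
    {"lei": "529900T8BM49AURSDO55", "country": "DE"},
    {"lei": "N/A", "country": ""},       # placeholder value and a gap
    {"lei": "BADID", "country": "FR"},   # breaks the expected pattern
]

LEI_PATTERN = re.compile(r"^[A-Z0-9]{20}$")
PLACEHOLDERS = {"N/A", "UNKNOWN", "TBD", ""}

def profile(rows):
    """Return (row index, field, issue) tuples for each detected problem."""
    issues = []
    for i, row in enumerate(rows):
        if row["country"] in PLACEHOLDERS:
            issues.append((i, "country", "missing value"))
        if row["lei"] in PLACEHOLDERS:
            issues.append((i, "lei", "spurious placeholder"))
        elif not LEI_PATTERN.match(row["lei"]):
            issues.append((i, "lei", "pattern mismatch"))
    return issues

issues = profile(records)
```

Each flagged issue can then be costed against the reserve calculation it feeds, which is what turns a data quality exercise into a capital question.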

To reduce these excess captive reserves, the data upon which risk calculations are made must be provably correct, current, comprehensive and consistent – and validated by the business. It’s here where increased regulation, and the demand for demonstrating transparent processes for measuring and reporting on data quality, has created the ‘unintended consequence’ of direct financial benefit.

Creating the ‘information environment’ to support business understanding and certification of data accuracy requires systems that enable more flexible, ‘inquiry-based’ access paths to data. The traditional methods of queries, spreadsheets and static BI reporting do not serve this purpose effectively. Fragmentation and lack of coherence in current processes limit the benefits obtainable from a true ‘business view’ of data. The way data attains its ‘currency’ as consumable business information follows a different path from the data management requirements for operational processing.

The paths of business research and validation are also quite different from the ‘stream management steps’ of operational data management. Business users need to be able to selectively ‘step into the stream’ to explore aspects of data that are not part of its original intended purpose. They need to discover things about the data extracted from multiple sources that have a material impact on risk calculations. Creating the proper framework of the information resource to support these endeavours is critical to success.

But there must also be practical and pragmatic controls in this environment, because the time individuals spend participating in this research comes at a very high premium. Defined and appropriate views are of paramount importance, and must provide enough of a ‘context description’ to enable people on the ‘shop floor’ to identify critical issues that have a meaningful impact on risk calculations.

The system must also be able to operate with a set of business rules that are quite different from the rules developed for monitoring operational data quality. The ‘business metrics’ that examine data for impact on risk calculations often need to utilise multiple tests on a single data item, and enable the prioritisation and ‘layering’ of data quality issues. These metrics must also be capable of combining and prioritising the impact of several data items to determine ‘threshold’ numbers of allowed data errors to be tracked.

The opportunity for business users to build optimal processes that enable firms to become proactive participants in the data accuracy validation process is exciting. It demonstrates the ability to positively impact capital adequacy and risk reserve requirements by ensuring data is accurate in actual business context.

As banks continue to develop processes to analyse the accuracy of the underlying data used to support risk reserve and capital adequacy calculations, they are building teams that bring together both business and technology perspectives to produce dramatic results with measurable bottom-line benefits. The dynamic of this new platform for understanding, which demonstrates the financial impact that data can have on risk operations, continues to grow, and improvements will continue to produce clear financial benefits. These results secure true business ‘buy-in’ and strong executive sponsorship. The consequences of business understanding of the benefits of improved data quality are very deliberate and ‘intentional’ – better results and stronger operational performance.
