The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

Basel II and the Law of Unintended Consequence: Leading to Direct Financial Benefit

By Rick Wilson, vice president of product strategy for Trillium Software

Not every result from the ‘law of unintended consequences’ is a negative outcome. For example, it’s unlikely that the Basel regulations, set up to monitor the accuracy of financial performance reporting, were implemented with the deliberate intention of improving banks’ bottom lines. But that is exactly what is happening. The positive impact on institutions has been clear and compelling in places where the right business teams have been created to establish and measure business data and credit risk data accuracy.

The immediate result from this new level of engagement by business teams has been the discovery of numerous opportunities to improve institutions’ operational performance by realising the financial impact of data gaps and errors. Given the proper environment for discovery, focused teams are able to spot spurious values, incorrect patterns and gaps in data processes which lead to punitive increases in risk reserve and capital adequacy calculations. New tools created to evaluate the accuracy of data used in risk and capital reserve calculations have actually led to the discovery of huge reserves of ‘captive capital’ in financial institutions that tie up dollars which could be used for investment and business needs.
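The kind of automated screen described above — flagging spurious values and gaps before they inflate risk reserve calculations — can be sketched as follows. This is a minimal illustration only; the field names, valid-rating set and sample records are hypothetical, not drawn from any particular institution's data model:

```python
# Hypothetical data-quality screen for credit risk records.
# Field names, valid values and sample data are illustrative only.

RECORDS = [
    {"counterparty_id": "CP001", "exposure": 1_250_000.0, "rating": "BBB"},
    {"counterparty_id": "CP002", "exposure": -50_000.0,   "rating": "A"},   # spurious negative value
    {"counterparty_id": "CP003", "exposure": 900_000.0,   "rating": None},  # gap: missing rating
]

VALID_RATINGS = {"AAA", "AA", "A", "BBB", "BB", "B", "CCC"}

def find_issues(records):
    """Return (counterparty_id, issue) pairs for records that would
    otherwise flow unchecked into risk-reserve calculations."""
    issues = []
    for rec in records:
        if rec["exposure"] is None or rec["exposure"] < 0:
            issues.append((rec["counterparty_id"], "spurious exposure value"))
        if rec["rating"] not in VALID_RATINGS:
            issues.append((rec["counterparty_id"], "missing or invalid rating"))
    return issues

for cp, issue in find_issues(RECORDS):
    print(f"{cp}: {issue}")
```

In practice such checks would run against feeds from multiple source systems, with the flagged records routed to the business teams for validation rather than silently treated as errors.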

To reduce these excess captive reserves, the data upon which risk calculations are made must be provably correct, current, comprehensive and consistent – and validated by the business. It’s here where increased regulation, and the demand for demonstrating transparent processes for measuring and reporting on data quality, has created the ‘unintended consequence’ of direct financial benefit.

Creating the ‘information environment’ to support business understanding and certification of data accuracy requires systems that enable more flexible and ‘inquiry based’ access paths to data. Traditional methods of queries, spreadsheets and static BI reporting do not serve this purpose effectively. Fragmentation and lack of coherence in current processes limit the benefits obtainable from a true ‘business view’ of data. The way data attains its ‘currency’ as consumable business information follows a different path from the data management requirements for operational processing.

The paths of business research and validation are also quite different from the ‘stream management steps’ of operational data management. Business users need to be able to selectively ‘step into the stream’ to explore aspects of data that are not part of its original intended purpose. They need to discover things about the data extracted from multiple sources that have a material impact on risk calculations. Creating the proper framework of the information resource to support these endeavours is critical to success.

But there must also be practical and pragmatic controls in this environment, because the time individuals spend participating in this research comes at a very high premium. Defined and appropriate views are of paramount importance, and must provide enough of a ‘context description’ to enable people from the ‘shop floor’ to identify critical issues which have a meaningful impact on risk calculations.

The system must also be able to operate with a set of business rules that are quite different from the rules developed for monitoring operational data quality. The ‘business metrics’ that examine data for impact on risk calculations often need to utilise multiple tests on a single data item, and enable the prioritisation and ‘layering’ of data quality issues. These metrics must also be capable of combining and prioritising the impact of several data items to determine ‘threshold’ numbers of allowed data errors to be tracked.
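A business metric of the kind described — several prioritised tests applied to one data item, with failures combined against a ‘threshold’ of allowed errors — might be sketched like this. All rule names, severity weights and the error budget here are hypothetical assumptions for illustration:

```python
# Illustrative 'business metric': multiple prioritised tests on a single
# record, combined against a weighted error threshold. Rule names,
# severities and the budget are hypothetical.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    name: str
    severity: int                 # higher = greater impact on risk calculations
    test: Callable[[dict], bool]  # returns True if the record passes

RULES = [
    Rule("exposure_present",  severity=3, test=lambda r: r.get("exposure") is not None),
    Rule("exposure_positive", severity=2, test=lambda r: (r.get("exposure") or 0) >= 0),
    Rule("rating_present",    severity=1, test=lambda r: r.get("rating") is not None),
]

ERROR_BUDGET = 4  # maximum combined severity of failures tolerated per record

def score(record):
    """Apply every rule; return (total severity of failures, failed rule names)."""
    failures = [rule for rule in RULES if not rule.test(record)]
    return sum(rule.severity for rule in failures), [rule.name for rule in failures]

def within_threshold(record):
    total, _ = score(record)
    return total <= ERROR_BUDGET
```

Layering falls out naturally: the severity weights let low-impact gaps accumulate within the budget while a single high-impact failure, or a combination of smaller ones, pushes a record over the threshold for review.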

The opportunity for business users to build optimal processes that enable firms to become proactive participants in the data accuracy validation process is exciting. It demonstrates the ability to positively impact capital adequacy and risk reserve requirements by ensuring data is accurate in actual business context.

As banks continue to develop processes to analyse the accuracy of the underlying data used to support risk reserve and capital adequacy calculations, they are building teams that bring together both business and technology perspectives to produce dramatic results with measurable bottom-line benefits. The dynamic of this new platform for understanding, which demonstrates the financial impact that data can have on risk operations, continues to grow, and improvements will continue to produce clear financial benefits. These exciting results generate true business ‘buy-in’ and strong executive sponsorship. The consequences of business understanding of the benefits of improved data quality are very deliberate and ‘intentional’ – better results and stronger operational performance.
