By Rick Wilson, vice president of product strategy for Trillium Software
Not every result of the ‘law of unintended consequences’ is a negative outcome. For example, it is unlikely that the Basel regulations, set up to monitor the accuracy of financial performance reporting, were implemented with the deliberate intention of improving banks’ bottom lines. But that is exactly what is happening. Wherever the right business teams have been created to establish and measure the accuracy of business and credit risk data, the positive impact on institutions has been clear and compelling.
The immediate result of this new level of engagement by business teams has been the discovery of numerous opportunities to improve institutions’ operational performance by quantifying the financial impact of data gaps and errors. Given the proper environment for discovery, focused teams are able to spot spurious values, incorrect patterns and gaps in data processes that lead to punitive increases in risk reserve and capital adequacy calculations. New tools created to evaluate the accuracy of data used in risk and capital reserve calculations have in fact revealed huge reserves of ‘captive capital’ in financial institutions, tying up dollars that could otherwise be used for investment and business needs.
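To make this concrete, the sketch below shows, in simplified form, the kind of checks such a team might run on loan exposure records before they feed a capital calculation. The field names, patterns and tolerances are illustrative assumptions for this article, not an actual regulatory rule set or any vendor’s product logic.

```python
# Hypothetical data-quality checks on loan exposure records before they feed
# a capital adequacy calculation. Field names, patterns and tolerances are
# illustrative assumptions only.
import re

records = [
    {"loan_id": "LN-001", "exposure": 1_250_000.0, "rating": "BBB", "collateral": 900_000.0},
    {"loan_id": "LN-002", "exposure": -50_000.0, "rating": "A", "collateral": 40_000.0},   # spurious negative value
    {"loan_id": "LN003", "exposure": 730_000.0, "rating": "BB", "collateral": None},        # bad ID pattern, missing field
]

VALID_RATINGS = {"AAA", "AA", "A", "BBB", "BB", "B", "CCC"}
LOAN_ID_PATTERN = re.compile(r"^LN-\d{3}$")

def check_record(rec):
    """Return a list of data-quality issues found in a single record."""
    issues = []
    if not LOAN_ID_PATTERN.match(rec["loan_id"]):
        issues.append("loan_id does not match the expected pattern")
    if rec["exposure"] is None or rec["exposure"] <= 0:
        issues.append("exposure is missing or non-positive (spurious value)")
    if rec["rating"] not in VALID_RATINGS:
        issues.append("rating outside the recognised scale")
    if rec["collateral"] is None:
        issues.append("collateral value missing (data gap)")
    return issues

for rec in records:
    for issue in check_record(rec):
        print(f"{rec['loan_id']}: {issue}")
```

Errors of this kind, left undetected, force calculations onto conservative default assumptions, which is precisely what inflates the captive reserves described above.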
To reduce these excess captive reserves, the data upon which risk calculations are made must be provably correct, current, comprehensive and consistent – and validated by the business. It is here that increased regulation, and the demand to demonstrate transparent processes for measuring and reporting on data quality, has created the ‘unintended consequence’ of direct financial benefit.
Creating the ‘information environment’ to support business understanding and certification of data accuracy requires systems that enable more flexible, ‘inquiry-based’ access paths to data. Traditional methods of queries, spreadsheets and static BI reporting do not serve this purpose effectively. Fragmentation and lack of coherence in current processes limit the benefits obtainable from a true ‘business view’ of data. The way data attains its ‘currency’ as consumable business information follows a different path from the data management requirements of operational processing.
The paths of business research and validation are also quite different from the ‘stream management steps’ of operational data management. Business users need to be able to selectively ‘step into the stream’ to explore aspects of data that are not part of its original intended purpose. They need to discover things about the data, extracted from multiple sources, that have a material impact on risk calculations. Creating the proper framework for the information resource to support these endeavours is critical to success.
But there must also be practical and pragmatic controls in this environment, because the time individuals spend on this research comes at a very high premium. Defined and appropriate views are of paramount importance, and must provide enough ‘context description’ to enable people from the ‘shop floor’ to identify critical issues that have a meaningful impact on risk calculations.
The system must also be able to operate with a set of business rules that are quite different from the rules developed for monitoring operational data quality. The ‘business metrics’ that examine data for impact on risk calculations often need to apply multiple tests to a single data item, and to enable the prioritisation and ‘layering’ of data quality issues. These metrics must also be capable of combining and prioritising the impact of several data items to determine ‘threshold’ levels of allowable data errors to be tracked.
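One way to picture such layered metrics is sketched below: each data item carries several prioritised tests, the severities of any failures are combined across items, and the record is escalated only when the combined score breaches an allowed-error threshold. The weights, threshold value and tests here are assumptions made for illustration, not a prescribed Basel methodology.

```python
# Minimal sketch of layered business metrics: several prioritised tests per
# data item, combined across items to decide whether a record breaches an
# allowed-error threshold. Weights, threshold and tests are illustrative.

# Each rule: (field, test function, severity weight, description)
RULES = [
    ("exposure", lambda v: v is not None and v > 0, 3, "exposure must be positive"),
    ("exposure", lambda v: v is None or v < 5e8, 1, "exposure unusually large"),
    ("rating", lambda v: v in {"AAA", "AA", "A", "BBB", "BB", "B", "CCC"}, 2, "rating on recognised scale"),
    ("collateral", lambda v: v is not None, 2, "collateral value present"),
]

ERROR_THRESHOLD = 3  # combined severity above which a record is escalated

def score_record(record):
    """Apply every rule to the record; return (total severity, failed rule descriptions)."""
    failures = [(weight, desc) for field, test, weight, desc in RULES
                if not test(record.get(field))]
    total = sum(weight for weight, _ in failures)
    return total, [desc for _, desc in failures]

record = {"exposure": None, "rating": "XX", "collateral": 250_000.0}
severity, failed = score_record(record)
if severity > ERROR_THRESHOLD:
    print(f"Escalate: severity {severity} exceeds threshold {ERROR_THRESHOLD}: {failed}")
```

The point of the layering is that no single failed test triggers an alert on its own; it is the combined, business-weighted picture across several data items that determines whether a record is material to the risk calculation.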
The opportunity for business users to build processes that make firms proactive participants in validating data accuracy is exciting. It demonstrates the ability to positively affect capital adequacy and risk reserve requirements by ensuring data is accurate in its actual business context.
As banks continue to develop processes to analyse the accuracy of the underlying data used in risk reserve and capital adequacy calculations, they are building teams that combine business and technology perspectives to produce dramatic results with measurable bottom-line benefits. The dynamic of this new platform for understanding, which demonstrates the financial impact data can have on risk operations, continues to grow, and improvements will continue to produce clear financial benefits. These results engage true business ‘buy-in’ and strong executive sponsorship. The consequences of the business understanding the benefits of improved data quality are very deliberate and ‘intentional’ – better results and stronger operational performance.