Risk data aggregation became a hot topic after the financial crash in 2008, when it became clear that banks did not have the necessary risk information readily available to understand their exposures to counterparties and inform regulators of their positions. Senior managers were making poor decisions based on poor data and supervisory regulators failed to identify and address large concentrations of risk taken on by some banks.
In the wake of the crash, the need to rectify these problems became paramount and regulators issued a raft of regulations, including the Basel III Accord, the Dodd-Frank Act, European Market Infrastructure Regulation (EMIR), Markets in Financial Instruments Directive II (MiFID II), and Common Reporting (COREP) and Financial Reporting (FINREP), with a view to increasing control, reducing risk and improving transparency across capital markets.
To address the data management issues underlying these regulations, the Basel Committee introduced BCBS 239, the Principles for Effective Risk Data Aggregation and Risk Reporting. The compliance deadline for BCBS 239 is January 1st, 2016, and one specific aim of the regulation is to automate risk data aggregation, which in turn will support accurate, complete and timely risk data reporting.
While automated risk data aggregation is a must for global systemically important banks, which are the first tranche of banks subject to BCBS 239, it is also fundamental to all banks that must meet the risk management requirements of other regulations. Taking a wider business approach, successful risk data aggregation is not only important to regulatory compliance and avoiding penalties for non-compliance, but also to gaining a clear view of risk across the organisation. This will support benefits such as a better customer experience, improved business decisions based on accurate and timely information, reduced capital requirements and operational costs, and ultimately, increased profitability.
You can find more detail about risk data aggregation and BCBS 239 in A-Team Group’s BCBS 239 Handbook and recent white paper, Navigating BCBS 239 and the New Stress-Testing Regime.
For many financial firms, the biggest challenge to achieving seamless risk data aggregation is the problem of data silos that have been built over time to support specific business lines or are the result of mergers and acquisitions. Multiple legacy systems, incomplete data architectures and manual intervention in risk and reporting processes add to the problem. Inconsistent terminology used to categorise and classify data compounds it further, making it difficult to integrate and aggregate datasets quickly and efficiently across business lines.
Discussing these types of problems at a recent A-Team Data Management Summit in New York City, Tom Stock, senior vice president of product management at GoldenSource, said: “The process of aggregating risk data is a challenge for large organisations. They need to understand their data elements and have a comprehensive data dictionary that covers all operational systems that generate data as well as all data domains across areas such as customer data, security masters, counterparty data, positions and transactions data. This data is often in different operational and transaction systems supporting different asset classes. Being able to understand and cross-reference the data is a big challenge and to do this firms need to manage the lifecycle of data and cleanse, normalise and combine it to provide a single view of risk that can be used across the organisation.”
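The cross-referencing and combining that Stock describes can be illustrated with a minimal sketch. All of the silo names, identifiers and figures below are hypothetical, and a canonical Legal Entity Identifier (LEI) style key is assumed as the common reference; real implementations sit on top of full data dictionaries and entity-resolution tooling.

```python
# Illustrative only: two hypothetical silos hold exposures under their own
# local counterparty identifiers; a cross-reference maps each local ID to a
# canonical identifier so exposures can be combined into one view of risk.

equities_silo = [
    {"cpty": "ACME-EQ-01", "exposure": 5_000_000},
    {"cpty": "GLOBEX-EQ-07", "exposure": 2_500_000},
]
derivatives_silo = [
    {"cpty": "C-0042", "exposure": 1_200_000},  # same firm as ACME-EQ-01
    {"cpty": "C-0099", "exposure": 800_000},
]

# The data dictionary's cross-reference: local ID -> canonical ID.
xref = {
    "ACME-EQ-01": "LEI-ACME",
    "C-0042": "LEI-ACME",
    "GLOBEX-EQ-07": "LEI-GLOBEX",
    "C-0099": "LEI-C99",
}

def aggregate(silos, xref):
    """Combine per-silo exposures into a single view keyed by canonical ID."""
    totals = {}
    for silo in silos:
        for record in silo:
            canonical = xref[record["cpty"]]
            totals[canonical] = totals.get(canonical, 0) + record["exposure"]
    return totals

single_view = aggregate([equities_silo, derivatives_silo], xref)
print(single_view["LEI-ACME"])  # 6200000
```

The point of the sketch is the cross-reference step: without a maintained mapping from each operational system's local identifiers to a canonical one, the two exposures to the same firm would never be netted into a single figure.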
Data quality is also a potential problem, as disparate data silos are likely to manage risk data at different levels of granularity and accuracy. The traditional divide between risk and finance functions adds to the challenge, as it leaves firms without the integrated infrastructure needed for efficient and effective risk data aggregation. The combination of these issues can result in slow and sometimes incomplete or inaccurate risk data aggregation, outcomes that pose the same questions about the veracity of a firm's risk exposure that risk data aggregation is supposed to answer.
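A data quality process typically screens records before they enter aggregation. The sketch below shows the general idea with two hypothetical rules, completeness of required fields and a sign check on exposures; the field names and rules are illustrative assumptions, not a prescribed rule set.

```python
# Illustrative only: simple completeness and validity checks of the kind a
# risk data quality process might apply before aggregation.

REQUIRED_FIELDS = ("cpty", "exposure", "asof")

def quality_check(record):
    """Return a list of rule violations for one risk record (empty = clean)."""
    issues = []
    for field in REQUIRED_FIELDS:
        if record.get(field) in (None, ""):
            issues.append(f"missing {field}")
    exposure = record.get("exposure")
    if isinstance(exposure, (int, float)) and exposure < 0:
        issues.append("negative exposure")
    return issues

records = [
    {"cpty": "LEI-ACME", "exposure": 1_000_000, "asof": "2015-12-01"},
    {"cpty": "", "exposure": -50, "asof": "2015-12-01"},
]

clean = [r for r in records if not quality_check(r)]
flagged = [(r, quality_check(r)) for r in records if quality_check(r)]
print(len(clean), len(flagged))  # 1 1
```

Routing failed records to a remediation queue, rather than silently dropping them, is what keeps the resulting aggregate view both accurate and complete.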
Financial firms are addressing the challenge of risk data aggregation in different ways. Some are taking a short-term tactical approach, others a longer-term strategic approach, but wherever firms start, risk data aggregation will be an ongoing process rather than a point solution.
A strategic approach needs to consider the challenges described by Stock and implement best practice that includes policies covering data management and risk data aggregation, practical implementation of data architecture, data dictionaries and consistent data definitions, a data quality management and remediation process, and a cultural shift to encourage understanding of risk data and ownership across the organisation.
Picking up on some of these points, Maryann Houglet of Tata Consultancy Services' global consultancy practice, who covers information strategy and architecture, suggested at the Data Management Summit that a strategic approach to risk data aggregation could include a data strategy for risk management at a senior executive level, the capability to manage dynamic data, and cultural change to support centralised risk data management.
State of Play
While many banks are still struggling with the first steps of strategy and data management for risk data aggregation, it is becoming a 'must do'. Global systemically important banks must demonstrate their aggregation capabilities as part of compliance with BCBS 239 next month, and domestic systemically important banks will become subject to the regulation three years after their designation as systemically important. Other banks are also expected to chase the benefits of risk data aggregation and respond to market demand for increased transparency. While this is the goal, most banks are starting small and thinking big, building out risk data aggregation in phases as they move towards full regulatory compliance, or developing data aggregation projects while building an understanding of risk data across the organisation.