About A-Team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Feature: Sizing the Problem and Finding Solutions for Risk Data Aggregation

Risk data aggregation became a hot topic after the financial crash in 2008, when it became clear that banks did not have the necessary risk information readily available to understand their exposures to counterparties and inform regulators of their positions. Senior managers were making poor decisions based on poor data and supervisory regulators failed to identify and address large concentrations of risk taken on by some banks.

In the wake of the crash, the need to rectify these types of problems became paramount and regulators issued a raft of regulations, including the Basel III Accord, the Dodd-Frank Act, European Market Infrastructure Regulation, Markets in Financial Instruments Directive II (MiFID II), and Common Reporting and Financial Reporting, with a view to increasing control, reducing risk and improving transparency across capital markets.

To address the data management issues underlying these regulations, the Basel Committee introduced the BCBS 239 Principles for Risk Data Aggregation and Risk Reporting. The compliance deadline for BCBS 239 is 1 January 2016 and one specific aim of the regulation is to automate risk data aggregation. In turn, this will support accurate, complete and timely risk data reporting.

While automated risk data aggregation is a must for global systemically important banks, which are the first tranche of banks subject to BCBS 239, it is also fundamental to all banks that must meet the risk management requirements of other regulations. Taking a wider business approach, successful risk data aggregation is not only important to regulatory compliance and avoiding penalties for non-compliance, but also to gaining a clear view of risk across the organisation. This will support benefits such as a better customer experience, improved business decisions based on accurate and timely information, reduced capital requirements and operational costs, and ultimately, increased profitability.

You can find more detail about risk data aggregation and BCBS 239 in A-Team Group’s BCBS 239 Handbook and recent white paper, Navigating BCBS 239 and the New Stress-Testing Regime.

The Challenge

For many financial firms, the biggest challenge to achieving seamless risk data aggregation is the problem of data silos that have been built over time to support specific business lines or are the result of mergers and acquisitions. Multiple legacy systems, incomplete data architectures and manual intervention in risk and reporting processes add to the problem, as does inconsistent terminology used to categorise and classify data, which makes it difficult to integrate and aggregate datasets quickly and efficiently across business lines.

Discussing these types of problems at a recent A-Team Data Management Summit in New York City, Tom Stock, senior vice president of product management at GoldenSource, said: “The process of aggregating risk data is a challenge for large organisations. They need to understand their data elements and have a comprehensive data dictionary that covers all operational systems that generate data as well as all data domains across areas such as customer data, security masters, counterparty data, positions and transactions data. This data is often in different operational and transaction systems supporting different asset classes. Being able to understand and cross-reference the data is a big challenge and to do this firms need to manage the lifecycle of data and cleanse, normalise and combine it to provide a single view of risk that can be used across the organisation.”
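To make the cross-referencing step concrete, the cleanse-normalise-combine process Stock describes can be sketched in a few lines of Python. This is an illustration only, not GoldenSource's implementation: the silo extracts, counterparty names and cross-reference dictionary below are all hypothetical.

```python
# Illustrative sketch: aggregating counterparty exposures from two data silos.
# All names and figures below are hypothetical.

# Raw extracts from two operational systems, using inconsistent labels
# for the same counterparties.
equities_silo = [
    {"counterparty": "ACME Corp.", "exposure": 1_500_000.0},
    {"counterparty": "Beta Bank", "exposure": 250_000.0},
]
derivatives_silo = [
    {"counterparty": "Acme Corporation", "exposure": 750_000.0},
    {"counterparty": "BETA BANK PLC", "exposure": 100_000.0},
]

# A minimal "data dictionary": maps each silo-specific label to a
# canonical counterparty identifier.
cross_reference = {
    "ACME Corp.": "ACME",
    "Acme Corporation": "ACME",
    "Beta Bank": "BETA",
    "BETA BANK PLC": "BETA",
}

def aggregate(*silos):
    """Normalise labels via the cross-reference, then sum exposures
    to give a single view of risk per counterparty."""
    totals = {}
    for silo in silos:
        for record in silo:
            canonical = cross_reference[record["counterparty"]]
            totals[canonical] = totals.get(canonical, 0.0) + record["exposure"]
    return totals

single_view = aggregate(equities_silo, derivatives_silo)
# single_view now holds one aggregated exposure per canonical counterparty
```

In practice the cross-reference covers many data domains and millions of records, but the principle is the same: without an agreed mapping from each system's labels to canonical identifiers, exposures to the same counterparty cannot be summed.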

Data quality is also a potential problem as disparate data silos are likely to manage risk data at different levels of granularity and accuracy. The traditional divide between risk and finance functions adds to the challenge as it leaves a lack of integrated infrastructure suited to efficient and effective risk data aggregation. The combination of these issues can result in slow and sometimes incomplete or inaccurate risk data aggregation, outcomes that pose the same questions about the veracity of a firm's risk exposure that risk data aggregation is supposed to answer.

The Solution

Financial firms are addressing the challenge of risk data aggregation in different ways. Some are taking a short-term tactical approach, others a longer-term strategic approach, but wherever firms start, risk data aggregation will be an ongoing process rather than a point solution.

A strategic approach needs to consider the challenges described by Stock and implement best practice that includes policies covering data management and risk data aggregation, practical implementation of data architecture, data dictionaries and consistent data definitions, a data quality management and remediation process, and a cultural shift to encourage understanding of risk data and ownership across the organisation.
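As an illustration of the data quality management and remediation step mentioned above, a firm might routinely flag records that violate its agreed data definitions before they enter the aggregation process. The rules and sample records in this Python sketch are hypothetical, not drawn from any specific vendor or bank:

```python
# Illustrative data quality check: flag risk records that break
# the firm's consistent data definitions. Rules are hypothetical.

REQUIRED_FIELDS = {"counterparty_id", "asset_class", "exposure"}
VALID_ASSET_CLASSES = {"equity", "fixed_income", "derivative"}

def quality_issues(record):
    """Return a list of rule violations for one risk record;
    an empty list means the record passes."""
    issues = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        issues.append(f"missing fields: {sorted(missing)}")
    if record.get("asset_class") not in VALID_ASSET_CLASSES:
        issues.append(f"unknown asset class: {record.get('asset_class')}")
    exposure = record.get("exposure")
    if not isinstance(exposure, (int, float)) or exposure < 0:
        issues.append("exposure must be a non-negative number")
    return issues

records = [
    {"counterparty_id": "ACME", "asset_class": "equity", "exposure": 1_000_000.0},
    {"counterparty_id": "BETA", "asset_class": "swap", "exposure": -5.0},
]
flagged = {r["counterparty_id"]: quality_issues(r) for r in records}
# Clean records pass with no issues; records using undefined asset
# classes or invalid exposures are flagged for remediation
```

Checks of this kind only work when the definitions they enforce, such as the list of valid asset classes, are owned and kept consistent across the organisation, which is why the policy and cultural elements above matter as much as the tooling.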

Picking up on some of these points, Maryann Houglet, information strategy and architecture, Tata Consultancy Services global consultancy practice, suggested at the Data Management Summit that a strategic approach to risk data aggregation could include a data strategy for risk management at a senior executive level, capability to manage dynamic data and cultural change to support centralised risk data management.

State of Play

While many banks are still struggling with the first steps of strategy and data management for risk data aggregation, it is becoming a ‘must do’. Global systemically important banks must demonstrate their aggregation capabilities as part of compliance with BCBS 239 next month, and domestic systemically important banks will become subject to the regulation three years after designation as systemically important banks. Other banks are also expected to chase the benefits of risk data aggregation and respond to market demand for increased transparency. While this is the goal, most banks are starting small and thinking big, building out risk data aggregation in phases as they move towards full regulatory compliance or develop data aggregation projects while ensuring an understanding of risk data across the organisation.
