A Web of Complexity

By Martijn Groot, vice president of product management, Asset Control

At the centre of financial services risk management and regulatory compliance is the convergence of data from a huge range of information silos, departments, product lines, customers and risk categories. Risk and finance are at the end of an extensive chain, examining the consolidated threads, uncovering issues and spotting patterns.

This is where many organisations run into difficulty, unable to resolve the enormous data integration task. Without good data management, obtaining the right information to provide to regulators is like trawling muddy waters: there will always be uncertainty about the resulting catch and its quality.

Harmonies across regulatory data requirements

The banking sector remains in the midst of reform, with investment in risk governance and integration continuing to rise. The volume of regulatory alerts has risen sharply in recent years: in 2008, there were 8,704 regulatory alerts; by 2015, that figure had jumped to over 43,000. That equates to a piece of news on new regulation, a standards update, a quantitative impact study (QIS), a policy document or a consultation roughly every 12 minutes.
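To sanity-check that cadence, here is a quick back-of-the-envelope calculation; the figures come from the paragraph above and the arithmetic is purely illustrative:

```python
# Rough check of the "every 12 minutes" claim using the 2015 figure.
alerts_2015 = 43_000               # "over 43,000" regulatory alerts in 2015
minutes_per_year = 365 * 24 * 60   # 525,600 minutes in a year

print(f"{minutes_per_year / alerts_2015:.1f} minutes between alerts")  # ~12.2
```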

The biggest growth in jobs has been in risk professionals hired to deal with successive waves of regulatory change. Across these waves, there are common themes: different product taxonomies and client classifications; unambiguous identification; additional data context; links between related elements; and, generally, requirements on audit and lineage. Regulators are examining the quality of risk information chains under BCBS 239, stress testing programmes are encouraging a reconsideration of processes across silos, and failure to meet the requirements of the Fundamental Review of the Trading Book (FRTB) can subject banks to increased capital charges.

Sourcing and integrating data

Regulators' zero-tolerance approach towards poor data management means banks need to source and integrate market data efficiently, derive and track risk factor histories, and manage data quality proactively. The back end, however, is only part of the story. Equally important is managing the consumption and standardisation of source data, and controlling the process and distribution – with transparent dashboards for quality metrics and ease of reporting.

The importance of a consistent data model to anchor the processes and business rules should not be underestimated. Sourcing clean market data remains a crucial challenge for risk departments, and too often valuable quant time is wasted on data formatting and cleaning.
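As a rough illustration of what anchoring rules to a consistent model can mean in practice, the sketch below defines one minimal record shape for a market data observation and attaches a single validation rule to it. The field names and the 25% day-on-day threshold are assumptions invented for this example, not a prescribed schema:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class PricePoint:
    """Minimal illustrative record for one market data observation."""
    instrument_id: str   # unambiguous identifier, e.g. an ISIN
    source: str          # originating vendor, venue or desk
    as_of: date          # observation date
    price: float
    currency: str

def is_suspect(prev: PricePoint, curr: PricePoint, max_move: float = 0.25) -> bool:
    """Flag a price as suspect if it moves more than max_move day on day.
    A real rule set would be far richer; the point is that one rule applies
    uniformly because every silo's data lands in the same model."""
    if prev.price <= 0:
        return True
    return abs(curr.price - prev.price) / prev.price > max_move

prev = PricePoint("US0378331005", "vendor_a", date(2017, 3, 1), 100.0, "USD")
curr = PricePoint("US0378331005", "vendor_b", date(2017, 3, 2), 140.0, "USD")
print(is_suspect(prev, curr))  # True: a 40% move is queued for review
```

Because both observations share one shape, the rule does not care which vendor or silo supplied them.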

Regulators want to see good data management and provenance, so that any result can be traced back to its source. A more structural approach to market data sourcing, quality management and market data operations generally is therefore vital.
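One simple way to picture provenance is a lineage trail carried alongside every derived value, recording its source and each transformation applied. The structure below is a hypothetical sketch, not a reference implementation:

```python
from dataclasses import dataclass, field

@dataclass
class Lineage:
    """Illustrative lineage record: where a value came from, what touched it."""
    source: str                                     # original feed or system
    steps: list[str] = field(default_factory=list)  # transformations, in order

    def record(self, step: str) -> None:
        self.steps.append(step)

lin = Lineage(source="vendor_a/eod_prices")
lin.record("converted EUR -> USD at reference rate")
lin.record("proxy applied: sector index (no trade observed)")
print(lin)  # a trail an auditor can follow back to the original source
```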

Developing a common data understanding

There may be many complications around regulatory change management, but there is also a range of tools available to assist, from cross-referencing and mapping strategies to smart sourcing practices. Banks and their suppliers need to overcome the inherent difficulties and collaborate, using aligned data dictionaries. Internal agreement on terminology can be challenging enough: unstandardised acronyms and desk-by-desk conventions are a frequent source of data inconsistency. Semantics are critical to creating a single version of the truth, which is why data governance has become an industry in its own right.
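A cross-referencing layer can start as simply as a mapping table that resolves every vendor's or desk's local code to one internal identifier, so that unmapped codes surface as a data quality issue rather than silently diverging. The source names and codes below are invented purely for illustration:

```python
# Hypothetical cross-reference: (source, local code) -> internal identifier.
XREF = {
    ("vendor_a", "AAPL.O"):       "INST-000001",
    ("vendor_b", "US0378331005"): "INST-000001",  # same instrument, keyed by ISIN
    ("rates_desk", "APPLE 30Y"):  "INST-000002",
}

def resolve(source: str, local_code: str) -> str:
    """Map a source-specific code to the internal identifier, failing loudly."""
    try:
        return XREF[(source, local_code)]
    except KeyError:
        raise KeyError(f"no mapping for {local_code!r} from {source!r}") from None

# Two different vendor codes resolve to one version of the truth.
print(resolve("vendor_a", "AAPL.O"))        # INST-000001
print(resolve("vendor_b", "US0378331005"))  # INST-000001
```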

Regulators are digging ever deeper into data detail. Evidence of this, aside from FRTB, lies in the rule-making of Dodd-Frank and the technical standards of Markets in Financial Instruments Directive II (MiFID II). This brings us to the importance of risk and finance functions in the whole process, as the largest stakeholders in data governance and data quality. Positioned at the information convergence point, they ensure each strand is accurate in itself and allocated correctly. As such, they must lead the way when it comes to good data practices.

A bank’s ability to live up to the risk data aggregation and reporting principles of BCBS 239 therefore rests on transparency and integration capabilities. The specific reports, metrics and regulatory destinations may differ, but much of the input is the same. Good infrastructure establishes common ground across regulatory mandates, whether BCBS 239, FRTB or MiFID II, and as a result it becomes clear what external data is required for the net sum of regulation.
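That net sum can be pictured as a set union over per-regulation data requirements: source each item once and reuse it across every mandate. The requirement lists below are simplified assumptions for the sake of the sketch:

```python
# Hypothetical external data requirements per regulatory mandate.
REQUIREMENTS = {
    "BCBS 239": {"instrument_id", "risk_factor_history", "lineage"},
    "FRTB":     {"instrument_id", "risk_factor_history", "traded_prices", "quotes"},
    "MiFID II": {"instrument_id", "traded_prices", "venue", "client_classification"},
}

# The net sum of regulation: the union of all mandates' requirements.
net_sum = set().union(*REQUIREMENTS.values())
print(sorted(net_sum))  # each item sourced once, reused everywhere
```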

The need for strong data infrastructure

As organisations, whether financial services firms or third-party vendors, scramble to adjust their practices to the latest regulatory requirements, an overriding theme is emerging: data infrastructure. From a bank’s perspective, creating robust data infrastructure is the best place to start when preparing for regulation. A solid foundation for compliance with the new market risk framework of FRTB, covering prices, traded prices, quotes, risk factors and quality assessments, is a precondition of effective change management. This is relevant for adjacent regulations such as MiFID II and Packaged Retail and Insurance-based Investment Products (PRIIPs), but also for the business overall.

First and foremost, banks should implement best practices in the collection, cross-referencing and integration of data, before moving on to data quality workflows such as controlling and tracking proxies. Joined-up data may be the most significant shared aim of regulatory regimes – but it has yet to be adequately addressed.

Data cohesion requires a common data model for product terms and conditions, prices and risk factors, together with business rules for the look-up and classification work involved. At the same time, banks need the ability to configure distributions to risk and valuation systems, and to use dashboards to monitor data volumes and suspect values and to track proxies and other actions taken on data. This completes the information supply chain before the data hits consuming systems.
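As a small illustration of the monitoring side, the snippet below aggregates the kind of counts a quality dashboard might display per feed or asset class. The status labels (clean, suspect, proxied) are assumptions made for this sketch:

```python
from collections import Counter

# Hypothetical end-of-day statuses emitted by a validation workflow.
statuses = ["clean", "clean", "suspect", "proxied", "clean", "proxied", "suspect"]

counts = Counter(statuses)
total = sum(counts.values())

# Simple per-status metrics for a quality dashboard.
for status, n in sorted(counts.items()):
    print(f"{status:8s} {n:3d}  ({n / total:.0%})")
# Counting proxies explicitly makes their use auditable rather than invisible.
```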

Conclusion

Ultimately, the most effective way to manage regulatory change is through the accurate collection, controlled sourcing, cross-referencing and integration of data as a foundation. This addresses common regulatory demands around taxonomies, classifications, unambiguous identification, additional data context, links between related elements and general requirements on audit and lineage.

Compliance with modern financial services regulation cannot be treated as a box-ticking exercise. To avoid regulatory change management taking over other projects, firms need to get their data management capabilities in order first – and taking the risk out of their risk data is a critical step in that process.
