The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

A Web of Complexity

By Martijn Groot, vice president of product management, Asset Control

At the centre of financial services risk management and regulatory compliance is the convergence of data from a huge range of information silos, departments, product lines, customers and risk categories. Risk and finance are at the end of an extensive chain, examining the consolidated threads, uncovering issues and spotting patterns.

This is where many organisations run into difficulty, unable to resolve the enormous data integration task. Without good data management, obtaining the right information to provide to regulators is like trawling muddy waters: there will always be uncertainty about the resulting catch and its quality.

Harmonies across regulatory data requirements

The banking sector remains in the midst of reform, with investment in risk governance and integration continuing to rise. The volume of regulatory alerts has risen sharply in recent years. In 2008, there were 8,704 regulatory alerts; in 2015, that figure jumped to over 43,000. This equates to a piece of news on new regulation, a standards update, a quantitative impact study (QIS), a policy document or a consultation every 12 minutes.

The biggest growth in jobs has been in risk professionals hired to deal with successive waves of regulatory change. Across these changes there are common themes: different product taxonomies and client classifications; unambiguous identification; additional data context; links between related elements; and, generally, requirements on audit and lineage. Regulators are already examining the quality of risk information chains under BCBS 239, stress testing programmes are encouraging a reconsideration of processes across silos, and failure to meet the requirements of the Fundamental Review of the Trading Book (FRTB) can subject banks to increased capital charges.

The sourcing and integrating of data

The arrival of a regulatory zero tolerance approach towards poor data management means banks need to source and integrate market data efficiently, derive and track risk factor histories, and manage data quality proactively. The back end, however, is only part of the story. Equally important is managing the consumption and standardisation of source data, and controlling the process and distribution, with transparent dashboards for quality metrics and ease of reporting.

The importance of a consistent data model to anchor the processes and business rules should not be underestimated. Sourcing clean market data continues to be a crucial challenge for risk departments, and poor sourcing can consume valuable quant time in data formatting and cleaning.
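To illustrate what anchoring on a consistent data model can mean in practice, the sketch below keys instruments, prices and risk factors to a single internal identifier. All class and field names here are hypothetical, chosen only to show the idea, not any particular vendor's schema.

```python
from dataclasses import dataclass
from datetime import date

# Illustrative minimal data model: every record carries the same
# internal identifier, so prices and risk factors sourced from
# different vendors can be joined and audited consistently.

@dataclass(frozen=True)
class Instrument:
    internal_id: str   # bank-wide identifier anchoring all records
    isin: str          # external identifier used for cross-referencing
    asset_class: str   # common product taxonomy, e.g. "equity"

@dataclass(frozen=True)
class PriceObservation:
    internal_id: str
    as_of: date
    price: float
    source: str        # vendor feed the observation came from

@dataclass(frozen=True)
class RiskFactor:
    internal_id: str
    factor_type: str   # e.g. "volatility", "credit_spread"
    history: tuple     # time series of (date, value) pairs
```

Keying every entity on the same identifier is what makes downstream aggregation, lineage tracing and cross-silo reporting tractable.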

Regulators want to see good data management and provenance in order to trace where a result comes from. Therefore, a more structural approach to market data sourcing, quality management and any market data operations is vital.

Developing a common data understanding

There may be many complications around regulatory change management, but there is also a range of tools available to assist, from developing a cross-referencing and mapping strategy to developing smart sourcing practices. Banks and their suppliers need to overcome the inherent difficulties and collaborate, using aligned data dictionaries. Internal agreement on terminology can often be challenging enough, with a lack of standardisation for acronyms or between desks often being the source of data inconsistency. Semantics are critical to creating a single version of the truth, which is why data governance has become an industry in its own right.
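A cross-referencing and mapping strategy of the kind described above can be sketched very simply: vendor-specific symbols are resolved to one internal identifier through a mapping table, so the same instrument is never counted twice. The identifiers and vendor names below are hypothetical.

```python
# Illustrative cross-reference table: (source, symbol) -> internal id.
# All vendor names and symbols are invented for the example.
XREF = {
    ("vendor_a", "AAPL.O"): "INT001",
    ("vendor_b", "US0378331005"): "INT001",
    ("vendor_a", "VOD.L"): "INT002",
}

def resolve(source: str, symbol: str) -> str:
    """Map a (source, symbol) pair to the internal identifier,
    raising on unmapped symbols so coverage gaps surface early."""
    try:
        return XREF[(source, symbol)]
    except KeyError:
        raise KeyError(f"No cross-reference for {symbol!r} from {source!r}")

# Two different vendor symbols resolve to the same instrument:
assert resolve("vendor_a", "AAPL.O") == resolve("vendor_b", "US0378331005")
```

Failing loudly on unmapped symbols, rather than silently creating a new entry, is one way a shared data dictionary surfaces the terminology inconsistencies between desks that the text describes.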

Regulators are delving ever deeper into data detail. Evidence of this, aside from FRTB, is in the rule-making of Dodd-Frank and the technical standards of Markets in Financial Instruments Directive II (MiFID II). This brings us to the importance of risk and finance in the whole process, as the largest stakeholders for data governance and data quality. Positioned at the information convergence point, they ensure each strand is accurate in itself and allocated correctly. As such, they must lead the way when it comes to good data practices.

A bank’s ability to live up to the expectations of the risk data aggregation and reporting principles within BCBS 239 plays a pivotal role through the provision of transparency and integration capabilities. The specific reports, metrics and regulatory destinations may differ, but much of the input is the same. Good infrastructure establishes common ground for regulatory mandates, whether BCBS 239, FRTB or MiFID II, and as a result, it becomes clear what external data is required for the net sum of regulation.

The need for strong data infrastructure

As organisations, whether financial services firms or third-party vendors, scramble to adjust their practices to the latest regulatory requirements, an overriding theme is emerging: data infrastructure. From a bank's perspective, creating robust data infrastructure is the best place to start when preparing for regulation. A solid foundation for compliance with the new market risk framework of FRTB, covering prices, traded prices, quotes, risk factors and quality assessments, is a precondition of effective change management. This is relevant not only for adjacent regulations such as MiFID II and Packaged Retail and Insurance-based Investment Products (PRIIPs), but for the business overall.

First and foremost, banks should look at implementing best practices in the collection, cross-referencing and integration of data, before moving on to look at data quality workflows, such as controlling and tracking proxies. Joined-up data may be the most significant shared aim of regulatory regimes – but it is yet to be adequately addressed.

Data cohesion requires a common data model for product terms and conditions, prices and risk factors, as well as business rules for the look-up and classification work involved. At the same time, banks need the ability to configure distributions to risk and valuation systems, and to use dashboards to monitor data volumes and suspects and to track proxies or other actions taken on data. This completes the information supply chain before the data hits consuming systems.
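The quality workflow described here, flagging suspect prices, substituting proxies and recording every action for dashboards and lineage, can be sketched as follows. The staleness threshold, field layout and function names are assumptions for illustration only.

```python
# Illustrative data quality workflow: missing or stale prices are
# flagged as suspects, a proxy is substituted where one is configured,
# and every action is appended to an audit log so dashboards can
# report volumes, suspects and proxy usage.

STALE_LIMIT = 5  # assumed: business days a price may go unchanged

def apply_quality_rules(observations, proxies, audit_log):
    """Return cleaned prices; record suspects and proxy use in audit_log.

    observations: {instrument_id: (price_or_None, days_unchanged)}
    proxies:      {instrument_id: proxy_price}
    """
    cleaned = {}
    for inst_id, record in observations.items():
        price, days_unchanged = record
        if price is None or days_unchanged > STALE_LIMIT:
            audit_log.append((inst_id, "suspect", record))
            if inst_id in proxies:
                cleaned[inst_id] = proxies[inst_id]
                audit_log.append((inst_id, "proxy_used", proxies[inst_id]))
        else:
            cleaned[inst_id] = price
    return cleaned

log = []
obs = {"INT001": (131.01, 1), "INT002": (None, 0), "INT003": (42.0, 9)}
result = apply_quality_rules(obs, proxies={"INT002": 130.5}, audit_log=log)
# INT001 passes; INT002 receives its proxy; INT003 is held back as a
# suspect with no proxy configured.
```

Because the audit log captures both the suspect flag and the proxy substitution, the same record feeds quality dashboards and satisfies lineage questions about where a reported number came from.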

Conclusion

Ultimately, the most effective way to manage regulatory change is through the accurate collection, controlled sourcing, cross-referencing and integration of data as a foundation. This addresses common regulatory demands around taxonomies, classifications, unambiguous identification, additional data context, links between related elements and general requirements on audit and lineage.

Compliance with modern financial services regulation cannot be treated as a box-ticking exercise. To avoid regulatory change management taking over other projects, firms need to get their data management capabilities in order first, and taking the risk out of their risk data is a critical step in that process.
