
Knock on Effect

The close link between data quality and better risk management has come under the spotlight in recent years as a result of the financial crisis and the ensuing regulatory crackdown on financial markets. Regulators such as the UK Financial Services Authority (FSA) and the US Securities and Exchange Commission (SEC) have been making noises about the importance of the data underlying risk calculations, and have been drilling down into this data more frequently for supervisory purposes. By far the biggest driver, however, has been the advent of Basel III and the fundamental changes being introduced as part of the new regulation, which bring risk management into the equation from the front office right through to the back.

Basel III aims to raise the quality, consistency and transparency of firms’ capital bases, which, as well as altering the makeup of their tier one, two and three levels of capital, also means providing more supporting data about these instruments. New regulatory reports and increased data transparency will inevitably require technology investment, and a large proportion of this will initially need to go towards data management infrastructure.

Moreover, the enhancement of risk coverage is also a focus of Basel III, which means stress testing will factor much more heavily in risk modelling and analytics. Much as with liquidity risk, firms will need to use stressed inputs and include factors such as “wrong way risk”, correlation multipliers and centralised exchange incentives (as the regulatory community continues its crusade to force more instruments onto exchanges and through central clearers) in their calculations. All of this essentially means an increase in the data sets that those involved in the risk management function must deal with, and hence more data to be kept clean.
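
To make the point about stressed inputs and correlation multipliers more concrete, below is a minimal sketch in Python of the Basel IRB asset correlation formula for corporate exposures, with the 1.25 asset value correlation (AVC) multiplier Basel III applies to exposures to large financial counterparties. It is a simplified illustration only: the function name and the PD figures are hypothetical, and real implementations include further adjustments (for example for SME size) that are omitted here.

```python
import math

def irb_correlation(pd_estimate: float, financial_institution: bool = False) -> float:
    """Simplified Basel IRB supervisory asset correlation for corporate exposures.

    The 1.25 asset value correlation (AVC) multiplier for large financial
    counterparties is one example of the Basel III "correlation multipliers"
    referred to above.
    """
    term = (1 - math.exp(-50 * pd_estimate)) / (1 - math.exp(-50))
    r = 0.12 * term + 0.24 * (1 - term)
    return 1.25 * r if financial_institution else r

# Stressed vs. unstressed inputs: the same formula fed with a PD estimated
# over a downturn period. Values are purely illustrative.
baseline_pd, stressed_pd = 0.01, 0.03
print(irb_correlation(baseline_pd))                               # ~0.19
print(irb_correlation(stressed_pd, financial_institution=True))   # AVC multiplier applied
```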

This plethora of new requirements has led to discussions about enterprise risk management strategies and how firms can move from their currently siloed approach to the risk function to a much more integrated one. After all, one of the basic tenets of an integrated risk management function is connecting the multiple siloed data sets, both historical and real-time, that feed into risk calculations.

As noted recently by Tom Dalglish, director and chief information architect at Bank of America Merrill Lynch, regulatory and risk management pressures are therefore compelling firms to plough investment into their data infrastructures in order to ensure the consistency of the basic reference data underlying their businesses. “The pressing requirement on the regulatory front is the ability to provide consistent data across the firm, track data flows and usage, leverage consistent identifiers and provide an auditable chain of custody for data as it traverses the enterprise,” says Dalglish.

“Firms need to focus on what they are doing across lines of business to guarantee that all users are looking at consistent data and at the same time reduce duplicate storage, improve the process of data entitlement and authentication and increase the auditability of data. There are anticipated regulatory requirements for providing evidence of a data governance policy and traceability of data,” he continues. Many of these requirements are built into Basel III, which includes specific references to regulators being able to drill down into risk calculations and determine the quality of the data used.
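
As an illustration of the kind of infrastructure Dalglish describes, the sketch below models a consistent identifier carrying an auditable chain of custody as data traverses systems. The class and field names are hypothetical, chosen only to make the governance requirements concrete; they do not represent any particular firm’s or vendor’s data model.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List

@dataclass
class LineageEvent:
    """One hop in a data item's chain of custody across the enterprise."""
    source_system: str      # where the value was read from
    target_system: str      # where it was delivered to
    transformation: str     # e.g. "currency normalisation" or "none"
    user_or_process: str    # entitlement/authentication trail
    timestamp: datetime

@dataclass
class ReferenceDataRecord:
    instrument_id: str                          # consistent identifier used firm-wide
    attributes: dict
    lineage: List[LineageEvent] = field(default_factory=list)

    def record_hop(self, source: str, target: str, transform: str, actor: str) -> None:
        """Append an auditable entry each time the data crosses a system boundary."""
        self.lineage.append(
            LineageEvent(source, target, transform, actor, datetime.now(timezone.utc))
        )
```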

However, the changes required to meet Basel III requirements will not come cheap: according to an estimate by UBS last year, firms may need to raise US$375 billion of fresh capital to comply with the new rules. Basel III is also expected to be implemented much more quickly than its predecessor, which suffered endless delays, given the current appetite for regulatory change and the fact that much of the groundwork has already been done with the Capital Requirements Directives (CRDs). Of course, this also means firms will need to get their new systems online and ready to produce the altered risk calculations to a much tighter timeline.

This challenge will be all the tougher alongside the liquidity risk reporting changes and other widespread reforms sweeping the market. Accounting changes are another risk-related challenge compelling investment in data management. In the post-crisis environment, the expected losses approach to provisioning, for example, requires the incorporation of a broader range of credit information, both quantitative and qualitative, which must be drawn from banks’ risk management and capital adequacy systems. This data must be transparent and subject to appropriate internal and external validation by auditors, supervisors and other constituents.
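
The expected losses provisioning approach rests on a simple relationship, with expected loss calculated as the product of probability of default, loss given default and exposure at default, but each of those inputs must be sourced, validated and evidenced from the risk systems described above. A minimal sketch, with purely illustrative figures:

```python
def expected_loss(pd_estimate: float, lgd: float, ead: float) -> float:
    """Expected loss = probability of default x loss given default x exposure at default."""
    return pd_estimate * lgd * ead

# Illustrative figures only: a 2% PD, 45% LGD and a EUR 10m exposure
# imply an expected-loss provision of EUR 90,000.
print(expected_loss(0.02, 0.45, 10_000_000))  # 90000.0
```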

The focus of this work has been on improving the “relevance” and “usefulness” of accounting standards, and has thus centred on increasing transparency and therefore the depth of the data provided with pricing and valuations. Regulators and end clients will require firms to provide a sufficient level of data around how they arrived at their prices and accounting figures, much as for risk management. The changes should also aid the chief risk officer (CRO) in endeavouring to get a better handle on a firm’s financial position.

The close link between reliable, high quality data and accurate risk calculations is well understood; it is now up to the industry to take action.
