
A-Team Insight Blogs

Virginie’s Blog – The View from the Top


Ex-Financial Services Authority (FSA) head and British economist Howard Davies’ comments at this week’s SunGard City Day in London highlighted the seriousness with which firms should be treating Basel III, even if full implementation is eight years away. Davies called the incoming regulatory standard on bank capital adequacy and liquidity “the most significant change ahead of the financial services industry in the next few years,” noting that its “onerous” requirements could put many firms out of business if investments in risk and capital management are not made in time.

Now, concern about the potential impact of the latest iteration of the Basel Accord is nothing new, but the seriousness with which firms are treating the new requirements seems to be increasing. There is, for example, growing awareness that firms will need to keep a much closer eye on their capital, not just to stay off the regulators’ hit lists but to stay in business.

Fundamentally, Basel III aims to raise the quality, consistency and transparency of firms’ capital bases, which, as well as altering the makeup of these capital stores, also means providing more supporting data about the instruments within them. New regulatory reports and increased data transparency will obviously require technology investment, and firms are slowly beginning to wake up to this dynamic. Data management solution vendors have noted increased interest in EDM platforms to this end (largely on a tactical basis) – firms may not be investing yet, but there seems to be some understanding that they may have to soon.
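
To put the capital requirements in more concrete terms, here is a minimal, hypothetical sketch in Python of the kind of ratio check this implies – the 4.5% common equity minimum and 2.5% conservation buffer reflect the announced Basel III calibration, but the balance sheet figures and function names are entirely illustrative:

    # Minimal sketch of a Basel III-style capital adequacy check.
    # The 4.5% CET1 minimum and 2.5% conservation buffer reflect the
    # announced Basel III calibration; all other figures are invented.

    CET1_MINIMUM = 0.045          # Common Equity Tier 1 minimum
    CONSERVATION_BUFFER = 0.025   # capital conservation buffer

    def cet1_ratio(cet1_capital: float, risk_weighted_assets: float) -> float:
        """Common Equity Tier 1 ratio: highest-quality capital over RWA."""
        return cet1_capital / risk_weighted_assets

    def meets_requirement(ratio: float) -> bool:
        return ratio >= CET1_MINIMUM + CONSERVATION_BUFFER

    # A hypothetical bank balance sheet, in millions
    ratio = cet1_ratio(cet1_capital=70.0, risk_weighted_assets=1000.0)
    print(f"CET1 ratio: {ratio:.1%}, compliant: {meets_requirement(ratio)}")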

Enhanced risk coverage is another focus of Basel III, and one that will mean stress testing features much more prominently in risk modelling and analytics. Much as with liquidity risk, firms will need to use stressed inputs in their calculations and include factors such as “wrong way risk”, correlation multipliers and centralised exchange incentives (as the regulatory community continues its crusade to force more instruments onto exchanges and through central clearers).
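
As a rough illustration of what “stressed inputs” mean in practice, the toy Python sketch below recomputes a simple parametric value at risk figure with volatility recalibrated to a crisis period – all of the numbers are invented for illustration, and real implementations are far more involved:

    # Toy sketch of using "stressed" inputs in a parametric VaR calculation.
    # All figures are illustrative assumptions, not regulatory parameters.
    from math import sqrt

    Z_99 = 2.326  # one-tailed 99% quantile of the normal distribution

    def parametric_var(position: float, daily_vol: float,
                       horizon_days: int = 10, z: float = Z_99) -> float:
        """10-day 99% VaR under a normal returns assumption."""
        return position * daily_vol * sqrt(horizon_days) * z

    position = 100.0      # exposure, in millions
    current_vol = 0.012   # volatility calibrated to recent market data
    stressed_vol = 0.035  # volatility recalibrated to a crisis period

    print(f"VaR (current inputs):  {parametric_var(position, current_vol):.1f}m")
    print(f"VaR (stressed inputs): {parametric_var(position, stressed_vol):.1f}m")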

The new requirements aimed at dampening pro-cyclicality, notably forward-looking provisioning, will also involve the implementation of a new type of risk modelling. Firms will have to factor in not only incurred losses but also “expected” losses, using an approach based on additional risk modelling and case study-based assessments. All of these advanced risk modelling approaches will require a sturdy data foundation and a full data audit trail.
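
To make the incurred versus expected distinction concrete, the sketch below contrasts a provision against losses already identified with a forward-looking provision computed in the standard expected loss form (probability of default x loss given default x exposure at default) – all of the figures are hypothetical:

    # Minimal sketch of forward-looking (expected loss) provisioning,
    # contrasted with provisioning only for losses already incurred.
    # The PD/LGD/EAD figures below are invented for illustration.

    def expected_loss(pd: float, lgd: float, ead: float) -> float:
        """Expected loss = PD x LGD x EAD."""
        return pd * lgd * ead

    # A hypothetical loan book: one (PD, LGD, EAD in millions) tuple per exposure
    book = [(0.02, 0.45, 150.0), (0.005, 0.40, 600.0), (0.10, 0.60, 40.0)]

    incurred_provision = 1.2  # against losses already identified (illustrative)
    expected_provision = sum(expected_loss(pd, lgd, ead) for pd, lgd, ead in book)

    print(f"Incurred-loss provision: {incurred_provision:.2f}m")
    print(f"Expected-loss provision: {expected_provision:.2f}m")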

For his part, Davies noted the danger that the regulatory community itself poses to the industry in setting these new capital and liquidity requirements. A fine balance needs to be struck between requiring firms to hold more capital (just in case) and giving them enough leeway to stay in business. He suggested that “imaginative and creative” use of capital will therefore be required in future, all of which further supports the need for firms to have a good handle on their data.

However, all of these global developments are, in turn, due to be filtered through regional and national regulatory lenses, which further complicates things. Davies noted the tension between national and global regulators, the increased direct involvement of politicians in financial regulation (for the worse) and the increasingly complicated “regulatory spider’s web,” at the centre of which sits the Financial Stability Board (FSB).

A recent example of this regulatory disconnect is the discussion about the criteria for determining which firms should be categorised as systemically important financial institutions (SIFIs) – see my coverage here.

This structural complexity and political tension is exacerbating market uncertainty and causing many firms to hold off on implementation. Providing some level of transparency and certainty about future requirements would therefore help overcome this hurdle of market inaction. However, given the political motivations underlying many of the discussions at the top, this unstable ground is likely to persist for some time to come.
