The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

Opinion: Bridging the Counterparty Risk Gap


By Xavier Bellouard, co-founder at Quartet FS

With this summer's European stress testing exercise casting a further spotlight on the industry, financial institutions across Europe continue to grapple with risk management. Counterparty risk exposure in particular has come to the fore since the demise of Lehman Brothers. While the majority of banks passed the recent stress tests, few can rest on their laurels as regulators and investors continue to push for transparency and accountability.

The regulatory landscape, and with it the need to monitor, measure and report, has evolved so rapidly over the past three years that understanding and quantifying risk exposures in real time has become crucial to a financial institution's stability and success. Among other things, the financial crisis highlighted the inadequacy of the technology traditionally in place to establish and manage counterparty risk in particular. A number of surveys released in the first half of 2010 reiterated this, unanimously agreeing that risk managers need better access to data, especially credit risk data, in order to gain a thorough view of counterparty risk exposure.

With board accountability a major consideration in today's market environment, forward-thinking financial institutions are casting the net further and wider in their counterparty risk calculations. To leverage this broader insight, risk managers must not only overcome the data deluge, but also have the appropriate tools in place to make scientific data accessible to 'non-quant' users. Only by undertaking this second stage will they be able to analyse the data in real time and make informed recommendations to the board.

While an element of risk is to be expected with every trade, the financial crisis served to highlight the need to be aware of the associated risk and be comfortable as a business with it. However, pinpointing where risk lies is no mean feat, with even the largest of institutions not fully aware of their exposures. With this in mind, better understanding and management of risk – especially counterparty – is crucial to the ongoing financial recovery and to re-injecting confidence back into the interbank lending market. As a result, risk managers are now tasked with being able to provide aggregated figures quickly and accurately in order to truly establish the position the business finds itself in.

Due to its interconnected nature, counterparty risk analysis must not only take into account a range of data, including VaR, P&L and sensitivities, but must also integrate multiple asset classes and draw together an understanding of their collective impact. Once counterparty risk is accurately established, risk managers need better access to, and analysis of, this data so they can more effectively bridge the gap between 'quants' and decision makers. For example, modern credit risk frameworks imply stochastic evaluation of risk, typically based on traditional Monte Carlo simulation. However, to truly manage risk, this scientific data must be made available to 'non-quant' users so that they can act on it.
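By way of illustration, the stochastic evaluation mentioned above can be sketched in a few lines of Python. This is a minimal Monte Carlo example, assuming a single forward-like position whose underlying follows geometric Brownian motion; the position, parameters and 95% quantile are illustrative simplifications, not any firm's actual methodology:

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_exposure_paths(spot, vol, rate, horizons, n_paths=10_000):
    """Simulate mark-to-market outcomes of a forward-like position under
    geometric Brownian motion; exposure is the positive part of the MTM."""
    exposures = []
    for t in horizons:
        z = rng.standard_normal(n_paths)
        # Terminal value of the underlying at horizon t under GBM
        s_t = spot * np.exp((rate - 0.5 * vol**2) * t + vol * np.sqrt(t) * z)
        # The counterparty owes us only when the MTM is in our favour
        exposures.append(np.maximum(s_t - spot, 0.0))
    return np.array(exposures)

def pfe(exposures, quantile=0.95):
    """Potential future exposure: a high quantile of exposure per horizon."""
    return np.quantile(exposures, quantile, axis=1)

horizons = [0.25, 0.5, 1.0]  # in years
exp_paths = simulate_exposure_paths(spot=100.0, vol=0.2, rate=0.01,
                                    horizons=horizons)
print(pfe(exp_paths))  # one PFE figure per horizon
```

A production framework would of course simulate full portfolios path-by-path across asset classes; the point here is only the shape of the calculation: simulate, take the positive part, take a quantile.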

As with all things risk-related, technology plays a significant part not only in collating information but in making it digestible to 'non-quant' users. Alongside intuitive navigation to understand the raw data, 'non-quant' risk managers want to be able to:

  • integrate sophisticated calculations such as future exposure, as well as complex netting and collateral rules (including the inherent correlation of the collateral with the specific counterparties);
  • go beyond single indicators that need to be put into perspective; and
  • have the ability to simulate the impact of various strategies for credit exposure, inclusive of any collateral held, in real time.
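To make the netting and collateral rules of the first point concrete, here is a deliberately simplified sketch: a single counterparty, a flat collateral haircut and hand-picked trade MTMs, all hypothetical:

```python
def gross_exposure(mtm_values):
    """Without netting, only the positive MTMs count as exposure."""
    return sum(v for v in mtm_values if v > 0)

def netted_exposure(mtm_values, collateral=0.0, haircut=0.0):
    """Under a netting agreement, positive and negative MTMs offset each
    other first; collateral (after haircut) then reduces the remainder.
    Exposure cannot go below zero."""
    net_mtm = sum(mtm_values)
    effective_collateral = collateral * (1.0 - haircut)
    return max(net_mtm - effective_collateral, 0.0)

trades = [12.0, -5.0, 3.0]  # MTM per trade with one counterparty
print(gross_exposure(trades))                                 # 15.0
print(netted_exposure(trades, collateral=4.0, haircut=0.25))  # 7.0
```

Simulating the impact of a strategy, as the last bullet describes, then amounts to re-running such a calculation with adjusted trade or collateral inputs, which is why real-time recalculation matters.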

Ultimately, risk managers need to be able to communicate with management and collaborate on trading decisions in a credit-sensitive environment. Banks' traditionally siloed approach has meant that, to date, this has been a pipe dream rather than a reality. However, aggregating high volumes of data from multiple streams to produce both snapshots and real-time drill-down is possible by combining complex event processing (CEP) and online analytical processing (OLAP). The combination of the two technologies allows users to view their risk and profit and loss data the way they want, with real-time push technology providing constant updates from market data as well as new and amended trades. In addition to providing a real-time view of exposure drawn from multiple data sources, firms can slice and dice information and analyse it as required, drilling down to the smallest detail or building up to a top-level overview.
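The CEP-plus-OLAP pattern described above can be caricatured in a few lines: push-style updates mutate an in-memory cube as events arrive, while queries aggregate over whichever dimensions are left unspecified. This is a toy illustration, not any vendor's implementation; the class name and dimensions are invented:

```python
from collections import defaultdict

class ExposureCube:
    """Toy in-memory aggregation cube: events update cells incrementally,
    and exposure can be read at any level of aggregation."""

    def __init__(self):
        # Cells keyed by (counterparty, desk, product)
        self.cells = defaultdict(float)

    def on_event(self, counterparty, desk, product, delta_exposure):
        """CEP-style push update: apply each change as it arrives."""
        self.cells[(counterparty, desk, product)] += delta_exposure

    def query(self, counterparty=None, desk=None, product=None):
        """OLAP-style slice: sum over every dimension left as None."""
        total = 0.0
        for (cp, dk, pr), v in self.cells.items():
            if counterparty not in (None, cp):
                continue
            if desk not in (None, dk):
                continue
            if product not in (None, pr):
                continue
            total += v
        return total

cube = ExposureCube()
cube.on_event("BankA", "rates", "swap", 5.0)
cube.on_event("BankA", "credit", "cds", 2.5)
cube.on_event("BankB", "rates", "swap", 1.0)
print(cube.query())                      # 8.5 — top-level overview
print(cube.query(counterparty="BankA"))  # 7.5 — drill down by counterparty
```

Real CEP/OLAP engines add incremental pre-aggregation, concurrency and push notification to subscribers, but the essential idea is the same: one continuously updated store serving both the snapshot and the drill-down.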

If implemented and deployed appropriately, technology can avoid duplication of analysis and help risk managers make well-informed decisions quickly in support of the business. While technology is by no means a universal remedy for the issues associated with counterparty risk, the quicker a firm can realise its true counterparty positions and potential exposures in the context of the overall risk picture, the greater its market competitiveness and confidence will be.
