By Xavier Bellouard, co-founder at Quartet FS
With this summer’s European stress testing exercise casting a further spotlight on the industry, financial institutions across Europe continue to grapple with risk management. Counterparty risk exposure in particular has come to the fore since the demise of Lehman Brothers. While the majority of banks passed the recent stress tests, few can rest on their laurels as regulators and investors continue to push for transparency and accountability.
The regulatory landscape, and the need to monitor, measure and report, have evolved so far over the past three years that understanding and reporting risk exposures in real time has become crucial to a financial institution’s stability and success. Amongst other things, the financial crisis served to highlight the inadequacy of the technology traditionally in place to establish and manage counterparty risk in particular. A number of surveys released in the first half of 2010 have reiterated this, agreeing unanimously that risk managers need better access to data, especially data related to credit risk, in order to gain a thorough view of counterparty risk exposure.
With board accountability a major consideration in today’s market environment, forward-thinking financial institutions are casting the net further and wider within their counterparty risk calculations. In order to leverage this broader insight, risk managers must not only overcome the challenge of the data deluge, but also have the appropriate tools in place to make scientific data accessible to ‘non-quant’ users. Only by undertaking this second stage will they be able to analyse the data in real time and make informed recommendations to the board.
While an element of risk is to be expected with every trade, the financial crisis served to highlight the need to be aware of the associated risk and be comfortable as a business with it. However, pinpointing where risk lies is no mean feat, with even the largest of institutions not fully aware of their exposures. With this in mind, better understanding and management of risk – especially counterparty – is crucial to the ongoing financial recovery and to re-injecting confidence back into the interbank lending market. As a result, risk managers are now tasked with being able to provide aggregated figures quickly and accurately in order to truly establish the position the business finds itself in.
Due to its interconnected nature, counterparty risk analysis must not only take into account various data, including VaR, P&L and sensitivities, but must also integrate a number of asset classes and draw together an understanding of their collective impact. Once counterparty risk is accurately established, risk managers need better access to, and analysis of, this data so they can more effectively bridge the gap between ‘quants’ and decision makers. For example, modern credit risk frameworks rely on stochastic evaluation of risk, typically based on traditional Monte Carlo simulation. However, in order to truly manage risk, this scientific data must be made available to ‘non-quant’ users so that they can act on it.
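As an illustration only, the kind of Monte Carlo exposure calculation such frameworks rely on can be sketched in a few lines of Python. Everything here is a hypothetical simplification (a driftless random-walk mark-to-market, an invented volatility, a 95th-percentile potential future exposure), not any particular institution’s model:

```python
# Illustrative sketch: Monte Carlo estimate of counterparty exposure for a
# single trade whose mark-to-market follows a driftless random walk.
# EPE = average positive exposure at the horizon; PFE = its 95th percentile.
# All parameters are hypothetical.
import random
import statistics

def simulate_exposure(n_paths=10_000, n_steps=12, vol=0.05, seed=42):
    """Return the per-path positive exposure at the final time step."""
    rng = random.Random(seed)
    exposures = []
    for _ in range(n_paths):
        mtm = 0.0
        for _ in range(n_steps):
            mtm += rng.gauss(0.0, vol)   # monthly mark-to-market shock
        exposures.append(max(mtm, 0.0))  # exposure is MtM floored at zero
    return exposures

exposures = simulate_exposure()
epe = statistics.mean(exposures)                      # expected positive exposure
pfe = sorted(exposures)[int(0.95 * len(exposures))]   # 95th-percentile PFE
print(f"EPE: {epe:.4f}  PFE(95%): {pfe:.4f}")
```

In practice the simulation runs over many time steps, asset classes and risk factors at once, which is precisely why the results need purpose-built tools before a ‘non-quant’ user can interrogate them.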
As with all things risk-related, technology plays a significant part not only in collating information but in making it digestible to ‘non-quant’ users. Alongside intuitive navigation to understand the raw data, ‘non-quant’ risk managers want to be able to:
- integrate sophisticated calculations such as future exposure, as well as complex netting and collateral rules (including the inherent correlation between the collateral and the specific counterparties);
- go beyond single indicators that need to be put into perspective; and
- have the ability to simulate the impact of various strategies for credit exposure, inclusive of any collateral held, in real time.
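To make the netting and collateral point concrete, here is a minimal sketch of how those rules reshape an exposure figure. The trade values, netting-set labels and haircut are all hypothetical, and real agreements are far richer than this:

```python
# Illustrative sketch: netting and collateral in an exposure calculation.
# Trades in the same netting set offset each other before flooring at zero;
# collateral (after a haircut) then reduces the netted figure.
# All trade values, netting sets and haircuts are hypothetical.

def netted_exposure(trades, collateral=0.0, haircut=0.0):
    """trades: list of (netting_set, mark_to_market) pairs."""
    by_set = {}
    for netting_set, mtm in trades:
        by_set[netting_set] = by_set.get(netting_set, 0.0) + mtm
    # Floor each netting set at zero, then sum across sets.
    gross = sum(max(mtm, 0.0) for mtm in by_set.values())
    # Collateral is discounted by the haircut before it offsets exposure.
    return max(gross - collateral * (1.0 - haircut), 0.0)

trades = [("ISDA-A", 10.0), ("ISDA-A", -4.0), ("ISDA-B", -3.0)]
print(netted_exposure(trades))                               # 6.0
print(netted_exposure(trades, collateral=5.0, haircut=0.2))  # 2.0
```

Even this toy version shows why single indicators mislead: the same three trades produce a very different exposure depending on which netting agreements and collateral terms apply.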
Ultimately, risk managers need the ability to communicate with management and collaborate on trading decisions in a credit-sensitive environment. Banks’ traditionally siloed approach has meant that, to date, this has been a pipe dream rather than a reality. However, aggregating high volumes of data from multiple streams to produce both snapshots and real-time drill-down into the data is possible by combining complex event processing (CEP) and online analytical processing (OLAP). The combination of the two technologies allows users to view their risk and profit and loss data in the way they want, while real-time push technology provides constant updates from market data as well as from new and amended trades. In addition to providing a real-time view of exposure drawn from multiple data sources, firms can slice and dice information and analyse it as required, drilling down to the smallest detail or rolling up to a top-level overview.
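The CEP-plus-OLAP pattern described above can be sketched as a toy in-memory cube: each incoming trade event incrementally updates the aggregates (the push side), while queries roll up or drill down on demand (the analytical side). The class, dimensions and field names here are invented for illustration and stand in for far richer real systems:

```python
# Illustrative sketch of combining event-driven updates (CEP-style) with
# OLAP-style drill-down: each trade event incrementally updates an
# in-memory cube keyed by (counterparty, desk), and queries aggregate at
# any level. All names and figures are hypothetical.
from collections import defaultdict

class ExposureCube:
    def __init__(self):
        self.cells = defaultdict(float)  # (counterparty, desk) -> exposure

    def on_trade_event(self, counterparty, desk, exposure_delta):
        """Push update: amend the cube as new or amended trades arrive."""
        self.cells[(counterparty, desk)] += exposure_delta

    def drill_down(self, counterparty):
        """Exposure per desk for one counterparty."""
        return {d: v for (c, d), v in self.cells.items() if c == counterparty}

    def total(self, counterparty=None):
        """Top-level view, optionally filtered to one counterparty."""
        return sum(v for (c, _), v in self.cells.items()
                   if counterparty is None or c == counterparty)

cube = ExposureCube()
cube.on_trade_event("BankA", "rates", 5.0)
cube.on_trade_event("BankA", "fx", 2.0)
cube.on_trade_event("BankB", "rates", 3.0)
print(cube.total("BankA"))       # 7.0
print(cube.drill_down("BankA"))  # {'rates': 5.0, 'fx': 2.0}
```

The design point is that aggregates are maintained incrementally as events arrive rather than recomputed in overnight batches, which is what makes the real-time slice-and-dice view possible.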
If implemented and deployed appropriately, technology can avoid duplication of analysis and help risk managers make well-informed decisions quickly that support the business. Whilst technology is by no means a universal remedy for all the issues associated with counterparty risk, the quicker a firm can realise its true counterparty positions and potential exposures in the context of the overall risk picture, the greater its overall market competitiveness and confidence will be.