Redefining VaR and Multi-Dimensional Analytics

By Georges Bory, managing director and co-founder, Quartet FS

New regulations, such as Basel III, are changing the banking landscape and putting additional pressure on banks. These include more stringent requirements for risk modelling and regulatory ratios, intended to ensure banks are better equipped to respond to changing market conditions. This puts measures of market risk, particularly Value at Risk (VaR), under scrutiny, forcing banks to review their approach and ensure they can meet the new demands.

In particular, the new landscape has created more complex curves and more volatile market data. These drive market fluctuations that can expose banks to a variety of risks, whether from changes in currency rates, interest rates or other market factors.

Facing up to VaR

VaR has had a chequered history and has been the subject of some criticism in recent years. Having emerged in the 1980s, it became an industry standard and regulators adopted it as the basis for capital modelling. However, the financial crisis severely dented its reputation. Part of the problem was that some applications of VaR proved inadequate, with VaR forecasts falling well short of actual losses in severely stressed markets. Other criticisms have focused on perceived limitations, such as its weakness in capturing tail risk.

The real issue is how VaR is typically calculated. When it comes to making difficult decisions based on complex events, end-of-day data and overnight VaR calculations are no longer sufficient. For VaR to become a truly reliable measure in today’s market, trading and risk operations should revisit the underlying technology foundation – and at the heart of the answer, as is often the case in risk management, is data.
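
As a point of reference, here is a minimal sketch of the kind of calculation involved, using simple historical-simulation VaR in Python. The article does not prescribe a particular method, and the 250-scenario PnL vector and 99% confidence level below are purely illustrative.

```python
import numpy as np

def historical_var(pnl_vector: np.ndarray, confidence: float = 0.99) -> float:
    """Historical-simulation VaR: the loss threshold exceeded in only
    (1 - confidence) of the scenarios in the PnL vector."""
    # pnl_vector holds one simulated profit/loss figure per scenario
    # (e.g. 250 historical one-day market moves applied to today's book).
    return -np.percentile(pnl_vector, (1 - confidence) * 100)

# Illustrative only: random scenario PnLs stand in for real revaluations.
rng = np.random.default_rng(0)
pnl = rng.normal(loc=0.0, scale=1_000_000, size=250)
print(f"99% 1-day VaR: {historical_var(pnl):,.0f}")
```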

The data challenge

With regulations and market conditions creating more demanding requirements, banks must provide risk exposures on demand. Crucially, these must be delivered in real time and based on the most up-to-date data. However, one hurdle is that trading and risk operations are often dispersed across multiple trading floors and divisions, including middle and back office functions, most notably risk management. Many systems cannot cope with managing and storing the vast amounts of data from these disparate systems in order to calculate VaR and deliver fast query response times.

Banks must consolidate large amounts of fast-moving data from separate sources into a single view that is updated on the fly. They must also simultaneously handle the different data sets generated by multiple processes, including stress testing and Monte Carlo simulations.
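
As a rough illustration of that consolidation step, the sketch below sums hypothetical per-desk scenario PnL vectors, all revalued under the same scenario set, into a single firm-wide vector that can be refreshed whenever any one feed changes. The desk names and figures are invented for the example.

```python
import numpy as np

# Hypothetical per-source scenario PnL vectors, all aligned on the same
# 250 scenarios so they can be combined scenario by scenario.
rng = np.random.default_rng(1)
feeds = {
    "rates_desk": rng.normal(0, 800_000, 250),
    "fx_desk": rng.normal(0, 500_000, 250),
    "equities_desk": rng.normal(0, 1_200_000, 250),
}

def consolidated_view() -> np.ndarray:
    """Single firm-wide PnL vector, rebuilt whenever any feed updates."""
    return np.sum(list(feeds.values()), axis=0)

# When one desk's feed ticks, only that entry changes and the view is refreshed.
feeds["fx_desk"] = rng.normal(0, 550_000, 250)
print(f"Firm-wide 99% VaR: {-np.percentile(consolidated_view(), 1):,.0f}")
```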

Tackling non-linear calculations

Aggregating data is one part of the puzzle. Another is how that data is stored, updated and accessed for real-time risk modelling. When it comes to analytics, banks must move beyond per-trade VaR measurements and address the inherent non-linearity of VaR calculations. A key challenge, therefore, is aggregating these non-linear calculations and doing so incrementally.
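
To make the non-linearity concrete, the toy example below (invented numbers, historical-simulation style) compares the sum of two trades' individual VaRs with the VaR of their combined PnL vector. The two figures generally differ, which is why VaR cannot simply be rolled up by addition.

```python
import numpy as np

rng = np.random.default_rng(42)
var = lambda pnl: -np.percentile(pnl, 1)  # 99% historical VaR from a PnL vector

# Two hypothetical trades, each revalued under the same 250 scenarios.
trade_a = rng.normal(0, 1_000_000, 250)
trade_b = -0.6 * trade_a + rng.normal(0, 400_000, 250)  # partly offsetting

print(f"VaR(A) + VaR(B)      = {var(trade_a) + var(trade_b):,.0f}")
print(f"VaR(A + B) (correct) = {var(trade_a + trade_b):,.0f}")
# The two figures differ: VaR is not additive, so portfolio VaR must be
# recomputed from the aggregated PnL vector, not summed from trade-level VaRs.
```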

This is particularly important in order to equip trading desks with the timely information they require as they perform trades throughout the day. With each new activity, traders need to see the impact immediately so they can make complex decisions quickly. These calculations include on-demand computations from pre-aggregated profit and loss (PnL) vectors, and complex, customised measures such as component VaR.
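
One way to picture that incremental pattern, assuming a historical-simulation setup: keep a running, pre-aggregated PnL vector per desk, add each new trade's vector to it, and recompute measures on demand. The DeskRisk class and the simple trade-level "incremental VaR" below are illustrative stand-ins, not the specific component VaR formula a production system would use.

```python
import numpy as np

class DeskRisk:
    """Keeps a pre-aggregated PnL vector so measures can be recomputed on demand."""
    def __init__(self, n_scenarios: int = 250):
        self.agg_pnl = np.zeros(n_scenarios)

    def add_trade(self, trade_pnl: np.ndarray) -> None:
        # Incremental update: only the new trade's vector is added.
        self.agg_pnl += trade_pnl

    def var(self, confidence: float = 0.99) -> float:
        return -np.percentile(self.agg_pnl, (1 - confidence) * 100)

    def incremental_var(self, trade_pnl: np.ndarray) -> float:
        # Change in desk VaR if this trade were added: a simple stand-in
        # for the trade-level contribution measures mentioned above.
        with_trade = -np.percentile(self.agg_pnl + trade_pnl, 1)
        return with_trade - self.var()

rng = np.random.default_rng(7)
desk = DeskRisk()
desk.add_trade(rng.normal(0, 900_000, 250))
new_trade = rng.normal(0, 300_000, 250)
print(f"Desk VaR now:        {desk.var():,.0f}")
print(f"Impact of new trade: {desk.incremental_var(new_trade):+,.0f}")
```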

The shortfall of legacy systems

In the past, banks often relied on disk-based solutions. However, these legacy systems were too slow, failing to perform even the simplest of queries quickly enough. As a result, many moved to online analytical processing (OLAP) solutions.

OLAP solutions store data in a ‘cube’. Whenever a bank makes a new trade and the risk is analysed, the information is added to the cube instantly. The cube then updates all the computations and informs the user if the new data has any impact on the trade. However, the problem with traditional applications of OLAP was a lack of performance. Previous generations of OLAP simply collapsed when too many dimensions were added, making them largely unusable for applications that require large vectors, such as VaR.

The evolution of risk management

A new generation of in-memory computing has emerged in the financial services sector. It takes all the advantages of the older OLAP cube and addresses the all-important performance issue. In-memory computing answers many of the problems legacy systems struggled with, most notably the complexity of data and multi-dimensional analytics.

Banks’ data requirements have become more challenging, which means they need a way to manage data that can incorporate an unlimited number of sources. In-memory computing is a way to realise true multi-dimensional analytics, leaving banks better equipped to navigate their data and interact with it without limits on the depth and breadth of analysis.

The key difference with today’s in-memory computing is that it provides new levels of computation speed and works natively with complex mathematical objects, such as the large vectors required for VaR.
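
As a toy illustration of what working natively with vectors can look like (this is not a description of any vendor's product), the sketch below keeps a scenario PnL vector as the measure on each position and aggregates those vectors along whichever dimension is queried, deriving VaR only after aggregation. The positions, dimensions and figures are invented.

```python
import numpy as np
from collections import defaultdict

rng = np.random.default_rng(3)
N = 250  # scenarios

# Hypothetical positions: each carries dimensions plus a scenario PnL vector
# as its measure (the complex mathematical object aggregated natively).
positions = [
    {"desk": "rates", "ccy": "USD", "pnl": rng.normal(0, 700_000, N)},
    {"desk": "rates", "ccy": "EUR", "pnl": rng.normal(0, 500_000, N)},
    {"desk": "fx",    "ccy": "USD", "pnl": rng.normal(0, 400_000, N)},
]

def rollup(by: str) -> dict:
    """Aggregate PnL vectors along one dimension, then derive VaR per bucket."""
    buckets = defaultdict(lambda: np.zeros(N))
    for p in positions:
        buckets[p[by]] += p["pnl"]          # vector addition, not a scalar total
    return {k: -np.percentile(v, 1) for k, v in buckets.items()}

print("VaR by desk:    ", {k: f"{v:,.0f}" for k, v in rollup("desk").items()})
print("VaR by currency:", {k: f"{v:,.0f}" for k, v in rollup("ccy").items()})
```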

This approach can deliver not only VaR calculations but also marginal VaR, tail analysis and stress scenario simulation. In particular, banks can add or exclude ‘what-if’ trades directly in the cube, enabling them to determine their risk more accurately and manage ex-ante ‘what-if’ predictions. These include the riskiness of the tail, the VaR impact of a specific trade and the impact of scenarios such as a market crash or an act of nature. In this way, it answers many of the earlier criticisms of VaR’s possible shortcomings.
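
A final sketch of the ex-ante ‘what-if’ idea, again with invented numbers: starting from an aggregate PnL vector, the impact of a candidate trade or a crude stress overlay can be assessed before it happens. A real system would revalue the book under shocked market inputs rather than scaling the vector, so the stress line below is only a placeholder.

```python
import numpy as np

rng = np.random.default_rng(11)
var = lambda pnl: -np.percentile(pnl, 1)  # 99% historical VaR

book_pnl = rng.normal(0, 1_500_000, 250)          # current aggregate PnL vector
what_if_trade = rng.normal(50_000, 400_000, 250)  # hypothetical candidate trade

print(f"Current VaR:                {var(book_pnl):,.0f}")
print(f"With what-if trade added:   {var(book_pnl + what_if_trade):,.0f}")

# Crude stress overlay: scale every scenario as a stand-in for a market-crash
# scenario; a production system would re-run revaluations under shocked inputs.
print(f"Under a 3x stressed market: {var(3 * book_pnl):,.0f}")
```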

Banks have had to face some tough decisions around risk management and there are likely more pressures to come. In a more complex market, the ultimate goal needs to be real-time VaR and trading risk aggregation for the trading floor – plus PnL reconciliation produced on-demand directly from trading risk, also in real-time. If banks can calculate VaR and stress test scenarios on-the-fly, they can elevate risk management beyond a mere reporting measure, delivering a tool for rapid decision-making.
