The knowledge platform for the financial technology industry

A-Team Insight Blogs

Redefining VaR and Multi-Dimensional Analytics


By Georges Bory, managing director and co-founder, Quartet FS

New regulations, such as Basel III, are changing the banking landscape and putting additional pressure on banks. These include more stringent requirements for risk modelling and capital ratios, intended to ensure banks are better equipped to respond to changing market conditions. This puts measures of market risk, particularly Value at Risk (VaR), under scrutiny, forcing banks to review their approach and ensure they can meet the new demands.

In particular, the new landscape has produced more complex yield curves and more volatile market data. These drive market fluctuations that can expose banks to a variety of risks, whether from changes in currency rates, interest rates or other market factors.

Facing up to VaR

VaR has had a chequered history and has been the subject of some criticism in recent years. Having emerged in the 1980s, it became an industry standard and regulators adopted it as the basis for capital modelling. However, the financial crisis severely dented its reputation. Part of the problem was that some applications of VaR proved inadequate, with VaR forecasts falling well short of actual losses in severely stressed markets. Other criticisms have focused on perceived limitations, such as its limited ability to measure tail risk.

The real issue is how VaR is typically calculated. When it comes to making difficult decisions based on complex events, end-of-day data and overnight VaR calculations are no longer sufficient. For VaR to become a truly reliable measure in today’s market, trading and risk operations should revisit the underlying technology foundation – and at the heart of the answer, as is often the case in risk management, is data.
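To ground the discussion, the typical end-of-day calculation referenced here is historical simulation: revalue the portfolio under past market moves and read VaR off the empirical loss distribution. A minimal sketch, with purely hypothetical numbers:

```python
import random

def historical_var(pnl_scenarios, confidence=0.99):
    """Empirical VaR: the loss exceeded in only (1 - confidence) of scenarios."""
    losses = sorted(-p for p in pnl_scenarios)  # losses as positive numbers, ascending
    index = min(int(confidence * len(losses)), len(losses) - 1)
    return losses[index]

# 1,000 hypothetical daily PnL scenarios (stand-ins for historical revaluations)
random.seed(42)
scenarios = [random.gauss(0, 1_000_000) for _ in range(1000)]
var_99 = historical_var(scenarios, 0.99)  # one-day 99% VaR
```

The point the article makes is that this quantile is recomputed overnight from a static scenario set, so intraday activity is invisible to it until the next batch run.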

The data challenge

With regulations and market conditions creating more demanding requirements, banks must be able to provide risk exposures on demand. Crucially, these must be delivered in real time and based on the most up-to-date data. One hurdle is that trading and risk operations are often dispersed across multiple trading floors and divisions, including middle and back office functions, most notably risk management. Many systems cannot manage and store the vast amounts of data flowing from these disparate systems quickly enough to calculate VaR and deliver fast query response times.

Banks must consolidate large amounts of fast-moving data from separate sources into a single view that is updated on-the-fly. They must also simultaneously handle the different data sets generated by multiple processes, including stress testing and Monte Carlo simulations.
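For illustration, a Monte Carlo run generates its own scenario set rather than replaying history: simulate risk-factor moves, revalue the position under each, and take the loss quantile. The sketch below assumes a single risk factor and a hypothetical linear position; all parameter values are invented:

```python
import random

POSITION_DELTA = 10_000  # hypothetical sensitivity: PnL per unit move in the factor
FACTOR_VOL = 0.02        # assumed daily volatility of the risk factor

def monte_carlo_var(n_scenarios=10_000, confidence=0.99):
    """Simulate factor moves, map each to a PnL, and read off the loss quantile."""
    random.seed(7)
    pnls = [POSITION_DELTA * random.gauss(0, FACTOR_VOL) for _ in range(n_scenarios)]
    losses = sorted(-p for p in pnls)
    return losses[int(confidence * n_scenarios) - 1]

var_99 = monte_carlo_var()
```

Each such process produces its own scenario-indexed data set, which is why the aggregation layer must handle several of them side by side.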

Tackling non-linear calculations

Aggregating data is one part of the puzzle. Another is how that data is stored, updated and accessed for real-time risk modelling. When it comes to analytics, banks must move beyond per-trade VaR measurements and address the real non-linearity of VaR calculations: because VaR is a quantile of the loss distribution, the VaR of a portfolio is not the sum of the VaRs of its trades. A key challenge is therefore how to aggregate these non-linear calculations and do so incrementally.

This is particularly important in order to equip trading desks with the timely information they require as they perform trades throughout the day. With each new activity, traders need to see the impact immediately so they can make timely and complex decisions. These calculations include on-demand computations from pre-aggregated profit and loss (PnL) vectors, and complex and customised measures such as component VaR.
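The non-linearity can be made concrete: scenario PnL vectors aggregate element-wise, but their loss quantiles do not, so portfolio VaR must be taken from the aggregated vector rather than summed per trade. A toy example with hypothetical numbers:

```python
def var_from_pnl(pnl, confidence=0.8):
    """Empirical VaR from a scenario-indexed PnL vector."""
    losses = sorted(-p for p in pnl)
    return losses[min(int(confidence * len(losses)), len(losses) - 1)]

# Two trades whose losses fall in different scenarios (diversification)
trade_a = [-10, 2, 3, 1, 4]   # scenario-indexed PnL vector for trade A
trade_b = [5, 2, -9, 1, 3]    # trade B loses in a different scenario

# PnL vectors add element-wise; their quantiles do not
portfolio = [a + b for a, b in zip(trade_a, trade_b)]

sum_of_vars = var_from_pnl(trade_a) + var_from_pnl(trade_b)  # 10 + 9 = 19
portfolio_var = var_from_pnl(portfolio)                      # 6
```

Because the vectors themselves are linear, they can be pre-aggregated and updated incrementally per trade; only the final quantile is non-linear, which is what makes on-demand computation from pre-aggregated PnL vectors feasible.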

The shortfall of legacy systems

In the past, banks often relied on disk-based solutions. However, these legacy systems were too slow, failing to perform even the simplest of queries quickly enough. As a result, many moved to online analytical processing (OLAP) solutions.

OLAP solutions store data in a ‘cube’. Whenever a bank makes a new trade and its risk is analysed, the information is instantly added to the cube, which then updates all dependent computations and shows the user the impact of the new data. However, the problem with traditional applications of OLAP was a lack of performance. Previous generations of OLAP simply collapsed when too many dimensions were added, making them largely unsuitable for applications that required large vectors – such as VaR.
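The cube behaviour described above can be sketched as a map from dimension tuples to pre-aggregated measures, updated incrementally as each trade arrives. This is a toy illustration with hypothetical names, not any vendor’s implementation; it also hints at why every extra dimension multiplies the number of cells to maintain:

```python
from collections import defaultdict

# (desk, currency) -> aggregated measure; a real cube has many more dimensions
cube = defaultdict(float)

def add_trade(desk, currency, notional):
    """Incrementally update every aggregation level containing the new trade."""
    cube[(desk, currency)] += notional
    cube[(desk, "ALL")] += notional    # desk subtotal
    cube[("ALL", "ALL")] += notional   # grand total

add_trade("rates", "USD", 1_000_000)
add_trade("rates", "EUR", 500_000)
add_trade("fx", "USD", 250_000)
```

With scalar measures this scales; the collapse came when each cell had to hold and recompute a large object, such as a thousand-element PnL vector, across many dimensions at once.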

The evolution of risk management

A new generation of in-memory computing has emerged in the financial services sector. It takes the advantages of the older OLAP cube and addresses the all-important performance issue. In-memory computing answers many of the problems legacy systems struggled with – most notably the complexity of data and multi-dimensional analytics.

Banks’ data requirements have become more challenging, so they need a way to manage data that can incorporate any number of sources. In-memory computing is a way to realise true multi-dimensional analytics: banks can navigate their data and interact with it without limits on the depth and breadth of analysis.

The key difference with today’s in-memory computing is the new level of computation speed it provides, working natively with complex mathematical objects such as the large vectors required for VaR.

This approach can deliver not only VaR calculations but also marginal VaR, tail analysis and stress scenario simulation. In particular, banks can add or exclude ‘what-if’ trades directly in the cube, enabling them to determine their risk more accurately and manage ex-ante ‘what-if’ predictions. These include the riskiness of the tail, the VaR impact of a specific trade and the impact of scenarios such as a market crash or an act of nature. In this way, it answers many of the earlier criticisms of VaR’s possible shortcomings.
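A ‘what-if’ trade of this kind can be illustrated on pre-aggregated PnL vectors: including or excluding a candidate trade is an element-wise update followed by re-reading the quantile, with no full portfolio revaluation. All figures below are hypothetical:

```python
def var_from_pnl(pnl, confidence=0.8):
    """Empirical VaR from a scenario-indexed PnL vector."""
    losses = sorted(-p for p in pnl)
    return losses[min(int(confidence * len(losses)), len(losses) - 1)]

portfolio_pnl = [-120, 40, -80, 60, 10]   # current aggregated scenario PnL
candidate_trade = [90, -20, 50, -30, 5]   # hedging trade under evaluation

# 'What-if' inclusion: an O(n) element-wise update of the aggregate,
# followed by a fresh quantile read - no batch recomputation needed
with_trade = [p + t for p, t in zip(portfolio_pnl, candidate_trade)]

var_before = var_from_pnl(portfolio_pnl)  # 120
var_after = var_from_pnl(with_trade)      # 30: the hedge cuts the tail loss
```

Excluding a trade is the symmetric operation, subtracting its vector from the aggregate, which is what makes ex-ante analysis fast enough for the trading day.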

Banks have had to face some tough decisions around risk management and there are likely more pressures to come. In a more complex market, the ultimate goal needs to be real-time VaR and trading risk aggregation for the trading floor – plus PnL reconciliation produced on-demand directly from trading risk, also in real-time. If banks can calculate VaR and stress test scenarios on-the-fly, they can elevate risk management beyond a mere reporting measure, delivering a tool for rapid decision-making.

