About A-Team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

OpenGamma Includes Redis In-Memory Technology in Margining Platform

OpenGamma has combined the Redis in-memory data management technology with its own analytics and risk engine to create a margining platform offering. The new platform is designed to support fast and precise calculations, allowing clearing members to meet the initial margin calculation requirements of central counterparties on OTC swap transactions.

OpenGamma developed the new margining platform in response to structural changes in the OTC derivatives market, including the mandate that all transactions must be centrally cleared rather than settled bilaterally, a change that results in the requirement for initial margin to be posted on all transactions.

The platform combines the real-time risk engine of the open-source OpenGamma Platform with the Redis in-memory data store – created by Salvatore Sanfilippo and sponsored first by VMware and later by Pivotal Software – to replicate the initial margin and variation margin calculation methodologies used by central counterparties. To help clearing members avoid multiple integrations with central counterparties, the platform includes a single application programming interface that provides access to all supported central counterparty margin methodologies.
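The article does not publish OpenGamma's actual interface, but the architecture it describes – one API fronting several CCP methodologies, with results held in an in-memory store – can be sketched as follows. All names and the toy methodologies are illustrative assumptions, and a plain dictionary stands in for Redis:

```python
from typing import Callable, Dict

# Hypothetical sketch, not OpenGamma's API: a single entry point that
# dispatches to per-CCP margin methodologies and caches results in an
# in-memory store (a dict standing in for Redis).

class MarginService:
    def __init__(self) -> None:
        self._methodologies: Dict[str, Callable[[dict], float]] = {}
        self._cache: Dict[str, float] = {}  # in-memory store, Redis-style

    def register(self, ccp: str, calc: Callable[[dict], float]) -> None:
        """Register one central counterparty's initial-margin methodology."""
        self._methodologies[ccp] = calc

    def initial_margin(self, ccp: str, portfolio_id: str, portfolio: dict) -> float:
        """One API call for any supported CCP; cached for intraday reuse."""
        key = f"{ccp}:{portfolio_id}"
        if key not in self._cache:
            self._cache[key] = self._methodologies[ccp](portfolio)
        return self._cache[key]

# Two toy methodologies standing in for real CCP calculations.
svc = MarginService()
svc.register("CCP-A", lambda p: 0.05 * p["notional"])
svc.register("CCP-B", lambda p: 0.07 * p["notional"])

im = svc.initial_margin("CCP-A", "pf1", {"notional": 1_000_000})
print(im)  # 50000.0
```

The point of the single dispatch layer is the one the article makes: a clearing member integrates once and reaches every supported methodology through the same call.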

Based on the speed of calculation supported by in-memory data management, the platform can provide intraday re-margining of portfolios and respond to swap execution facility pings in real time.

Mas Nakachi, CEO of OpenGamma, says: “We developed the margining platform by enhancing the value at risk process supported by our risk engine and working with market participants to make calculations faster and more precise. The in-memory technology is key to real-time processing and the speed of making intraday initial margin calculations.”
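The value-at-risk process Nakachi refers to can be illustrated, in heavily simplified form, by a historical-simulation VaR: the initial margin is the portfolio loss at a chosen confidence level over a look-back window. The P&L series and confidence level below are made up for the example; real CCP methodologies are considerably more involved:

```python
import math

# Illustrative historical-simulation VaR as an initial-margin proxy.
# Inputs are assumptions for the example, not a CCP methodology.

def historical_var(pnl: list[float], confidence: float = 0.99) -> float:
    """Return the loss at the given confidence level (positive number)."""
    losses = sorted(-x for x in pnl)              # losses as positive values, ascending
    idx = math.ceil(confidence * len(losses)) - 1  # index of the confidence quantile
    return max(losses[idx], 0.0)

# Hypothetical daily P&L history for a small swaps portfolio.
pnl = [120.0, -80.0, 45.0, -200.0, 10.0, -60.0, 30.0, -150.0, 5.0, -90.0]
print(historical_var(pnl, confidence=0.9))  # 150.0
```

In-memory data management matters here because intraday re-margining means repeating this kind of calculation across every portfolio each time positions or prices move, fast enough to answer swap execution facility pings.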

The margining platform is designed for on-premises use and is being piloted with a number of clearing houses and clearing member firms. Once the platform is in production, Nakachi expects clearing members to offer some of its capabilities, such as the ability to analyse the margin and collateral impact of proposed trades and conduct what-if scenario analysis, to end clients as value-added tools.

Looking forward and wearing his open source hat, Nakachi expects open source solutions to continue to gain traction as more markets become electronic.
