About A-Team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

OpenGamma Includes Redis In-Memory Technology in Margining Platform


OpenGamma has combined the Redis in-memory data management technology with its own analytics and risk engine to create a margining platform. The new platform is designed to support fast and precise calculations, allowing clearing members to meet the initial margin calculation requirements of central counterparties on OTC swap transactions.

OpenGamma developed the new margining platform in response to structural changes in the OTC derivatives market, including the mandate that all transactions must be centrally cleared rather than settled bilaterally, a change that results in the requirement for initial margin to be posted on all transactions.

The platform combines the real-time risk engine of the open-source OpenGamma Platform with the Redis in-memory data store – which was developed by VMware and is maintained by Pivotal Software – to support replication of the initial margin and variation margin calculation methodologies that are used by central counterparties. To help clearing members avoid multiple integrations with central counterparties, the platform includes a single application programming interface to provide access to all supported central counterparty margin methodologies.

Based on the speed of calculation supported by in-memory data management, the platform can provide intraday re-margining of portfolios and respond to swap execution facility pings in real time.
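The intraday re-margining pattern can be illustrated with a small sketch: margins are recomputed when positions change and written to an in-memory key-value store, so real-time readers (such as a response to a swap execution facility ping) only do a fast lookup. A plain dict stands in for Redis below, and the margin formula is an assumption for illustration; with the redis-py client, the analogous calls would be `set` and `get`.

```python
# Hypothetical sketch of intraday re-margining against an in-memory store.
# A dict stands in for a Redis instance; the 2%-of-gross-notional margin
# formula is illustrative, not any CCP's methodology.

store = {}  # stands in for Redis

def compute_margin(positions):
    # Illustrative initial margin: 2% of gross notional.
    return sum(abs(n) for n in positions.values()) * 0.02

def remargin(portfolio_id, positions):
    """Recompute on a position change and cache the result."""
    margin = compute_margin(positions)
    store[f"im:{portfolio_id}"] = margin  # e.g. redis.set(key, margin)
    return margin

def current_margin(portfolio_id):
    """Fast read path, e.g. to answer an SEF ping in real time."""
    return store.get(f"im:{portfolio_id}")  # e.g. redis.get(key)

remargin("acct-1", {"IRS-1": 5_000_000})
print(current_margin("acct-1"))  # 100000.0

# A new trade arrives intraday; re-margin and the read path sees it at once.
remargin("acct-1", {"IRS-1": 5_000_000, "IRS-2": -2_000_000})
print(current_margin("acct-1"))  # 140000.0
```

The separation matters: the expensive calculation runs only when the portfolio changes, while latency-sensitive consumers read the cached value from memory.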

Mas Nakachi, CEO of OpenGamma, says: “We developed the margining platform by enhancing the value at risk process supported by our risk engine and working with market participants to make calculations faster and more precise. The in-memory technology is key to real-time processing and the speed of making intraday initial margin calculations.”

The margining platform is designed for on-premises use and is being piloted with a number of clearing houses and clearing member firms. Once the platform is in production, Nakachi expects clearing members to offer some capabilities of the solution, such as the ability to analyse the margin and collateral impact of proposed trades and conduct what-if scenario analysis, to end clients as value-added tools.

Looking forward and wearing his open source hat, Nakachi expects open source solutions to continue to gain traction as more markets become electronic.

