About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

OpenGamma Includes Redis In-Memory Technology in Margining Platform


OpenGamma has combined the Redis in-memory data management technology with its own analytics and risk engine to create a margining platform offering. The new platform is designed to support fast and precise calculations, allowing clearing members to meet the initial margin calculation requirements of central counterparties on OTC swap transactions.

OpenGamma developed the new margining platform in response to structural changes in the OTC derivatives market, including the mandate that all transactions must be centrally cleared rather than settled bilaterally, a change that results in the requirement for initial margin to be posted on all transactions.

The platform combines the real-time risk engine of the open-source OpenGamma Platform with the Redis in-memory data store – which was developed by VMware and is maintained by Pivotal Software – to support replication of the initial margin and variation margin calculation methodologies that are used by central counterparties. To help clearing members avoid multiple integrations with central counterparties, the platform includes a single application programming interface to provide access to all supported central counterparty margin methodologies.
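The single-API idea described above can be sketched in a few lines. This is an illustrative design, not OpenGamma's actual interface: the class and method names are invented, the per-CCP "methodologies" are toy haircut rules, and a plain dict stands in for the Redis in-memory cache.

```python
# Hypothetical sketch: one API fronting multiple CCP margin methodologies,
# with an in-memory cache (a dict standing in for Redis) so repeated
# requests for the same portfolio avoid recomputation.

class MarginAPI:
    def __init__(self):
        self._methodologies = {}  # registry of per-CCP calculations
        self._cache = {}          # Redis stand-in: computed results

    def register(self, ccp, methodology):
        """Plug in the initial margin calculation for one CCP."""
        self._methodologies[ccp] = methodology

    def initial_margin(self, ccp, portfolio_id, exposures):
        """Compute (or fetch cached) initial margin for a portfolio."""
        key = (ccp, portfolio_id)
        if key not in self._cache:
            self._cache[key] = self._methodologies[ccp](exposures)
        return self._cache[key]

# Two toy methodologies: each CCP applies its own (invented) rule.
api = MarginAPI()
api.register("CCP_A", lambda exp: round(sum(exp) * 0.05, 2))
api.register("CCP_B", lambda exp: round(max(exp) * 0.20, 2))

exposures = [1_000_000, 250_000, 500_000]
print(api.initial_margin("CCP_A", "PF1", exposures))  # 87500.0
print(api.initial_margin("CCP_B", "PF1", exposures))  # 200000.0
```

A clearing member integrates once against `initial_margin` and gains every registered methodology, which is the integration saving the article points to.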

Based on the speed of calculation supported by in-memory data management, the platform can provide intraday re-margining of portfolios and respond to swap execution facility pings in real time.

Mas Nakachi, CEO of OpenGamma, says: “We developed the margining platform by enhancing the value at risk process supported by our risk engine and working with market participants to make calculations faster and more precise. The in-memory technology is key to real-time processing and the speed of making intraday initial margin calculations.”
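For readers unfamiliar with the value at risk process Nakachi mentions, a minimal historical-simulation VaR looks like the following. The P&L figures and the confidence level are invented for illustration; a production risk engine would of course operate on full revalued portfolios, not a toy list.

```python
# Minimal historical value-at-risk sketch: VaR is the loss threshold
# exceeded only (1 - confidence) of the time in the observed history.

def historical_var(pnl_history, confidence=0.99):
    losses = sorted(-p for p in pnl_history)  # positive values = losses
    index = int(confidence * len(losses)) - 1
    return losses[index]

# Ten days of invented daily P&L (negative = loss).
daily_pnl = [-120, 80, -45, 200, -310, 15, -60, 90, -150, 40]
print(historical_var(daily_pnl, confidence=0.9))  # 150
```

Holding intraday portfolio data in memory, as the article describes, is what lets a calculation like this be rerun on every update rather than once per day.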

The margining platform is designed for on-premises deployment and is being piloted with a number of clearing houses and clearing member firms. Once the platform is in production, Nakachi expects clearing members to offer some of its capabilities, such as the ability to analyse the margin and collateral impact of proposed trades and conduct what-if scenario analysis, to end clients as value-added tools.

Looking forward and wearing his open source hat, Nakachi expects open source solutions to continue to gain traction as more markets become electronic.

