The knowledge platform for the financial technology industry

A-Team Insight Blogs

GoldenSource Releases Market Risk Factor Data Standard, Eases FRTB Compliance


GoldenSource, a provider of Enterprise Data Management (EDM) and Master Data Management (MDM) solutions, has created a market risk factor data standard. Called Curve Master Definitions, the standard seeks to provide investment banks with a single risk factor taxonomy for market rates required to price OTC derivatives, including the storage and aggregation of industry standard conventions required for quantitative processes. This includes yield curve building, volatility surface calculations, and industry standard interpolation methodologies.
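To illustrate the kind of conventions such a taxonomy would standardise, here is a minimal Python sketch of a curve definition record and a log-linear discount-factor interpolation, one common industry-standard interpolation methodology. The field names and `CurveDefinition` structure are purely illustrative assumptions, not GoldenSource's actual Curve Master schema.

```python
import math
from dataclasses import dataclass

@dataclass(frozen=True)
class CurveDefinition:
    """Hypothetical taxonomy entry; fields are illustrative, not the vendor's schema."""
    curve_id: str       # e.g. "USD-SOFR-OIS"
    currency: str       # e.g. "USD"
    day_count: str      # e.g. "ACT/360"
    interpolation: str  # e.g. "log-linear on discount factors"

def log_linear_df(t, pillars, dfs):
    """Interpolate a discount factor at time t (in years) log-linearly
    between curve pillars -- a widely used yield curve building convention."""
    for (t0, d0), (t1, d1) in zip(zip(pillars, dfs), zip(pillars[1:], dfs[1:])):
        if t0 <= t <= t1:
            w = (t - t0) / (t1 - t0)
            return math.exp((1 - w) * math.log(d0) + w * math.log(d1))
    raise ValueError("t outside pillar range")
```

Pinning down which interpolation applies to which curve is exactly the sort of convention a shared taxonomy removes ambiguity from: two banks using "linear" versus "log-linear" on the same pillar quotes will price the same derivative differently.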

The development of Curve Master Definitions was led by the company’s head of market data, quant and risk solutions, Charlie Browne, who is completing a PhD on the merits of taking a taxonomic approach to risk factors in derivative pricing.

He says: “This innovation has the potential to have an impact in the banking sector, including regulators and central banks that are responsible for its oversight. In addition to transforming the way market participants approach trading-book process alignment, the taxonomy could act as a useful tool for regulators and auditors to ensure there is commonality between the data set that underlies all trading book processes, namely market rates.”

The taxonomy allows financial institutions to take a common approach to reviewing and conforming to the trading book processes that are prerequisites of the Fundamental Review of the Trading Book (FRTB) framework, which is due to be implemented on 1 January 2025 in most jurisdictions. These trading book processes include independent price verification, bid-ask reserving, marking to model, adjustments for illiquid positions, stress testing, internal model reviews, and interest rate risk in the banking book.

By providing a structured and consistent approach to defining market risk factors, GoldenSource’s Curve Master Definitions will offer users a more holistic view of their risk factors, as well as an accelerated approach for implementing the core GoldenSource Curve Master module that is designed to centralise and validate the market rates for curves and surfaces that form the set of a bank’s market risk factors.

Beyond helping firms align their FRTB and core trading book processes, a standard taxonomy for market risk factors will allow firms to standardise their approaches to regulatory requirements such as stress testing, internal model reviews, and interest rate risk in the banking book.

