About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Divisions Between Market and Reference Data Causing Issues for Transparency, Says GoldenSource


A major hurdle to achieving a consistent view of pricing, positions and exposure lies in the divide between market data feeds and reference data repositories, according to GoldenSource, a provider of enterprise data management (EDM) solutions.

“By definition, enterprise risk management and reporting should go front to back as well as across assets,” says Gert Raeves, vice president of strategic business development at GoldenSource. “A trusted data environment for interdependent front and back office processes now relies on the centralisation of market and reference data. With market data, there is a convergence point where a snapshot of the market can be captured and validated for downstream applications.”

Pricing mechanisms in the front office and valuations for P&L in the back office often depend on the same price and rate data, yet both areas derive this information from disparate sources. According to research from consultancy A-Team Group, 87% of risk managers are interested in centralising market and reference data to address issues including volatility, accuracy, consistency and the elimination of asset-based silos.

Unprecedented market movements are impacting portfolios, customers, counterparties and issuers faster than ever, meaning back office risk and P&L systems can no longer wait until end of day to access critical market data. Regulators are expecting much of the same data used in the trading decision process to be extended to valuation, P&L and risk management, thus bringing the same issues around accuracy and consistency to both data areas.

Regulatory and client demands for insight into how a value was derived mean that golden copy pricing needs to extend beyond static reference sets and include multiple market data sources, says the vendor. Furthermore, siloed desks have traditionally been blamed for the inability to get a complete view of risk and exposure; these silos are now being broken down, yet the divide between market and reference data persists.
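The idea of a golden copy price that records how a value was derived can be illustrated with a minimal sketch. The function and field names below (`derive_golden_price`, `GoldenPrice`, the tolerance-based outlier check) are hypothetical, not GoldenSource's implementation: the sketch simply consolidates quotes from multiple vendor feeds into a single validated price while retaining every input for lineage.

```python
from dataclasses import dataclass, field
from statistics import median

@dataclass
class GoldenPrice:
    """A consolidated price plus the raw inputs it was derived from."""
    instrument: str
    value: float
    sources: dict = field(default_factory=dict)  # vendor -> quote, kept for lineage

def derive_golden_price(instrument: str, quotes: dict, tolerance: float = 0.01) -> GoldenPrice:
    """Consolidate vendor quotes into one golden price.

    Quotes more than `tolerance` (as a fraction) away from the
    cross-vendor median are rejected as stale or erroneous; the
    golden value is the median of the remaining quotes. All inputs
    are recorded so the derivation can be shown to a regulator or client.
    """
    mid = median(quotes.values())
    accepted = {v: p for v, p in quotes.items() if abs(p - mid) / mid <= tolerance}
    return GoldenPrice(instrument, median(accepted.values()), sources=dict(quotes))

# Example: VendorC's stale quote is excluded from the golden value,
# but it still appears in the lineage record.
quotes = {"VendorA": 101.02, "VendorB": 101.05, "VendorC": 99.00}
gp = derive_golden_price("XYZ Corp 5Y bond", quotes)
```

A real pipeline would of course layer on asset-class-specific validation rules and timestamps, but the core point stands: the golden value and its full derivation travel together to downstream valuation, P&L and risk systems.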

