
Andrew’s Blog – The Hard Slog That Is Valuations


Great session on transparency in OTC securities at Andaz this week at a breakfast hosted by Interactive Data and featuring a practitioner panel moderated by our own Angela Wilbraham. It’s always refreshing to hear a frank exchange of views, and the panel – including BNP Paribas’ Peter Nowell and HSBC’s Chris Johnson – gave an attentive audience a thorough working through of the transparency issues within OTC markets.

In so doing, the panel hit on the base conundrum facing those tasked with pricing illiquid or hard-to-price securities – like structured products or even off-the-runs – and the suppliers whose role it is to help them.

A survey of clients conducted by Interactive Data highlighted some of the key drivers that explain why transparency has become such a hot potato. High on the list are risk management and client demands for greater understanding of the methodology behind an evaluated price. There is significant variation in the types of inputs, assumptions and models that make up an evaluated price. The range of variation is well illustrated by the fair value hierarchy set out in accounting rules like IFRS and Topic 820.

Back in the day – pre-Lehman, that is – the holders of securities enjoyed a stable if not perfect situation for pricing the full range of instruments on their books. Under various local accounting regulations, Level I securities – equities and other exchange-traded instruments – derived their prices pretty much in real time from exchange-based activities: quoting, buying and selling. A data feed.

For Level II securities – those not traded on exchanges but covered by aggregation services like Xtrakter here in London or perhaps Finra Trace in the US – there was universal acceptance, by regulators, auditors and clients, of the validity of those services’ information.

And finally, for Level III securities – those OTC instruments that lack the liquidity to generate a meaningful market price on a regular basis – fund managers, their administrators and their brokers devised sophisticated models for coming up with what they considered to be a reasonable or fair value.
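
To make the hierarchy concrete, here is a minimal sketch, in Python, of how a pricing team might bucket securities into these levels. It is an illustration only: the field names (has_exchange_quote, aggregator_price, model_price) are hypothetical, and real fair value classification under IFRS or Topic 820 turns on far more than whether a price happens to be available.

    # Illustrative sketch only: a simplified fair value level assignment,
    # loosely following the Level I/II/III hierarchy described above.
    # Field names are hypothetical, not any vendor's actual schema.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Security:
        identifier: str
        has_exchange_quote: bool           # live quote from a listed market
        aggregator_price: Optional[float]  # e.g. a consolidated OTC trade source
        model_price: Optional[float]       # output of an internal valuation model

    def fair_value_level(sec: Security) -> str:
        """Assign a simplified fair value level to a security."""
        if sec.has_exchange_quote:
            return "Level I"    # observable, exchange-traded price
        if sec.aggregator_price is not None:
            return "Level II"   # observable inputs from an aggregation service
        if sec.model_price is not None:
            return "Level III"  # unobservable inputs, model-derived value
        return "Unpriced"       # nothing available; flag for the pricing team

    print(fair_value_level(Security("XS0000000000", False, 99.25, None)))  # Level II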

Today, the base landscape is the same. But at the Level III end of the curve, things are different. Those sophisticated models being used to value hard-to-price securities are under such intense scrutiny that firms are seeking to recategorise the securities they hold wherever possible, shifting them up the transparency curve toward Level I qualification as best they can.

This new spotlight is the obvious outcome of the much increased profile of instrument pricing. Pricing policies were established or rewritten across the buy-side in the period 2004-2008, which helped to ensure the governance was in place to manage the pricing of illiquid securities. Pricing teams routinely face scrutiny – including face-to-face meetings – from auditors, trustees and clients as part of the governance process.

Furthermore, the volume of price exceptions has increased in the last three months and is close to the high levels previously experienced in late 2008 and early 2009. Alignment is needed between fund administrators on the preferred pricing hierarchies they use, particularly for bonds, to help create price consistency across the buy-side.
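
For readers unfamiliar with the mechanics, a pricing hierarchy is essentially an ordered list of preferred sources, with deviations beyond a tolerance flagged as price exceptions for review. The sketch below assumes a simple waterfall and a flat 2% tolerance; the source names and the threshold are illustrative assumptions, not a buy-side standard.

    # Minimal sketch of a pricing hierarchy ("waterfall") with a simple
    # exception check. Source names and the 2% tolerance are assumptions
    # for illustration, not a prescribed industry rule.

    HIERARCHY = ["evaluated_vendor", "broker_quote", "model"]  # preferred order
    TOLERANCE = 0.02  # flag an exception if other sources disagree by more than 2%

    def select_price(prices: dict) -> tuple:
        """Take the first available price in the hierarchy and flag an
        exception if any other source deviates beyond the tolerance."""
        chosen_source = next((s for s in HIERARCHY if prices.get(s) is not None), None)
        if chosen_source is None:
            return None, None, True  # no price at all: always an exception
        chosen = prices[chosen_source]
        exception = any(
            abs(p - chosen) / chosen > TOLERANCE
            for s, p in prices.items()
            if s != chosen_source and p is not None
        )
        return chosen, chosen_source, exception

    price, source, flagged = select_price(
        {"evaluated_vendor": 101.4, "broker_quote": 97.8, "model": None}
    )
    print(price, source, flagged)  # 101.4 evaluated_vendor True (roughly a 3.5% gap)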

In some markets, the age-old practice of simply rolling over a price from a previous instance is dying out, with practitioners preferring to reduce the number of securities they routinely price rather than provide a stale price. That makes the work of valuations service providers like Interactive Data that much harder, with fewer securities covered by bank pricing teams.
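
For completeness, a rolled-over price is easy to detect mechanically: if the last genuine update is older than some threshold, the price is treated as stale. The sketch below assumes a five-day cut-off, purely for illustration.

    # Illustrative staleness check: treat a price as "rolled over" if it has
    # not changed for more than N days. The five-day threshold is an
    # assumption for the example, not an industry rule.

    from datetime import date

    MAX_UNCHANGED_DAYS = 5

    def is_stale(last_changed: date, as_of: date) -> bool:
        """Flag a price whose last genuine update is older than the threshold."""
        return (as_of - last_changed).days > MAX_UNCHANGED_DAYS

    print(is_stale(date(2010, 5, 3), date(2010, 5, 14)))  # True: 11 days old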

But where there’s muck there’s brass, and Interactive Data has high hopes that its Vantage service – with its newly added Finra Trace data – will go some way toward meeting the needs of hard-pressed pricing teams.

With regulators and others breathing down pricing teams’ necks, pressure for greater transparency seems to be translating into a long, hard slog for those responsible for assigning fair value on the world’s portfolios of OTC instruments. It’s some responsibility and a thankless task. And it could be unsustainable in the long term.

What’s the solution?

The Holy Grail would appear to be a single price based on generally accepted processes for coming up with that price. Then, when there’s a challenge or conflict between different valuations data sources, practitioners want to be able to drill down into the detail of the pricing methodology to reach a view.

Given the growing number of valuations suppliers coming to market, this Holy Grail is unlikely to be achieved anytime soon. There was talk of an effort among fund administrators to establish a set of best practices or accepted processes for pricing. We’ll look into how this is proceeding and report back soon.

In the meantime, however, fund administrators and their clients will have to put in the hard work required to deal with multiple valuations sources, and provide the transparency of methodology that satisfies everyone’s due diligence needs.

Providers like Interactive Data are stepping in to offer the deep-dive analysis of supporting data for any and all valuations. The challenge will be how – in these times of pressure on pricing teams to reduce the number of sources they use – to make the offering commercially compelling for those who don’t want all the data all of the time.
