Great session on transparency in OTC securities at Andaz this week, at a breakfast hosted by Interactive Data and featuring a practitioner panel moderated by our own Angela Wilbraham. It’s always refreshing to hear a frank exchange of views, and the panel – including BNP Paribas’ Peter Nowell and HSBC’s Chris Johnson – gave an attentive audience a thorough working-through of the transparency issues within OTC markets.
In so doing, the panel hit on the base conundrum facing those tasked with pricing illiquid or hard-to-price securities – like structured products or even off-the-runs – and the suppliers whose role it is to help them.
A survey of clients conducted by Interactive Data highlighted some of the key drivers that explain why transparency has become such a hot potato. High on the list are risk management and client demands for greater understanding of the methodology behind an evaluated price. There is significant variation in the types of inputs, assumptions and models that make up an evaluated price. The range of variation is well illustrated by the hierarchy of securities within accounting rules, like IFRS and Topic 820.
Back in the day – pre-Lehman, that is – the holders of securities enjoyed a stable if not perfect situation for pricing the full range of instruments on their books. Under various local accounting regulations, Level I securities – equities and other exchange-traded instruments – derived their prices pretty much in real time from exchange-based activity: quoting, buying and selling. A data feed.
For Level II securities – those not traded on exchanges but covered by aggregation services like Xtraktr here in London or perhaps Finra Trace in the US – there was universal acceptance, by regulators, auditors and clients alike, of the validity of those services’ information.
And finally, for Level III securities – those OTC instruments that lack the liquidity to generate a meaningful market price on a regular basis – fund managers, their administrators and their brokers devised sophisticated models for coming up with what they considered to be a reasonable or fair value.
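The three-tier hierarchy described above can be sketched in code. This is a purely illustrative toy – the field names and the classifier itself are hypothetical, not any regulator's or vendor's actual schema – but it captures the decision logic behind Topic 820/IFRS-style levelling: exchange quotes first, observable OTC trade data second, model-based valuation last.

```python
# Toy sketch of the fair value hierarchy (Level I / II / III) discussed above.
# All names and fields are hypothetical, for illustration only.
from dataclasses import dataclass

@dataclass
class Security:
    identifier: str
    exchange_listed: bool    # Level I candidate: live quotes from an exchange
    aggregated_trades: bool  # Level II candidate: observable OTC trade data
                             # (e.g. via an aggregation service)

def fair_value_level(sec: Security) -> int:
    """Assign a fair value hierarchy level from the observability of inputs."""
    if sec.exchange_listed:
        return 1   # quoted prices in active markets
    if sec.aggregated_trades:
        return 2   # observable inputs other than quoted prices
    return 3       # unobservable inputs: model-based 'fair value'

print(fair_value_level(Security("VOD.L", True, False)))        # → 1
print(fair_value_level(Security("XS_CORP_BOND", False, True))) # → 2
print(fair_value_level(Security("CDO_TRANCHE", False, False))) # → 3
```

The point of the sketch is how little it takes to fall to Level III: no exchange listing and no observable trades, and a holding drops into model-valuation territory – which is exactly why firms are keen to shift instruments up the curve wherever the inputs allow.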
Today, the base landscape is the same. But at the Level III end of the curve, things are different. Those sophisticated models being used to value hard-to-price securities are under such intense scrutiny that firms are seeking to recategorise the securities they hold wherever possible, shifting them up the transparency curve toward Level I qualification as best they can.
This new spotlight is the obvious outcome of the much-increased profile of instrument pricing. Pricing policies were established or rewritten across the buy-side between 2004 and 2008, helping to ensure that governance was in place to manage the pricing of illiquid securities. Pricing teams now routinely face scrutiny – including face-to-face meetings – from auditors, trustees and clients as part of the governance process.
Furthermore, the volume of price exceptions has increased over the last three months and is now close to the high levels previously experienced in late 2008 and early 2009. Alignment is needed among fund administrators on the preferred pricing hierarchies they use, particularly for bonds, to help create price consistency across the buy-side.
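A price exception of the kind mentioned above is typically raised when two valuation sources diverge beyond a tolerance, or when a secondary price is missing altogether. The following is a minimal sketch of such a check – the tolerance, source names and prices are all invented for illustration, not drawn from any actual administrator's workflow.

```python
# Hypothetical sketch of a daily price-exception check: compare a primary
# evaluated price against a secondary source and flag tolerance breaches.
# Thresholds, identifiers and prices are illustrative only.

def price_exceptions(primary: dict, secondary: dict, tolerance: float = 0.02):
    """Flag securities whose two source prices diverge by more than
    `tolerance` (as a fraction of the primary price), or that lack
    a secondary price entirely."""
    exceptions = []
    for sec_id, p1 in primary.items():
        p2 = secondary.get(sec_id)
        if p2 is None:
            exceptions.append((sec_id, "missing secondary price"))
        elif abs(p1 - p2) / p1 > tolerance:
            exceptions.append((sec_id, f"divergence {abs(p1 - p2) / p1:.1%}"))
    return exceptions

primary = {"BOND_A": 101.25, "BOND_B": 98.40, "BOND_C": 102.00}
secondary = {"BOND_A": 101.30, "BOND_B": 93.10}  # BOND_C uncovered

for sec_id, reason in price_exceptions(primary, secondary):
    print(sec_id, reason)
# BOND_B divergence 5.4%
# BOND_C missing secondary price
```

Every flagged line here means manual work for a pricing team – chasing the vendor, justifying the chosen price to auditors – which is why rising exception volumes translate directly into the pressure the panel described.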
In some markets, the age-old practice of simply rolling over a price from a previous instance is dying out, with practitioners preferring to reduce the number of securities they routinely price rather than provide a stale price. That makes the work of valuations service providers like Interactive Data that much harder, with fewer securities covered by bank pricing teams.
But where there’s muck there’s brass, and Interactive Data has high hopes that its Vantage service – with its newly added Finra Trace data – will go some way toward meeting the needs of hard-pressed pricing teams.
With regulators and others breathing down pricing teams’ necks, pressure for greater transparency seems to be translating into a long, hard slog for those responsible for assigning fair value on the world’s portfolios of OTC instruments. It’s some responsibility and a thankless task. And it could be unsustainable in the long term.
What’s the solution?
The Holy Grail would appear to be a single price based on generally accepted processes for coming up with that price. Then, when there’s a challenge or conflict between different valuations data sources, practitioners want to be able to drill down into the detail of the pricing methodology to reach a view.
Given the growing number of valuations suppliers coming to market, this Holy Grail is unlikely to be achieved anytime soon. There was talk of an effort among fund administrators to establish a set of best practices or accepted processes for pricing. We’ll look into how this is proceeding and report back soon.
In the meantime, however, fund administrators and their clients will have to put in the hard work required to deal with multiple valuations sources, and provide the transparency of methodology that satisfies everyone’s due diligence needs.
Providers like Interactive Data are stepping in to offer the deep-dive analysis of supporting data for any and all valuations. The challenge will be how – in these times of pressure on pricing teams to reduce the number of sources they use – to make the offering commercially compelling for those who don’t want all the data all of the time.