The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

Transparency, Granularity of Data and Coverage are Key Issues in Current Market, Says Valuations Vendor Community

The financial crisis has resulted in a flurry of activity in the valuations vendor community to take into account the market requirements for increased transparency and greater granularity of data, agreed speakers at last month’s Valuation and Risk 2009 conference in London. Ian Blance, industry consultant and conference chair, told delegates that it is no longer acceptable to devolve the valuations function completely to a single third party provider and financial institutions are now being compelled to prove that they understand the modelling behind the prices.

“Regulations such as FAS 157 are driving the requirement for increased transparency and independence around valuations,” said Blance. “Firms must take in additional pricing feeds and retain knowledge in-house to understand where the prices have been derived from.”

Despite the compulsion for spending in this area, the vendor community has not had it easy, he continued: “Cost pressures are also leading to a degree of frustration with vendors because firms want more out of their providers for the same money. This has led to many alliances and acquisitions across the vendor community in order to meet the demands of the market.”

The need for more data is being driven by risk management requirements, and net asset value (NAV) calculations are increasing in frequency and intensity of usage, added Chris Sier, director of Alpha Financial Markets Consulting. The hedge fund space in particular is looking for reliable data from its counterparties, and there is more pushback going on in the market to find out where the numbers are being derived from.

Peter Cotton, CEO of credit derivatives specialist vendor Julius Finance, seconded this notion: “There is a lot of investment going into providing more transparency at the moment and, to this end, we’ve tried to provide the underlying simulation for valuations and add intellectual rigour to the process. This also results in an arbitrage-free process.”

Other panellists on the models and analytics panel highlighted their own attempts to provide more insight into the analytics and modelling around the valuations that they provide. Eric Benhamou, CEO of Pricing Partners, discussed his firm’s endeavours to educate its clients about the details of the models that are used. The vendor launched a source code solution and development platform for its flagship derivatives pricing and risk management solution Price-it in April, with a view to providing greater transparency. “The provision of this data in scripting language means that our solution is not a black box, there is full transparency into the methodology,” he claimed.

Rohan Douglas, CEO of Quantifi Solutions, added that firms are looking for a greater level of transparency to provide comfort that the models being used are “reasonable”. “They need to match what the market is doing, for example match what JPMorgan’s credit default swap (CDS) pricing engine is doing,” he explained. At the start of the year, JPMorgan agreed to hand over its proprietary CDS pricing engine to the International Swaps and Derivatives Association (ISDA) as an open source platform with a view to improving transparency in the market and deterring regulators from stepping in.
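None of the engines discussed disclose their internals here, but a toy sketch gives a sense of the kind of calculation a CDS pricing engine performs. This is a simplified textbook approximation under an assumed flat hazard rate and flat discount rate, not the ISDA/JPMorgan standard model; the function name and parameters are illustrative only.

```python
import math

def cds_par_spread(hazard, recovery, disc_rate, maturity_yrs, freq=4):
    """Approximate the par spread of a CDS under a flat hazard rate.

    Survival probability S(t) = exp(-hazard * t);
    discount factor      D(t) = exp(-disc_rate * t).
    Par spread = PV(protection leg) / PV(risky annuity).
    """
    dt = 1.0 / freq
    n = int(maturity_yrs * freq)
    annuity = 0.0     # PV of paying 1 unit of spread per year until default
    protection = 0.0  # PV of receiving (1 - recovery) on default
    for i in range(1, n + 1):
        t_prev, t = (i - 1) * dt, i * dt
        surv_prev = math.exp(-hazard * t_prev)
        surv = math.exp(-hazard * t)
        # Premium accrues only while the reference entity survives
        annuity += dt * math.exp(-disc_rate * t) * surv
        # Default in (t_prev, t] pays (1 - recovery), discounted to today
        protection += ((1 - recovery)
                       * math.exp(-disc_rate * (t_prev + t) / 2)
                       * (surv_prev - surv))
    return protection / annuity

# With a 2% flat hazard rate and 40% recovery, the par spread is close
# to the rule of thumb (1 - recovery) * hazard = 120 basis points.
spread = cds_par_spread(hazard=0.02, recovery=0.4,
                        disc_rate=0.03, maturity_yrs=5)
```

The point of the example is that even a minimal engine embeds modelling choices (hazard-rate shape, recovery assumption, discounting), which is exactly why firms want visibility into the models rather than a black-box price.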

Sier contended that as well as developing their own models, financial institutions are also investing in third party valuations solutions in order to emphasise their independence and transparency. However, Blance pointed to an A-Team Group survey conducted in September last year, which indicated that the more complex an asset type, the less likely respondents were to be using a third party valuations provider. “There is still a lack of penetration of commercial valuations models in certain corners of the market, although this is likely to change over time,” said Blance.

The strategic decision making behind valuations is also gradually changing as a result of its high profile, agreed the panel. The pressure for daily margining and collateral management has meant that more pricing data is required at a faster pace and the current ‘right size’ teams in place in most firms are no longer adequate.

Quantifi’s Douglas commented: “There has always been the issue of different parts of a bank requiring different data sets and models and it is a real challenge to consolidate these because of the often competing requirements between departments. For example, to some areas independence is paramount, whereas other areas are not so concerned with this and need greater speed of information.”

Most vendor panellists seemed to indicate that their own strategies to deal with the pressures of the current market climate centre on a partnership approach. Pricing Partners’ tie-up with NYSE Euronext-owned Prime Source is a case in point. Benhamou explained that the vendor is keen to partner because of its position as a niche vendor with a relatively small distribution network. “By partnering with larger firms we are able to increase our distribution network and add value to our customers by partnering with analytics vendors such as Misys for its Summit solution set,” he said.

A number of panellists on the cash market evaluations panel, on the other hand, indicated that their strategy in the current tough climate is to react to market demand. Lydia Galasean, quantitative analyst for SIX Telekurs, indicated that the vendor would introduce new coverage and functionality once its user community has requested it. “It is a completely demand driven process,” she explained.

Fellow panellist Peter Jones, global head of valuations scenario services for Standard & Poor’s, contended that this was not the tactic that his own firm has adopted. “The market isn’t sure what it needs at the moment and the vendor community is there to lead them to the solutions that are most appropriate to meet the requirements of the market,” he explained. “The Fixed Income Risk Management Services (FIRMS) business of S&P is dedicated to defining these requirements.”

Whatever the approach, the conference indicated that the vendor community is certainly busy at the moment, whether responding directly to market demands, forging partnerships, pre-empting user requirements, or all three.
