The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

Greater Importance Being Placed on Assumptive Data for Valuations, Agree DMRAV Panellists

Increased importance is being placed on providing the assumptive data inputs that go into pricing and valuation calculations, according to panellists at last month’s A-Team Group Data Management for Risk, Analytics and Valuations (DMRAV) conference in NYC. This data is required as a result of post-crisis changes to fair value accounting rules made by both the Financial Accounting Standards Board (FASB) and the International Accounting Standards Board (IASB).

Last month, the two bodies finally agreed on a global definition of “fair value”, and firms will now be compelled to disclose more about their Level 3 assets, which comprise hard-to-value, illiquid or riskier assets that require evaluated pricing methodologies rather than market prices. Firms will therefore have to disclose more about the models, processes and assumptions that have gone into the fair valuation methodology used to price these assets.

Firms will also have to disclose whether any of the securities they value have moved from the Level 1 bracket to Level 2, or vice versa. This is to ensure the regulatory community is apprised of these market dynamics and any potential systemic risks in the making are flagged.

These developments were noted by DMRAV panellists and John Lynch, global head of pricing at Alliance Bernstein (who previously worked for Bloomberg), said: “The more assumptive data you have to hand, the better armed you are to be able to meet client and auditor questions about pricing and valuations. We are involved in increasingly academic conversations about this data.”

Firms such as Alliance Bernstein are therefore putting pressure on their vendor partners to provide more granular data about the inputs to, and the methodologies used to determine, an evaluated price. Lynch highlighted the importance of this data, given that pricing is critical for identifying trading opportunities and feeds mission-critical risk systems. Transparency into these prices is therefore paramount, Lynch explained: “It all comes back to the vendors being able to provide information about the colour that has been added to the pricing data and how the price relates to those for other securities.”

SIX Telekurs’ head of evaluated pricing research and development Perry Beaumont indicated that his firm has seen a much greater focus on Level 1, 2 and 3 data inputs from firms seeking to clearly differentiate between these categories. Brian Buzzelli, head of pricing and reference data for the Americas region at rival vendor Thomson Reuters, noted a particularly strong push to this end in the fixed income and derivatives space, given the level of regulatory scrutiny being directed at these markets. This scrutiny has resulted in the need for more “credible ingredients” to go into the pricing mix via “higher dimensions of transparency”.

Mark Abramowitz, director of US taxables at S&P Securities Evaluations, added that the desire to “kick the tyres” of the pricing models has significantly increased over recent years.

The increased pressure for timeliness of data was also noted by vendor panellists, all of whom indicated that they have received more requests for intraday pricing recently than ever before. Buzzelli added, however, that a balance between timeliness and accuracy needs to be struck in order to ensure the reliability of pricing data.

This focus on data quality and reliability was, in fact, a recurring theme throughout the conference, with many other speakers highlighting the new industry imperative to prioritise data management concerns.
