The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

Greater Importance Being Placed on Assumptive Data for Valuations, Agree DMRAV Panellists

Increased importance is being placed on the assumptive data inputs that feed pricing and valuation calculations, according to panellists at last month’s A-Team Group Data Management for Risk, Analytics and Valuations (DMRAV) conference in NYC. This data is required as a result of post-crisis changes to the fair value accounting rules made by both the Financial Accounting Standards Board (FASB) and the International Accounting Standards Board (IASB).

Last month, the two bodies finally agreed on a global definition of “fair value”, and firms will now be compelled to disclose more about their Level 3 assets, which comprise hard-to-value, illiquid or riskier assets that require evaluated pricing methodologies rather than market prices. Firms will therefore have to disclose more about the models, processes and assumptions that have gone into the fair valuation methodology used to price these assets.

Firms will also have to disclose whether any of the securities they value have moved from the Level 1 bracket to Level 2, or vice versa. This is to ensure the regulatory community is apprised of these market dynamics and any potential systemic risks in the making are flagged.
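The hierarchy and transfer disclosures described above can be sketched in code. The following is a minimal, hypothetical illustration of the FASB/IASB fair value hierarchy logic; the class, function names and classification criteria are simplified assumptions for clarity, not any vendor's or regulator's actual model.

```python
from dataclasses import dataclass

# Illustrative sketch of the fair value hierarchy (FASB/IASB):
# Level 1: quoted prices in active markets for identical assets;
# Level 2: other observable inputs (e.g. quotes for similar assets);
# Level 3: unobservable, assumptive inputs requiring evaluated pricing.

@dataclass
class Instrument:
    name: str
    has_active_market_quote: bool
    has_observable_inputs: bool

def fair_value_level(inst: Instrument) -> int:
    """Classify an instrument into the fair value hierarchy (simplified)."""
    if inst.has_active_market_quote:
        return 1
    if inst.has_observable_inputs:
        return 2
    # Illiquid / hard-to-value: assumptive inputs, subject to
    # the expanded Level 3 disclosure requirements.
    return 3

def flag_transfers(prev_levels: dict, curr_levels: dict) -> list:
    """Report instruments whose hierarchy level changed between periods --
    the kind of movement (e.g. Level 1 to Level 2) firms must now disclose."""
    return [(name, prev_levels[name], curr_levels[name])
            for name in prev_levels
            if name in curr_levels and prev_levels[name] != curr_levels[name]]

# Usage: a bond with no active market quote but observable inputs is Level 2.
bond = Instrument("corp_bond_A", has_active_market_quote=False,
                  has_observable_inputs=True)
print(fair_value_level(bond))  # 2
```

In practice the classification depends on detailed observability assessments of each pricing input, which is precisely why panellists stress transparency into vendors' evaluated pricing methodologies.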

These developments were noted by DMRAV panellists. John Lynch, global head of pricing at Alliance Bernstein (who previously worked for Bloomberg), said: “The more assumptive data you have to hand, the better armed you are to be able to meet client and auditor questions about pricing and valuations. We are involved in increasingly academic conversations about this data.”

Firms such as Alliance Bernstein are therefore putting pressure on their vendor partners to provide a higher level of granular data about the inputs into and the methodologies used to determine an evaluated price. Lynch highlighted the importance of this data, given that pricing is critical for trading opportunities and feeds mission critical risk systems. Transparency into these prices is therefore paramount, Lynch explained: “It all comes back to the vendors being able to provide information about the colour that has been added to the pricing data and how the price relates to those for other securities.”

SIX Telekurs’ head of evaluated pricing research and development Perry Beaumont indicated that his firm has seen a much greater focus from client firms on the Level 1, 2 and 3 data inputs, in order to enable clear differentiation between these categories. Brian Buzzelli, head of pricing and reference data for the Americas region at rival vendor Thomson Reuters, noted a particularly strong push to this end in the fixed income and derivatives space, given the level of regulatory scrutiny being directed at these markets. This, he said, has resulted in the need for more “credible ingredients” to go into the pricing mix via “higher dimensions of transparency”.

Mark Abramowitz, director of US taxables at S&P Securities Evaluations, added that the desire to “kick the tyres” of the pricing models has significantly increased over recent years.

The increased pressure for data timeliness was also noted by vendor panellists, all of whom indicated that they have received more requests for intraday pricing recently than ever before. Buzzelli added, however, that a balance between timeliness and accuracy needs to be struck in order to ensure the reliability of pricing data.

This focus on data quality and reliability was, in fact, a recurring theme throughout the conference, with many other speakers highlighting the new industry imperative to prioritise data management concerns.
