
Fair Value May be a Tricky Proposition But it is Worth the Trouble, Agree Speakers at SIX Telekurs Event

Accounting standards have come under a lot of scrutiny of late, from both the industry and regulators, but despite its flaws, fair value measurement was created for good reason, agreed the panel at SIX Telekurs’ recent “What Price Fair Value?” event in London. However, market volatility has made pricing and fair value measurement a tricky proposition, which is why firms have to work closely with their vendors, said Matthew Cox, head of securities data management in Europe for BNY Mellon Asset Servicing.

“In the fourth quarter of last year the volumes of exceptions within our system went through the roof because exchanges and other trading venues experienced movements of more than 5%, even as high as 10% in some cases,” said Cox, explaining the challenges of portfolio valuations in the current market environment. This volatility triggered the automatic checks and controls around pricing, leaving Cox’s team to deal with five days’ work in one day. “I’m still not sure how we coped with it all,” he added.

The leap in the number of exceptions was certainly a dramatic one for the bank to cope with: Cox elaborated that the team faced around 2,000 exceptions rather than the usual 200. He put its success in dealing with this data challenge down to good management and teamwork, but added that this would not be sustainable in the long term.
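The tolerance controls Cox describes amount to day-over-day movement checks against a threshold. The sketch below is purely illustrative rather than BNY Mellon’s or SIX Telekurs’ actual control logic; the class and function names, the 5% default tolerance and the sample identifiers are all assumptions made for the example.

```python
# Illustrative price-exception check: flag any security whose day-over-day move
# breaches a tolerance band (e.g. the 5% threshold mentioned above).
from dataclasses import dataclass

@dataclass
class PriceObservation:
    security_id: str
    prior_price: float
    current_price: float

def flag_exceptions(observations, tolerance=0.05):
    """Return (security_id, move) pairs whose absolute move exceeds the tolerance."""
    exceptions = []
    for obs in observations:
        if obs.prior_price == 0:
            continue  # cannot compute a relative move from a zero prior price
        move = abs(obs.current_price - obs.prior_price) / obs.prior_price
        if move > tolerance:
            exceptions.append((obs.security_id, round(move, 4)))
    return exceptions

if __name__ == "__main__":
    sample = [
        PriceObservation("SEC-001", 100.0, 104.0),  # 4% move: within tolerance
        PriceObservation("SEC-002", 50.0, 44.5),    # 11% move: flagged as an exception
    ]
    print(flag_exceptions(sample))  # [('SEC-002', 0.11)]
```

Each flagged item would typically be routed to an analyst for review, which is where a jump from 200 to 2,000 exceptions turns into five days’ work in one.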

Mike Jenkins, senior manager at Ernst & Young, also examined the benefits and disadvantages of using fair value accounting in light of the volatile market and the regulatory move towards global standards in this space. He explained that volatility such as that experienced during the crisis could mean fair value is less reliable than other accounting measures, and that it has also been blamed for contributing to the procyclicality of the market. However, he added that the benefit of the global community all using the same fundamental standards for fair value measurement would be increased comparability between jurisdictions.

Jenkins and Richard Newbury, market development manager at data vendor SIX Telekurs, examined the upcoming regulatory changes and how they could impact the financial services community with regards to pricing practices. To this end, Newbury highlighted changes to Basel, MiFID, credit rating agency regulation and the regulation of the alternative investment fund industry as relevant to the space.

In the meantime, Cox recommended working closely with data vendors and solution providers to tackle the issues caused by volatile markets and cope with changes in the pricing environment. His team, which is focused on dealing with corporate actions, securities pricing and securities master file data for BNY Mellon’s transfer agency, custody and middle office operations, has to produce daily net asset value (NAV) calculations and work with vendors to get the data ready on time. “We are very dependent on our external data vendors across our asset classes and we have a two hour window in which to produce the data,” he said.
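For context on what those vendor prices ultimately feed, the sketch below shows the basic arithmetic of a NAV calculation: net assets divided by shares in issue. It is a hypothetical, minimal example rather than BNY Mellon’s methodology, and the holdings, prices, liabilities and share count are invented.

```python
# Minimal NAV sketch: value the holdings with vendor-supplied prices, subtract
# liabilities and divide by the number of shares in issue.

def net_asset_value(holdings, prices, liabilities, shares_in_issue):
    """holdings: {security_id: units held}; prices: {security_id: vendor price}."""
    total_assets = sum(units * prices[sec] for sec, units in holdings.items())
    return (total_assets - liabilities) / shares_in_issue

holdings = {"SEC-001": 10_000, "SEC-002": 2_500}   # hypothetical positions
prices = {"SEC-001": 1.32, "SEC-002": 178.50}      # hypothetical vendor price feed
nav = net_asset_value(holdings, prices, liabilities=5_000.0, shares_in_issue=100_000)
print(round(nav, 4))  # 4.5445
```

A stale or missing vendor price in that feed propagates directly into the published NAV, which is why the two hour window Cox mentions leaves so little room for unresolved exceptions.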

The pressure to manage the market’s time and cost challenges with the same or lower headcount than before the downturn is a struggle for most firms, in the valuations space as in other areas of data management. However, a great deal more attention is being paid to valuations than to many other data areas. As evidence of this, Guy Sears, chair of the event and director of the wholesale division of the Investment Management Association (IMA), indicated that fair value measurement was cited as one of the top three ongoing concerns for his members.

Cox indicated that the idea behind valuations is to eliminate any need for judgement in the process via a structured approach. After all, it should be the front office making judgement calls around what constitutes a good price, not the back office. “Accuracy drives a firm’s reputation and we also need to be timely with this data; however, there is a degree of subjectivity around fair value measurement that makes this data tricky to handle,” he said.

Illiquid markets mean that firms have to deal with stale prices and react to defaulted assets and liquidations in the fair value process. “Estimate” is not a good word in the pricing and valuations world, said Cox. He elaborated with an example involving Northern Rock and single line asset price adjustments, in which BNY Mellon clients each had differing opinions regarding the true value of the bank’s assets. “How do you get a single fair value based on all these opinions?”

Nigel Reynolds, who is in charge of business development, including sales and marketing, at TD Waterhouse Corporate Services, added his own example to the mix: “We look after 22 corporate clients for valuations and we can see from their perspective that they have to rely on brokers to determine how prices are presented. For example, if you look at suspended stocks, some choose to value them at zero whereas others value them at the price at which they were suspended. It varies a great deal from firm to firm.”
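The divergence Reynolds describes can be made concrete with a toy policy switch. The function below is hypothetical and takes no view on which treatment is correct; it simply encodes the two approaches he mentions.

```python
# Hypothetical illustration of two common policies for valuing a suspended stock.

def suspended_stock_value(last_traded_price: float, policy: str) -> float:
    """Value a suspended holding under one of two simple policies."""
    if policy == "zero":
        return 0.0                    # write the position down to nothing
    if policy == "suspension_price":
        return last_traded_price      # carry it at the price at which it was suspended
    raise ValueError(f"unknown policy: {policy}")

print(suspended_stock_value(2.35, "zero"))              # 0.0
print(suspended_stock_value(2.35, "suspension_price"))  # 2.35
```

Two firms holding the same suspended stock can therefore report materially different valuations, which is exactly the firm-to-firm variation Reynolds points to.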

Some firms have fair value committees in place, drawing members from across the business, including representatives from risk and compliance, in order to add an approval process for this sensitive data. However, BNY Mellon has decided against going down this route because it is more suited to the asset management model, said Cox. Fair value measurement necessarily involves a greater degree of control, which means banks working with their vendors and clients to manage this data and their expectations, he explained. “Finding the right price is key to the valuation process and that is the basic objective of fair value measurement,” he added.
