About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Fair Value May Be a Tricky Proposition But It Is Worth the Trouble, Agree Speakers at SIX Telekurs Event


Accounting standards have come under intense scrutiny of late, from both the industry and regulators, but despite its flaws, fair value measurement was created for good reason, agreed the panel at SIX Telekurs’ recent “What Price Fair Value?” event in London. However, market volatility has made pricing and fair value measurement a tricky proposition, which is why firms have to work closely with their vendors, said Matthew Cox, head of securities data management in Europe for BNY Mellon Asset Servicing.

“In the fourth quarter of last year the volumes of exceptions within our system went through the roof because exchanges and other trading venues experienced movements of more than 5%, even as high as 10% in some cases,” said Cox, explaining the challenges of portfolio valuations in the current market environment. This volatility triggered the automatic checks and controls around pricing, forcing Cox’s team to deal with five days’ worth of work in a single day. “I’m still not sure how we coped with it all,” he added.

The leap in the number of exceptions was certainly a dramatic one for the bank to cope with: Cox elaborated that the team was faced with around 2,000 exceptions rather than the usual 200. He put the success in dealing with this data challenge down to good management and teamwork, but added that this would not be sustainable in the long term.
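The kind of automated tolerance check Cox describes can be sketched roughly as follows. This is a hypothetical illustration, not BNY Mellon’s actual system: the function name, data shapes and the 5% default threshold are assumptions taken from the figures quoted above.

```python
def flag_exceptions(prices_yesterday, prices_today, threshold=0.05):
    """Return securities whose day-on-day price move exceeds `threshold`.

    prices_yesterday / prices_today: {security: price} dictionaries.
    Each flagged item becomes an exception requiring manual review.
    """
    exceptions = []
    for security, old_price in prices_yesterday.items():
        new_price = prices_today.get(security)
        if new_price is None or old_price == 0:
            continue  # no comparable price; skip rather than divide by zero
        move = abs(new_price - old_price) / old_price
        if move > threshold:
            exceptions.append((security, round(move, 4)))
    return exceptions

# In calm markets few securities breach the threshold; when whole venues
# move 5-10%, as in the quarter Cox describes, the exception queue can
# grow tenfold (roughly 200 to 2,000 in his example).
```

The design point is that the threshold trades off false positives against missed pricing errors: set it tight and a volatile quarter buries the team in exceptions, set it loose and genuine data errors slip through to the valuation.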

Mike Jenkins, senior manager at Ernst & Young, also examined the benefits and disadvantages of using fair value accounting in light of the volatile market and the regulatory move towards global standards in this space. He explained that volatility such as that experienced during the crisis could make fair value less reliable than other accounting measures, and that it has also been blamed for contributing to the procyclicality of the market. However, he added that the benefits of the global community all using the same fundamental standards for fair value measurement would include increased comparability between jurisdictions.

Jenkins and Richard Newbury, market development manager at data vendor SIX Telekurs, examined the upcoming regulatory changes and how they could impact the financial services community’s pricing practices. To this end, Newbury highlighted changes to Basel, MiFID, credit rating agency regulation and the regulation of the alternative investment fund industry as relevant to the space.

In the meantime, Cox recommended working closely with data vendors and solution providers to tackle the issues caused by volatile markets and cope with changes in the pricing environment. His team, which handles corporate actions, securities pricing and securities master file data for BNY Mellon’s transfer agency, custody and middle office operations, has to produce daily net asset value (NAV) calculations and work with vendors to get the data ready on time. “We are very dependent on our external data vendors across our asset classes and we have a two-hour window in which to produce the data,” he said.
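The daily NAV calculation that depends on those vendor prices follows a standard formula: NAV per share = (total asset value − liabilities) / shares outstanding. The sketch below is a minimal illustration of that arithmetic; the function, holdings and figures are invented for the example, not drawn from BNY Mellon’s operations.

```python
def nav_per_share(holdings, prices, liabilities, shares_outstanding):
    """Compute NAV per share from vendor-supplied closing prices.

    holdings: {security: quantity held}
    prices:   {security: closing price from the data vendor}
    """
    total_assets = sum(qty * prices[sec] for sec, qty in holdings.items())
    return (total_assets - liabilities) / shares_outstanding

# Illustrative fund: two holdings priced off the vendor feed.
nav = nav_per_share(
    holdings={"ACME": 1_000, "GLOBEX": 500},
    prices={"ACME": 10.0, "GLOBEX": 20.0},
    liabilities=2_000.0,
    shares_outstanding=9_000,
)
# (10,000 + 10,000 - 2,000) / 9,000 = 2.0 per share
```

The formula itself is trivial; the operational difficulty Cox describes lies in getting a clean, exception-checked price for every holding into `prices` within the two-hour window.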

The pressure to manage the time and cost challenges of the market with the same or lower headcount than before the downturn is a struggle for most firms, in the valuations space as in other areas of data management. However, valuations receive a great deal more attention than many other data areas. As evidence of this, Guy Sears, chair of the event and director of the wholesale division of the Investment Management Association (IMA), noted that fair value measurement was cited as one of the top three ongoing concerns for his members.

Cox indicated that the idea behind valuations is to eliminate any need for judgement in the process via a structured approach. After all, it should be the front office making judgement calls about what constitutes a good price, not the back office. “Accuracy drives a firm’s reputation and we also need to be timely with this data; however, there is a degree of subjectivity around fair value measurement that makes this data tricky to handle,” he said.

Illiquid markets mean that firms have to deal with stale prices and react to defaulted assets and liquidations in the fair value process. “Estimate” is not a good word in the pricing and valuations world, said Cox. He elaborated on an example involving Northern Rock and single line asset price adjustments in which BNY Mellon clients each had differing opinions regarding the true value of Northern Rock assets. “How do you get a single fair value based on all these opinions?”

Nigel Reynolds, who is in charge of business development activities, including sales and marketing at TD Waterhouse Corporate Services, added his own example to the mix: “We look after 22 corporate clients for valuations and we can see from their perspective that they have to rely on brokers to determine how prices are presented. For example, if you look at suspended stocks, some choose to value them at zero whereas others value them at the price at which they were suspended. It varies a great deal from firm to firm.”
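The policy divergence Reynolds describes can be made concrete with a minimal sketch. The function and policy names here are hypothetical, invented purely to show how the same suspended holding produces very different marks under each firm’s convention.

```python
def value_suspended(last_traded_price, policy):
    """Value a suspended stock under a firm's chosen policy.

    policy: "zero"        -> write the position down to nothing
            "last_traded" -> hold it at the price at suspension
    """
    if policy == "zero":
        return 0.0
    if policy == "last_traded":
        return last_traded_price
    raise ValueError(f"unknown valuation policy: {policy}")

# The same suspended stock, two firms, two marks:
firm_a_mark = value_suspended(3.25, "zero")         # writes it to 0.0
firm_b_mark = value_suspended(3.25, "last_traded")  # holds it at 3.25
```

Neither choice is objectively “fair”; the divergence is exactly why a single fair value across firms is hard to pin down.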

Some firms have fair value committees in place drawn from across the business, including representatives from risk and compliance, to add an approval process for this sensitive data. However, BNY Mellon has decided against going down this route because it is more suited to the asset management model, said Cox. Fair value measurement necessarily involves a greater degree of control, which means banks working with their vendors and clients to manage this data and expectations, he explained. “Finding the right price is key to the valuation process and that is the basic objective of fair value measurement,” he added.
