About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Fair Value May be a Tricky Proposition But it is Worth the Trouble, Agree Speakers at SIX Telekurs Event


Accounting standards have come under a lot of scrutiny of late, from both the industry and regulators, but despite its flaws, fair value measurement was created for good reason, agreed the panel at SIX Telekurs’ recent “What Price Fair Value?” event in London. However, market volatility has made pricing and fair value measurement a tricky proposition, which is why firms need to work closely with their vendors, said Matthew Cox, head of securities data management in Europe for BNY Mellon Asset Servicing.

“In the fourth quarter of last year the volumes of exceptions within our system went through the roof because exchanges and other trading venues experienced movements of more than 5%, even as high as 10% in some cases,” said Cox, explaining the challenges of portfolio valuations in the current market environment. This volatility triggered the automatic checks and controls around pricing and forced Cox’s team to deal with five days’ worth of work in a single day. “I’m still not sure how we coped with it all,” he added.

The leap in the number of exceptions was certainly a dramatic one for the bank to cope with: Cox elaborated that the team was faced with around 2,000 exceptions rather than the usual 200. He put the success in dealing with this data challenge down to good management and teamwork, but added that this would not be sustainable in the long term.
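The automatic checks Cox describes amount to a day-on-day price-tolerance test. A minimal sketch of that kind of control might look as follows; the 5% threshold comes from the article, while the function name, data shapes and handling of missing prices are illustrative assumptions, not a description of BNY Mellon’s actual system:

```python
def flag_exceptions(prices_today, prices_yesterday, tolerance=0.05):
    """Return IDs of securities whose day-on-day price move exceeds the tolerance.

    prices_today / prices_yesterday: dicts mapping security ID -> price.
    Securities with no usable comparison price are also flagged for review.
    """
    exceptions = []
    for sec_id, today in prices_today.items():
        yesterday = prices_yesterday.get(sec_id)
        if yesterday is None or yesterday == 0:
            # No prior price to compare against: route to manual review
            exceptions.append(sec_id)
            continue
        move = abs(today - yesterday) / yesterday
        if move > tolerance:
            exceptions.append(sec_id)
    return exceptions
```

With a 5% tolerance, a jump from 100 to 106 is flagged while a move to 101 passes silently; in a volatile quarter, lowering the threshold or widening the universe of checked securities is exactly what multiplies 200 daily exceptions into 2,000.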

Mike Jenkins, senior manager at Ernst & Young, also examined the benefits and disadvantages of using fair value accounting in light of the volatile market and the regulatory move towards global standards in this space. He explained that volatility such as that experienced during the crisis could make fair value less reliable than other accounting measures, and that it has also been blamed for contributing to the procyclicality of the market. However, he added that the benefit of the global community all using the same fundamental standards for fair value measurement would be increased comparability between different jurisdictions.

Jenkins and Richard Newbury, market development manager at data vendor SIX Telekurs, examined the upcoming regulatory changes and how they could impact the financial services community’s pricing practices. To this end, Newbury highlighted changes to Basel, MiFID, credit rating agency regulation and the regulation of the alternative investment fund industry as relevant to the space.

In the meantime, Cox recommended working closely with data vendors and solution providers to tackle issues arising from volatile markets and cope with changes in the pricing environment. His team, which is focused on dealing with corporate actions, securities pricing and securities master file data for BNY Mellon’s transfer agency, custody and middle office operations, has to produce daily net asset value (NAV) calculations and work with vendors to get the data ready on time. “We are very dependent on our external data vendors across our asset classes and we have a two hour window in which to produce the data,” he said.
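The daily NAV production Cox mentions follows a standard formula: total asset value, less liabilities, divided by shares outstanding. A minimal sketch, with hypothetical inputs and names chosen purely for illustration:

```python
def nav_per_share(holdings, liabilities, shares_outstanding):
    """Compute net asset value per share.

    holdings: list of (quantity, price) pairs, one per position.
    liabilities: total fund liabilities in the same currency as prices.
    shares_outstanding: number of fund shares in issue.
    """
    total_assets = sum(qty * price for qty, price in holdings)
    return (total_assets - liabilities) / shares_outstanding
```

The arithmetic is trivial; the operational difficulty the article points to lies entirely in sourcing accurate, timely prices for every position within the two-hour window, which is why the quality of vendor price feeds dominates the process.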

The pressure to manage the time and cost challenges of the market with the same or lower headcount than before the downturn is a struggle for most firms, in valuations as in other areas of data management. However, a great deal more attention is being paid to valuations than to many other data areas. Underlining the point, Guy Sears, who chaired the event and is director of the wholesale division of the Investment Management Association (IMA), indicated that fair value measurement was cited as one of the top three ongoing concerns for his members.

Cox indicated that the idea behind valuations is to eliminate any need for judgement in the process via a structured approach. After all, it should be the front office making judgement calls around what constitutes a good price and not the back office. “Accuracy drives a firm’s reputation and we also need to be timely with this data, however there is a degree of subjectivity around fair value measurement that makes this data tricky to handle,” he said.

Illiquid markets mean that firms have to deal with stale prices and react to defaulted assets and liquidations in the fair value process. “Estimate” is not a good word in the pricing and valuations world, said Cox. He elaborated on an example involving Northern Rock and single line asset price adjustments in which BNY Mellon clients each had differing opinions regarding the true value of Northern Rock assets. “How do you get a single fair value based on all these opinions?”

Nigel Reynolds, who is in charge of business development activities, including sales and marketing at TD Waterhouse Corporate Services, added his own example to the mix: “We look after 22 corporate clients for valuations and we can see from their perspective that they have to rely on brokers to determine how prices are presented. For example, if you look at suspended stocks, some choose to value them at zero whereas others value them at the price at which they were suspended. It varies a great deal from firm to firm.”

Some firms have fair value committees in place drawn from across the business, including representatives from risk and compliance, in order to add an approval process for this sensitive data. However, BNY Mellon has decided against going down this route because it is more suited to the asset management model, said Cox. Fair value measurement necessarily requires a greater degree of control, which means banks must work with their vendors and clients to manage this data and expectations, he explained. “Finding the right price is key to the valuation process and that is the basic objective of fair value measurement,” he added.

