A-Team Group Webinar Offers Insight into Improving Data Contributions

Data contributions to financial benchmarks have been a cause for concern since the Libor scandal emerged in 2012, yet despite heavy fines imposed on banks that manipulated the rate, only now are efforts to improve the situation being made in earnest.

Addressing the issues of data contributions to the market, an A-Team Group webinar entitled ‘Bracing for the Wave – or Sailing Ahead of It? – Reducing Risk Through Benchmark Data Controls’ considered the knock-on effects of the Libor scandal and how financial firms can guard against the kind of deviant behaviour that caused it. Andrew Delaney, A-Team Group editor-in-chief, moderated the webinar and set the scene, questioning the extent of the Libor scandal and asking how firms can ready themselves for increasing regulatory scrutiny of data contributions made to benchmarks and indices.

Professor Michael Mainelli, executive chairman of think tank Z/Yen Group, described the run-up to the Libor scandal, noting the establishment of the benchmark in 1986 and recognition that it was being manipulated as early as 2005. He said: “A number of banks were colluding to fix the Libor rate. Authorities were informed, but from 2005 to 2009 they did nothing. In 2009, the SEC said it would investigate the issue, but from 2009 to 2012, UK authorities continued to do nothing. In 2012, the situation became political and something had to be done. By this time, over $3 trillion worth of financial products were tied to Libor. Banks that had manipulated Libor started to be fined and fines continue to be imposed as we are still unpicking the scandal.”

Noting the fundamental role of Libor in financial markets, yet a supine regulatory approach to Libor and other benchmarks that is only now beginning to change, Mainelli outlined one way in which banks can look at trades and discover any that might be suspicious. He promoted the use of automated surveillance based on statistical learning theory as a means of identifying deviant behaviour and suggested compliance should run statistical tests at all times to see what is happening on the trading floor.
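
As a concrete illustration of the statistical testing Mainelli advocates, the sketch below flags benchmark submissions that sit far from the panel consensus using a robust z-score. The data, threshold and function names are hypothetical assumptions for illustration, not taken from any vendor or webinar material.

```python
# Illustrative sketch: flag benchmark submissions that deviate from the
# panel consensus, in the spirit of the statistical surveillance Mainelli
# describes. All data and thresholds here are hypothetical.
import statistics

def flag_deviant_submissions(panel, threshold=3.0):
    """panel: dict mapping bank name -> submitted rate for one fixing.
    Returns banks whose submission sits more than `threshold` robust
    z-scores from the panel median."""
    rates = list(panel.values())
    median = statistics.median(rates)
    # Median absolute deviation is more robust to a few colluding
    # outliers than the standard deviation.
    mad = statistics.median(abs(r - median) for r in rates) or 1e-9
    flagged = {}
    for bank, rate in panel.items():
        z = 0.6745 * (rate - median) / mad  # scale MAD to ~std-normal units
        if abs(z) > threshold:
            flagged[bank] = round(z, 2)
    return flagged

# Example fixing: one bank submits well below the consensus.
print(flag_deviant_submissions({
    "Bank A": 2.41, "Bank B": 2.43, "Bank C": 2.42,
    "Bank D": 2.40, "Bank E": 1.95,  # suspiciously low
}))
```

Run continuously over each day’s fixings, a test of this kind is one way compliance could, as Mainelli suggests, watch what is happening on the trading floor at all times rather than sampling recordings after the fact.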

On the Libor scandal, he concluded: “Banks want to move on from the scandal, but they can’t as litigation is only just warming up. This one will run and run.”

With the scale of the Libor scandal and its aftermath set out, Delaney turned to the practicalities of avoiding deviant behaviour on the trading floor. Representing solution suppliers, Robert Simpson, vice president of Verint’s global Financial Compliance Practice, and Tim Furmidge, head of product management for BT’s Financial Technology Services, proposed a number of options that can support the capture, processing and analysis of data to discover deviance, particularly of unstructured data such as voice and chat, which can be difficult to manage.

Simpson described a surveillance platform including speech analytics that can be built using existing technologies and provide data capture, processing, analysis and decision-making functionality for both structured and unstructured data. He suggested this type of platform can improve on the typical practice of compliance officers listening to voice recordings of data contributors, but cautioned that for a platform to be effective, methodologies need to be reviewed every quarter and measures put in place to prevent people from exercising inappropriate influence over benchmark submissions. He also noted the requirement to retain records of benchmark submissions, and the information used to make them, for five years, and the need to provide daily and quarterly reports covering methodologies and how any quantitative and qualitative criteria were used.
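
A minimal sketch of the kind of automated check such a platform might run is shown below: scanning transcribed calls for phrases that suggest attempts to influence a submission, and checking records against the five-year retention requirement Simpson mentions. The phrase list, record format and field names are assumptions for illustration only.

```python
# Minimal sketch, assuming calls have already been transcribed to text.
# Phrase list, record shape and retention arithmetic are illustrative.
from datetime import datetime, timedelta

WATCH_PHRASES = ["move the fix", "nudge the rate", "do me a favour on libor"]
RETENTION = timedelta(days=5 * 365)  # five-year record retention

def review_call(record, now=None):
    """record: dict with 'timestamp' (datetime) and 'transcript' (str).
    Returns watch-phrase hits and whether the record has passed retention."""
    now = now or datetime.now()
    hits = [p for p in WATCH_PHRASES if p in record["transcript"].lower()]
    expired = now - record["timestamp"] > RETENTION
    return {"phrase_hits": hits, "outside_retention": expired}

print(review_call({
    "timestamp": datetime(2013, 6, 1),
    "transcript": "Could you nudge the rate a touch before the fix?",
}, now=datetime(2014, 1, 1)))
```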

Configuring a voice recording and speech analytics platform in this way enables a firm to demonstrate that all communications are recorded, cut the time needed to find relevant data, reduce headcount engaged in surveillance, and spend more time analysing and less time searching data. Simpson commented: “By using technology that is available now, market practitioners can reduce operational and reputational risk. By investing in technology, they will see added value from call recording.”

Concurring with Simpson’s view of the benefits of automated behavioural surveillance, BT’s Furmidge described how the underlying technology is evolving. Firms are moving on from a tactical approach, in which silos of data – instant messages and email, fixed voice, mobile voice and trades – are managed to comply with specific regulations, towards a more coherent approach that captures, archives, retrieves and analyses all data to achieve compliance with multiple regulations.

He said: “The trend is towards a more coherent and common approach in which all channels of data are dealt with in a similar way. This makes it easier, for example, to recreate a trade as required by regulation. We are also seeing the need for a coherent approach to monitoring across countries, for example to comply with Dodd-Frank rules covering swaps trading. As we move forward with multiple capture engines at the point of entry, a more coherent archive and a common retrieve and analysis environment across all channels, it will be possible not only to capture data for regulatory purposes, but also to mine it to spot market opportunities.”
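
To make the coherent-archive idea concrete, the sketch below normalises events from separate channels into a single record shape so that one retrieval layer can, for example, reconstruct a trade across channels. The channel names and fields are illustrative assumptions, not BT’s actual design.

```python
# Sketch of a common record shape across capture channels, so a single
# retrieve/analysis layer can serve the 'recreate a trade' use case.
from dataclasses import dataclass, field

@dataclass
class CommsEvent:
    channel: str      # e.g. "chat", "email", "fixed_voice", "mobile_voice", "trade"
    trader_id: str
    timestamp: str    # ISO 8601, which sorts correctly as a string
    payload: str      # message text, call transcript, or trade details
    tags: list = field(default_factory=list)

def reconstruct_trade(archive, trader_id, start, end):
    """Pull every channel's events for one trader over a time window,
    ordered chronologically -- the cross-channel reconstruction use case."""
    return sorted(
        (e for e in archive
         if e.trader_id == trader_id and start <= e.timestamp <= end),
        key=lambda e: e.timestamp,
    )

archive = [
    CommsEvent("chat", "t42", "2014-03-03T09:15:00Z", "can you check the fix?"),
    CommsEvent("trade", "t42", "2014-03-03T09:16:30Z", "SELL 10M GBP 3M FRA"),
]
print(reconstruct_trade(archive, "t42",
                        "2014-03-03T09:00:00Z", "2014-03-03T10:00:00Z"))
```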

If this is the end game, Furmidge proposed that firms start the route to complete and coherent surveillance with a practical trial of, perhaps, voice analytics. This would include everything from a discovery session to identify the spoken words a firm wants to find, through iterative improvements that drive up the accuracy of word recognition, to a final review that considers whether the trial has bettered manual methods of surveillance and met expectations.
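
One way such a trial might be scored, purely as an illustration: compare the calls the analytics engine flags in a hand-labelled sample against the labels, and track precision and recall across tuning iterations. The data and helper function below are hypothetical.

```python
# Illustrative scoring for the iterative trial: how many flagged calls
# were truly relevant (precision), and how many relevant calls were
# found (recall), per tuning pass.
def precision_recall(detected, labelled):
    detected, labelled = set(detected), set(labelled)
    tp = len(detected & labelled)  # true positives
    precision = tp / len(detected) if detected else 0.0
    recall = tp / len(labelled) if labelled else 0.0
    return precision, recall

# Hypothetical results from two tuning iterations on the same call sample.
labelled = {"call1", "call4", "call7"}  # calls that truly contain target words
iterations = [{"call1", "call2"}, {"call1", "call4", "call7", "call9"}]
for i, detected in enumerate(iterations, 1):
    p, r = precision_recall(detected, labelled)
    print(f"iteration {i}: precision={p:.2f} recall={r:.2f}")
```

Rising numbers across passes would be the evidence, in Furmidge’s final-review step, that the trial has bettered manual surveillance.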

While technology can provide solutions to problems such as the manipulation of Libor, ownership of the problems remains a critical yet outstanding issue. Mainelli said: “No-one seems to own the problems. At the moment, they are mostly a legal issue, but firms as a whole need to buck up. Some have started to withdraw from providing pricing to the market, but that is not good. The need is for firms to push on with doing better; they need to get a grip on how to manage indices or risk losing them.”

Similarly, the webinar participants agreed that shutting down communication channels such as chat is no more than a knee-jerk reaction to the problem of poor data contributions being made to the market. As Furmidge concluded: “We need to understand the art of the possible and we need more cooperation among regulators, companies, IT teams and suppliers to deliver complete and effective surveillance solutions to the market.”
