A-Team Group Webinar Offers Insight into Improving Data Contributions

Data contributions to financial benchmarks have been a cause for concern since the Libor scandal emerged in 2012, yet despite heavy fines imposed on banks that manipulated the rate, only now are efforts to improve the situation being made in earnest.

Addressing the issues of data contributions to the market, an A-Team Group webinar entitled ‘Bracing for the Wave – or Sailing Ahead of It? – Reducing Risk Through Benchmark Data Controls’ considered the knock-on effects of the Libor scandal and how financial firms can guard against the kind of deviant behaviour that caused it. Andrew Delaney, A-Team Group editor-in-chief, moderated the webinar and set the scene, questioning the extent of the Libor scandal and how firms can ready themselves for increasing regulatory scrutiny of data contributions made to benchmarks and indices.

Professor Michael Mainelli, executive chairman of think tank Z/Yen Group, described the run-up to the Libor scandal, noting the establishment of the benchmark in 1986 and recognition that it was being manipulated as early as 2005. He said: “A number of banks were colluding to fix the Libor rate. Authorities were informed, but from 2005 to 2009 they did nothing. In 2009, the SEC said it would investigate the issue, but from 2009 to 2012, UK authorities continued to do nothing. In 2012, the situation became political and something had to be done. By this time, over $3 trillion worth of financial products were tied to Libor. Banks that had manipulated Libor started to be fined and fines continue to be imposed as we are still unpicking the scandal.”

Noting the fundamental role of Libor in financial markets, and a supine regulatory approach to Libor and other benchmarks that is only now beginning to change, Mainelli outlined one way in which banks can examine trades and identify any that might be suspicious. He promoted automated surveillance based on statistical learning theory as a means of identifying deviant behaviour and suggested compliance teams should run statistical tests continuously to see what is happening on the trading floor.
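
By way of illustration only, the sketch below shows the kind of simple statistical test a compliance team could run over a panel’s daily submissions: it flags any contributor whose rate sits far from the panel consensus, using a robust z-score. The data layout, threshold and contributor names are assumptions made for the example, not Mainelli’s methodology or any benchmark administrator’s actual controls.

```python
# Illustrative sketch only: flag benchmark submissions that deviate sharply
# from the panel consensus using a robust z-score (median / MAD).
# Thresholds, data layout and names are assumptions, not a real panel's method.
from statistics import median

def flag_suspicious_submissions(submissions: dict[str, float], threshold: float = 3.0) -> list[str]:
    """Return contributors whose submission lies far from the panel median."""
    values = list(submissions.values())
    centre = median(values)
    # Median absolute deviation as a robust estimate of spread
    mad = median(abs(v - centre) for v in values) or 1e-9
    return [
        bank
        for bank, rate in submissions.items()
        if abs(rate - centre) / (1.4826 * mad) > threshold
    ]

# Example: one contributor submits a rate well away from its peers
panel = {"Bank A": 0.512, "Bank B": 0.515, "Bank C": 0.510, "Bank D": 0.545}
print(flag_suspicious_submissions(panel))  # ['Bank D']
```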

On the Libor scandal he concluded: “Banks want to move on from the scandal, but they can’t as litigation is only just warming up. This one will run and run.”

With the scale of the Libor scandal and its aftermath set out, Delaney turned to the practicalities of avoiding deviant behaviour on the trading floor. Representing solution suppliers, Robert Simpson, vice president of Verint’s global Financial Compliance Practice, and Tim Furmidge, head of product management for BT’s Financial Technology Services, proposed a number of options that can support the capture, processing and analysis of data to discover deviance, particularly unstructured data such as voice and chat, which can be difficult to manage.

Simpson described a surveillance platform, including speech analytics, that can be built using existing technologies and provides data capture, processing, analysis and decision-making functionality for both structured and unstructured data. He suggested this type of platform can improve on the typical practice of compliance officers listening to voice recordings of data contributors, but cautioned that for a platform to be effective, methodologies need to be reviewed every quarter and measures put in place to prevent people from exercising inappropriate influence over benchmark submissions. He also noted the requirement to retain records of benchmark submissions, and the information used to make them, for five years, and the need to provide daily and quarterly reports covering methodologies and how any quantitative and qualitative criteria were used.
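
As a hedged sketch of two of the controls Simpson mentioned, the example below stamps each submission record with a five-year retention date and flags call transcripts containing phrases that might warrant review. The watch phrases, record fields and review logic are illustrative assumptions rather than features of any vendor platform.

```python
# Hypothetical sketch: five-year retention of submission records and a simple
# phrase check over speech-to-text transcripts of contributor calls.
# Watch phrases and record fields are assumptions for illustration.
from dataclasses import dataclass
from datetime import date, timedelta

WATCH_PHRASES = ("move the fix", "nudge the rate", "keep this between us")

@dataclass
class SubmissionRecord:
    contributor: str
    submitted_on: date
    rate: float
    call_transcript: str

    @property
    def retain_until(self) -> date:
        # Keep the record and supporting information for five years
        return self.submitted_on + timedelta(days=5 * 365)

    def needs_review(self) -> bool:
        text = self.call_transcript.lower()
        return any(phrase in text for phrase in WATCH_PHRASES)

record = SubmissionRecord("Bank A", date(2014, 6, 2), 0.512,
                          "Could you nudge the rate a touch higher today?")
print(record.needs_review(), record.retain_until)  # True 2019-06-01
```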

The outcomes of configuring a voice recording and speech analytics platform in this way include the ability to demonstrate that all communications are recorded, a reduction in the time needed to find relevant data, a smaller headcount engaged in surveillance, and more time spent analysing data rather than searching for it. Simpson commented: “By using technology that is available now, market practitioners can reduce operational and reputational risk. By investing in technology, they will see added value from call recording.”

Concurring with Simpson’s view of the benefits of automated surveillance of behaviour, BT’s Furmidge described how the underlying technology is evolving as firms move on from a tactical approach, in which silos of data such as instant messages and email, fixed voice, mobile voice and trades are managed to comply with specific regulations, to a more coherent approach that captures, archives, retrieves and analyses all data to achieve compliance with multiple regulations.

He said: “The trend is towards a more coherent and common approach in which all channels of data are dealt with in a similar way. This makes it easier, for example, to recreate a trade as required by regulation. We are also seeing the need for a coherent approach to monitoring across countries, for example to comply with Dodd-Frank rules covering swaps trading. As we move forward with multiple capture engines at the point of entry, a more coherent archive and a common retrieve and analysis environment across all channels, it will be possible not only to capture data for regulatory purposes, but also to mine it to spot market opportunities.”
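
A minimal sketch of the coherent model Furmidge describes might look like the following: separate capture engines per channel feed a single archive with a common retrieval interface that can, for instance, pull every channel’s records around a trade. The channel names, fields and search API are assumptions made for illustration, not a description of BT’s services.

```python
# Minimal sketch of a common archive fed by per-channel capture, with one
# retrieval path across channels. Fields and API are illustrative assumptions.
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class ArchivedItem:
    channel: str        # e.g. "email", "chat", "fixed_voice", "mobile_voice", "trade"
    trader: str
    timestamp: datetime
    content: str        # transcript, message body or trade summary

@dataclass
class SurveillanceArchive:
    items: list[ArchivedItem] = field(default_factory=list)

    def capture(self, item: ArchivedItem) -> None:
        self.items.append(item)

    def reconstruct(self, trader: str, start: datetime, end: datetime) -> list[ArchivedItem]:
        """Pull every channel's records for one trader over a window,
        for example to recreate the context around a trade."""
        return sorted(
            (i for i in self.items if i.trader == trader and start <= i.timestamp <= end),
            key=lambda i: i.timestamp,
        )

archive = SurveillanceArchive()
archive.capture(ArchivedItem("chat", "trader1", datetime(2014, 6, 2, 9, 5), "Swap looks rich"))
archive.capture(ArchivedItem("trade", "trader1", datetime(2014, 6, 2, 9, 7), "Bought 10y IRS"))
print(archive.reconstruct("trader1", datetime(2014, 6, 2, 9, 0), datetime(2014, 6, 2, 10, 0)))
```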

If this is the end game, Furmidge proposes that firms start the route to complete and coherent surveillance with a practical trial of, perhaps, voice analytics. This would include everything from a discovery session to identify spoken words a firm wants to find, to iterative improvements that drive up the accuracy of word recognition, and a final review that considers whether the trial has bettered manual methods of surveillance and met expectations.
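
One metric such a trial could iterate on is word error rate, comparing the analytics engine’s transcript against a manually verified reference; the sketch below shows a basic calculation. The example phrases, and any acceptance threshold a firm might set, are assumptions, as the webinar did not specify how accuracy would be measured.

```python
# Hedged sketch of an accuracy check a voice-analytics trial might iterate on:
# word error rate of an automatic transcript against a verified reference.
def word_error_rate(reference: str, hypothesis: str) -> float:
    ref, hyp = reference.lower().split(), hypothesis.lower().split()
    # Levenshtein distance over words (substitutions, insertions, deletions)
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1, d[i][j - 1] + 1, d[i - 1][j - 1] + cost)
    return d[len(ref)][len(hyp)] / max(len(ref), 1)

wer = word_error_rate("please move the libor fix higher", "please move the labour fix higher")
print(f"{wer:.2f}")  # 0.17 -- one substitution out of six words
```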

While technology can provide solutions to problems such as the manipulation of Libor, ownership of the problems remains a critical yet outstanding issue. Mainelli said: “No-one seems to own the problems. At the moment, they are mostly a legal issue, but firms as a whole need to buck up. Some have started to withdraw from providing pricing to the market, but that is not good. The need is for firms to push on with doing better; they need to get a grip on how to manage indices or risk losing them.”

Similarly, the webinar participants agreed that shutting down communication channels such as chat is no more than a knee-jerk reaction to the problem of poor data contributions being made to the market. As Furmidge concluded: “We need to understand the art of the possible and we need more cooperation among regulators, companies, IT teams and suppliers to deliver complete and effective surveillance solutions to the market.”
