Talking Reference Data with Andrew Delaney: Feeling the Collars of the Libor Traders


Financial institutions are bracing themselves for a further regulatory clampdown on market pricing in the wake of the LIBOR and FX market manipulation scandals. As regulators draft new guidelines for benchmark data contributions, and UK Chancellor George Osborne signals increased scrutiny of the London FX market’s pricing procedures, data managers and compliance officers are assessing their processes to institute best practice – and guard against the risk of further financial penalties.

So far the price has been high, with recent fines relating to LIBOR alone exceeding $7 billion and the Financial Times predicting an eventual total of as much as $22 billion!

In response to the ensuing public outrage, regulators are starting to act. Several have prescribed new measures to guard against a repetition, and many institutions are assessing what they should be doing to ensure their contributions are accurate, timely and honest.

But what is best practice? And what can firms do to secure control over the prices and rates their traders – and increasingly, pricing engines and chat lines – distribute to the marketplace?

As you know, these questions have been vexing me for quite some time, not least because we’ve been noticing a number of prospective industry solutions emerging that use monitoring technologies to keep tabs on the proprietary pricing data that makes its way to the markets.

We’ve investigated the possibilities in a number of ways, most recently in preparing for next week’s webinar on the topic: ‘Bracing for the Wave – or Sailing Ahead of It? Reducing Risk Through Benchmark Data Controls’. Featuring speakers from Verint, BT and Z/Yen, the webinar looks at how firms can do more to monitor their traders’ voice, data and chat activities, and institute controls to ensure compliance with internal policy and new regulatory guidelines. Our panel of experts will discuss approaches to instituting robust and consistent monitoring across all pricing channels, and offer practical advice on technology solutions and organisational methodologies to ensure best practice.

It’s a topic I plan to explore further at our Data Management and Intelligent Trading Summits in London and New York later in the year. In the meantime, be sure to join us at 3pm London time (10am New York) next Wednesday, July 2. You can register free of charge here.

Relatedly, we have just completed a white paper with infrastructure monitoring specialist ITRS that looks more broadly at firms’ data contributions to industry benchmarks, including indices, and to fragmented price-driven markets like Swap Execution Facilities (SEFs).

Changing market dynamics are placing new emphasis on market-makers and other generators of proprietary contributed market pricing to ensure the integrity of their data. Structural shifts, like the introduction of exchange-like SEFs, are exacerbating the pressure on financial institutions to get their pricing systems in good shape.

Concerns about data integrity affect a wide range of asset classes, including over-the-counter markets – like fixed income, foreign exchange and derivatives – and so-called structured instruments listed on exchanges – like exchange-traded funds (ETFs), warrants, turbo-warrants, certificates and indices.

Meanwhile, as firms seek to monetise the data they generate from their business activities, new demands for quality control are driving a renewed look at data integrity. With data sales emerging as a significant element of the overall revenue picture, financial institutions are striving to adopt best practices in response to customer demand for higher-quality data services.

Our new paper – ‘Ensuring Integrity of Proprietary & Derived Market Information’ – discusses the obstacles to establishing robust pricing processes, and examines the market dynamics driving efforts towards improved data integrity and where best practices are evolving. You can download it free of charge here.


Related content

WEBINAR

Recorded Webinar: How to simplify and modernize data architecture to unleash data value and innovation

The data needs of financial institutions are growing at pace as new formats and greater volumes of information are integrated into their systems. With this has come greater complexity in managing and governing that data, amplifying pain points along data pipelines. In response, innovative new streamlined and flexible architectures have emerged that can absorb and...

BLOG

ESMA Final Report Recommends EU Transition to T+1 Settlement Cycle by October 2027

The European Securities and Markets Authority (ESMA), the EU’s financial markets regulator, has published its Final Report assessing the implications of transitioning to a T+1 settlement cycle within the European Union (EU). The proposed move is aimed at improving settlement efficiency, reducing risks, and aligning EU practices with other major jurisdictions globally. The report highlights...

EVENT

Data Management Summit New York City

Now in its 15th year, the Data Management Summit NYC brings together the North American data management community to explore how data strategy is evolving to drive business outcomes and speed to market in changing times.

GUIDE

AI in Capital Markets: Practical Insight for a Transforming Industry – Free Handbook

AI is no longer on the horizon – it’s embedded in the infrastructure of modern capital markets. But separating real impact from inflated promises requires a grounded, practical understanding. The AI in Capital Markets Handbook 2025 provides exactly that. Designed for data-driven professionals across the trade life-cycle, compliance, infrastructure, and strategy, this handbook goes beyond...