About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Talking Reference Data with Andrew Delaney: The Humpty Dumpty Factor

Humpty Dumpty famously said:

“When I use a word, it means just what I choose it to mean—neither more nor less.”

I think he was on to something.

As I move within reference data circles – recently in London, Zurich, Frankfurt and New York – I find that the term reference data can be applied to a broad church of data disciplines. Old favourites include end-of-day pricing data, security master files and identifiers, valuations, corporate actions and entity identifiers. At a stretch, we could include calendars, perhaps. Tax information. And on.

But in my travels I’ve recently noticed a propensity among reference data practitioners to get passionate – nay, even hot under the collar – about data types heretofore considered outside of our remit here at Reference Data Review.

Two in particular have piqued my interest, and in line with Humpty’s pronouncement I’m going to add them to my personal list of what I consider to be reference data.

The first is index data.

As you may recall, at FIMA DACH in Frankfurt the other week I stumbled across a panel discussion that indicated a degree of outrage at the licensing fees and fee structures of some of the major providers of index calculations and the underlying data they need to function. Indeed, the panel alerted us to the fact that the EU is embarking on an inquiry into market practices of index providers, a study that kicked off last month.

What’s clear is that in these straitened times, scrutiny of data costs is extending into new areas. The result is that index data – for some reason held until now in greater reverence than boring old market data, for example – is being put through the cost and consolidation wringer, just like everything else. It will be interesting to watch how that pans out, and you’ll be able to read about it here.

The second is benchmark pricing, and by extension the whole of the contributed data segment.

The Libor scandal has truly opened a can of worms. Not only are the culprits being castigated for lying in their Libor contributions, but the issue has forced banks to ask some hard questions: where are we sourcing our contributed data, how is it being distributed, and to whom and for what purpose?

I’ll be looking at this in more depth in the coming weeks. But what started as a set of advertisements – bank currency rates on Reuters FXFX, for example – is now a much more serious proposition.

The consumer has changed, from a bank or asset manager counterparty making a call to the contributing bank to get a current market rate, to a machine that may take the rate and pump out a bunch of trades based on its value. And yet the process for sourcing that rate may not have changed in 30 years. Are banks aware of what rates they are posting to Thomson Reuters, Bloomberg and the rest? Do they know what systems – or spreadsheets, or people – came up with the rates? Can they stand by those rates?

We’ll be delving into this as the market wrestles with how it’s going to calculate Libor (and the other bors) going forward.
