The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

Talking Reference Data with Andrew Delaney: Why We Care About the LEI

Maybe, like me, you’ve been slightly bewildered by the comings and goings in the market’s attempts to settle on a plan for a steadfast legal entity identifier (LEI) that everyone can use? Particularly the frenetic pace of sometimes daily changes in direction, as stakeholders switched allegiances, adopted and dropped plans, and interested parties spake truths and half-truths about the woeful consequences (or otherwise) of getting things wrong.

Thought as much.

So, I figured it would be a good time to take a little stock of what has occurred particularly over the past couple of months, which has seen the marketplace swerve violently from a cosy fait accompli solution (involving Swift and DTCC) to the prospect of a highly federated LEI system, to be proposed to G20 next week by the Financial Stability Board (FSB).

To me, the most compelling question to be answered is this: Why do we care? Why is our marketplace getting so hot under the collar about something so basic as an identification system?

For the marketplace in its most generic form – from the investing public through to the financial institutions and utilities that serve them – the reason for a robust and standard LEI is compelling: at least part of the severity of the Credit Crunch is attributable to the fact that investing entities were unable to ascertain precisely whom they were exposed to, because the hierarchy of issuers of certain paper was opaque at best and unintelligible in many cases.

The Lehman Brothers example – cited a few years back in an A-Team white paper here – is a powerful one. Our paper suggested that upwards of 11,000 securities had been issued by Lehman Brothers and were in circulation at the time of its demise. Many, it seems, weren’t obviously identified as Lehman paper. The LEI would preclude a recurrence, or so it’s hoped.

With such a compelling business case, and the stick of Dodd-Frank and its many imitators behind it, the LEI initiative is getting the attention of the institutional marketplace. A new identifier whose adoption is enforced by regulators will necessarily require financial institutions to act. This will mean setting in place systems and processes – a mix of machines and humans – for receiving and handling the new LEI and the reference data associated with it, described by FSB as “name, address, and basic ownership information”.
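By way of illustration only – and assuming the ISO 17442 format that was on the table at the time, namely a 20-character alphanumeric code whose final two characters are ISO 7064 MOD 97-10 check digits – here is a minimal Python sketch of the kind of structural validation an LEI intake process might apply. The function names and the sample base code are our own inventions for the example, not anything prescribed by the FSB:

```python
import re


def _to_digits(s: str) -> str:
    # Map each character to its numeric value: '0'-'9' stay as-is,
    # 'A'=10 ... 'Z'=35 (base-36 interpretation), then concatenate.
    return "".join(str(int(c, 36)) for c in s)


def lei_check_digits(base18: str) -> str:
    """Compute the two ISO 7064 MOD 97-10 check digits for an 18-char base."""
    assert len(base18) == 18
    remainder = int(_to_digits(base18 + "00")) % 97
    return f"{98 - remainder:02d}"


def validate_lei(lei: str) -> bool:
    """A 20-char LEI is valid when its full numeric form mod 97 equals 1."""
    if not re.fullmatch(r"[0-9A-Z]{20}", lei):
        return False
    return int(_to_digits(lei)) % 97 == 1
```

This is the same check-digit scheme used by IBANs, so most firms would already have comparable logic somewhere in their plumbing; the heavier lifting in an LEI implementation lies in mapping the identifier to the associated entity reference data, not in validating the code itself.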

In truth, most firms have most of this equipment and people in place, albeit in use in other areas of the business. So, we’re not expecting a huge capex hit or hiring blitz solely for LEI. That said, recent predictions of total market spending on LEI-related projects – one came in at less than $200 million a year – seem to us to be understated.

For one thing, a not-insignificant portion of firms’ existing infrastructures and people will need to be allocated to the LEI implementation and maintenance task. Speaking at our recent Data Management Summit in London, practitioners expressed some angst at the task in hand, with RBS’s Colin Gibson and Cossiom’s David Berry warning of higher workloads and, perhaps more onerously, market confusion.

What’s clear is that implementing this change won’t be a trivial task.

The other aspect to this, of course, is the cost of the identifier itself and associated reference data. We listened in to the FSB’s announcement of its plans last Friday, at which it stated: “The model must ensure that the system is based on non-profit cost recovery and that there are no monopoly rents accruing to service providers.”

It’s a laudable sentiment, and it demonstrates that FSB has been listening to the marketplace’s fear of a major data cost hit at a time when the rigours of new regulatory compliance are already stretching IT budgets and minds.

But within the FSB’s proposals for a federated LEI solution there lies (or lurks, if you look at it that way) opportunity for commercial benefit. Its design, FSB says, “could give national agencies, perhaps exchanges or data vendors, the role of allocating, registering and storing LEIs” (although it wags its finger thus: “there should be no ‘bundling’ of other services alongside the LEI by providers which forces users to pay directly or indirectly for the LEI”). Who’d countenance such a thing?

This, to our minds, is where recent market sizing estimates understate the financial impact the marketplace is facing.

A month or two ago we did some relatively simple math for a client keen to understand the size of the marketplace for LEI data. To be fair, this wasn’t the kind of ‘bottom-up’ market-mapping exercise we really like to get our teeth into. It was a quick-hit, ‘top-down’ assessment of what the key information providers can realistically expect to see in revenues from LEI-related data. The figure we came up with was an annual run-rate of between $350 million and $450 million on data alone (call us for details).

So, the last part of the answer to the ‘why do we care’ question is surely: how about close to half a billion dollars of new annual revenue potential? It’s enough to prick up most suppliers’ ears.

We think LEI represents an important development in the reference data marketplace. We’ll be watching proceedings with a keen eye, and expect a heightened level of our own LEI activity in the autumn.

Watch this space.
