

Talking Reference Data with Andrew Delaney: Why We Care About the LEI


Maybe, like me, you’ve been slightly bewildered by the comings and goings in the market’s attempts to come up with a plan for a steadfast legal entity identifier (LEI) that everyone can use? Particularly the frenetic pace of sometimes daily changes in direction, as stakeholders switched allegiances, adopted and dropped plans, and interested parties spake truths and half-truths about the woeful consequences (or otherwise) of getting things wrong.

Thought as much.

So, I figured it would be a good time to take a little stock of what has occurred particularly over the past couple of months, which has seen the marketplace swerve violently from a cosy fait accompli solution (involving Swift and DTCC) to the prospect of a highly federated LEI system, to be proposed to G20 next week by the Financial Stability Board (FSB).

To me, the most compelling question to be answered is this: Why do we care? Why is our marketplace getting so hot under the collar about something so basic as an identification system?

For the marketplace in its most generic form – from the investing public through to the financial institutions and utilities that serve them – the reason for a robust and standard LEI is compelling: at least part of the severity of the Credit Crunch is attributable to the fact that investing entities were unable to ascertain precisely who they were exposed to, because the hierarchy of issuers of certain paper was opaque at best and unintelligible in many cases.

The Lehman Brothers example – cited a few years back in an A-Team white paper here – is a powerful one. Our paper suggested that upwards of 11,000 securities had been issued by Lehman Brothers and were in circulation at the time of its demise. Many, it seems, weren’t obviously identified as Lehman paper. The LEI would preclude a recurrence, or so it’s hoped.

With such a compelling business case, and the stick of Dodd-Frank and its many imitators behind it, the LEI initiative is getting the attention of the institutional marketplace. A new identifier whose adoption is enforced by regulators will necessarily require financial institutions to act. This will mean setting in place systems and processes – a mix of machines and humans – for receiving and handling the new LEI and the reference data associated with it, described by FSB as “name, address, and basic ownership information”.
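
By way of illustration only – and not a description of any actual system – here is a minimal sketch of what receiving and handling such a record might look like. The FSB’s description gives us name, address and basic ownership information; every field name, entity name, identifier and amount below is invented for the example.

# Minimal, hypothetical sketch: an LEI reference data record carrying the
# FSB-described fields (name, address, basic ownership), plus a trivial
# roll-up of exposures to an ultimate parent entity. All identifiers,
# names and amounts are invented for illustration.
from dataclasses import dataclass
from typing import Optional

@dataclass
class LeiRecord:
    lei: str                   # legal entity identifier (real LEIs are 20 characters; these are dummies)
    name: str                  # registered legal name
    address: str               # registered address
    parent_lei: Optional[str]  # basic ownership information: direct parent, if any

def ultimate_parent(lei: str, registry: dict) -> str:
    """Walk the ownership chain until an entity with no recorded parent is reached."""
    record = registry[lei]
    while record.parent_lei is not None and record.parent_lei in registry:
        record = registry[record.parent_lei]
    return record.lei

# Hypothetical registry and holdings keyed by issuer LEI.
registry = {
    "LEI-PARENT-001": LeiRecord("LEI-PARENT-001", "Example Holdings Inc", "New York", None),
    "LEI-SUB-002": LeiRecord("LEI-SUB-002", "Example Finance BV", "Amsterdam", "LEI-PARENT-001"),
    "LEI-SUB-003": LeiRecord("LEI-SUB-003", "Example Capital Ltd", "London", "LEI-PARENT-001"),
}
holdings = {"LEI-SUB-002": 25_000_000, "LEI-SUB-003": 40_000_000}

# Aggregate exposure by ultimate parent – the question firms struggled to
# answer quickly for Lehman paper in 2008.
exposure = {}
for issuer_lei, amount in holdings.items():
    top = ultimate_parent(issuer_lei, registry)
    exposure[top] = exposure.get(top, 0) + amount

print(exposure)  # {'LEI-PARENT-001': 65000000}

The point of the sketch is simply this: once every instrument carries an issuer LEI and the ownership data behind it is reliable, rolling exposure up to an ultimate parent becomes a routine query rather than a forensic exercise.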

In truth, most firms have most of this equipment and people in place, albeit in use in other areas of the business. So, we’re not expecting a huge capex hit or hiring blitz solely for LEI. That said, recent predictions of total market spending on LEI-related projects – one came in at less than $200 million a year – seem to us to be understated.

For one thing, a not-insignificant portion of firms’ existing infrastructures and people will need to be allocated to the LEI implementation and maintenance task. Speaking at our recent Data Management Summit in London, practitioners expressed some angst at the task in hand, with RBS’s Colin Gibson and Cossiom’s David Berry warning of higher workloads and, perhaps more onerously, market confusion.

What’s clear is that implementing this change won’t be a trivial task.

The other aspect to this, of course, is the cost of the identifier itself and associated reference data. We listened in to the FSB’s announcement of its plans last Friday, at which it stated: “The model must ensure that the system is based on non-profit cost recovery and that there are no monopoly rents accruing to service providers.”

It’s a laudable sentiment and it demonstrates that FSB has been listening to the marketplace in terms of the fear of a major data cost hit at a time when the rigours of new regulatory compliance are already stretching IT budgets and minds.

But within the FSB’s proposals for a federated LEI solution there lies (or lurks, if you look at it that way) opportunity for commercial benefit. Its design, FSB says, “could give national agencies, perhaps exchanges or data vendors, the role of allocating, registering and storing LEIs” (although it wags its finger thus: “there should be no ‘bundling’ of other services alongside the LEI by providers which forces users to pay directly or indirectly for the LEI”). Who’d countenance such a thing?

This, to our minds, is where recent market sizing estimates understate the financial impact the marketplace is facing.

A month or two ago we did some relatively simple math for a client keen to understand the size of the marketplace for LEI data. To be fair, this wasn’t the kind of ‘bottom-up’, map-of-the-market exercise we really like to get our teeth into. It was a quick-hit, ‘top-down’ assessment of what the key information providers can realistically expect to see in terms of revenues from LEI-related data. The figure we came up with was an annual run-rate of between $350 million and $450 million on data alone (call us for details).

So, the last part of the answer to the ‘why do we care’ question is surely: how about close to half a billion dollars of new annual revenue potential? It’s enough to prick up most suppliers’ ears.

We think LEI represents an important development in the reference data marketplace. We’ll be watching proceedings with a keen eye, and expect a heightened level of our own LEI activity in the autumn.

Watch this space.

