A-Team Insight Blogs

Data Management Summit: Managing Entity Data with the Global LEI

The global Legal Entity Identifier (LEI) system is growing in terms of the number of operational pre-Local Operating Units (pre-LOUs) and the number of LEIs they have issued, but it remains incomplete and, in the absence of a Central Operating Unit (COU), continues to pose data management problems.

The status of the global LEI system and the use of LEIs to manage entity data was discussed during a panel session at last week’s A-Team Group Data Management Summit in New York. Contributing to the session, Beyond Dodd-Frank: Managing Entity Data with the Global LEI, were experts Marc Alvarez, senior director of reference data infrastructure at Interactive Data Corporation; Steve Goldstein, co-founder, chairman and CEO at Alacra; Scott Preiss, vice president and chief operating officer at CUSIP Global Services; and Ron Jordan, managing director and chief data officer of data services at DTCC.

Moderator Sarah Underwood, editor at A-Team Group, set the scene with a quick review of the development of the global LEI system before asking panel members to comment on its status and efficacy. Jordan noted that 265,000 LEIs have been issued globally, with 65% covering European entities and 27% covering US entities. Preiss added statistics on pre-LOUs, stating that 14 are in operation and more are planning to join the system. He also mentioned a letter from SIFMA encouraging greater adoption of the LEI in the US.

Turning to how the system is working, Alvarez said: “I think we are off to a good start. The LEI is not just about a standard for issuing 20-character codes, but also about metadata that describes information such as entity location. LEIs are also different because they work in a federated peer-to-peer model, which means there is a lot to reconcile, but if we get reconciliation right, we will introduce multiple levels of accuracy. We are broadly optimistic about the LEI and would like to go further with it, but these are early days.”
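To make the 20-character code concrete: under ISO 17442 an LEI ends in two check digits that can be verified with the ISO 7064 MOD 97-10 scheme. The short Python sketch below shows one way a data management team might screen incoming identifiers before reconciliation; it is an illustration only, and the accompanying entity metadata is not modelled here.

    # Minimal sketch of LEI check-digit validation (ISO 17442 / ISO 7064 MOD 97-10).
    def lei_is_valid(lei: str) -> bool:
        """Return True if a 20-character LEI passes the MOD 97-10 check."""
        lei = lei.strip().upper()
        if len(lei) != 20 or not lei.isalnum():
            return False
        # Replace each letter with its base-36 value (A=10 ... Z=35), keep digits,
        # then the resulting number modulo 97 must equal 1.
        numeric = "".join(str(int(c, 36)) for c in lei)
        return int(numeric) % 97 == 1

A feed handler could apply a check of this kind as records arrive from different pre-LOUs, rejecting malformed codes early rather than after they have propagated into downstream systems.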

Jordan added: “The Regulatory Oversight Committee of the global system includes 65 regulators and is overseeing developments until the COU is formed. It’s tough work as a federated model has pros and cons. Another challenge is politics, as pre-LOUs in different parts of the world operate under different jurisdictions. How do we balance the needs of regulators who want a global system with those of local data practitioners who want to format data according to their own local systems? A common data file format is being published for all pre-LOUs to use and it should help in creating a consolidated LEI file. I think it is the COU’s job to create the consolidated file, but the COU has not yet been formed so we don’t know if that will happen.”
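As an illustration of what a consolidated LEI file could involve, the sketch below merges per-pre-LOU extracts keyed on the LEI itself. The CSV layout and the column names LEI, LegalName and Country are assumptions made for the example; the common file format Jordan refers to is defined by the Regulatory Oversight Committee and the pre-LOUs, not here.

    import csv
    from pathlib import Path

    # Hypothetical consolidation of per-pre-LOU CSV extracts into one file,
    # de-duplicating on the LEI.
    def consolidate(pre_lou_files: list[Path], out_path: Path) -> None:
        records: dict[str, dict[str, str]] = {}
        for path in pre_lou_files:
            with path.open(newline="", encoding="utf-8") as f:
                for row in csv.DictReader(f):
                    records[row["LEI"]] = row  # later files win on duplicate LEIs
        with out_path.open("w", newline="", encoding="utf-8") as f:
            writer = csv.DictWriter(f, fieldnames=["LEI", "LegalName", "Country"],
                                    extrasaction="ignore")
            writer.writeheader()
            writer.writerows(records.values())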

Picking up on the federated element of the system, Preiss commented: “There is history showing how a federated model can work. The Association of National Numbering Agencies (ANNA) works on a local and global level generating ISINs and related data. This holds great promise for how the LEI could work.”

A quick show of hands in the audience revealed that few firms are using the LEI as yet, and that those that are struggle with poor data quality. On this issue, Goldstein said: “The problem is caused by setting up the COU after the fact. This is like playing basketball with no referee and then calling out the fouls after the game. There are challenges around the accuracy of some of the data and some records don’t seem to be validated at all. Funds, sub-funds, master funds and so on are each handled differently by different pre-LOUs.”

Jordan pointed out that part of the data quality problem is that 20% to 50% of entities get the details of their own information wrong in the first place. He commented: “Self-registration is fine, but we need validation.” Goldstein quipped: “25% of the entities that have registered an LEI are in the top 10 list of tax havens, so finding the correct documentation for them is really difficult.” On how the validation problem could be resolved, he said: “I think the COU should be required to make sure that somebody is validating the data, refreshing it and keeping it up to speed.”

While panel members agreed that use cases for the LEI are limited right now, except in Europe where the LEI must be used for reporting under the European Market Infrastructure Regulation (EMIR), Jordan suggested that the COU, once in place, may work with regulators to promote use of the entity identifier.

Alvarez suggested that while some firms are taking a tactical approach to the LEI and integrating it into their regulatory reporting workflows, it would take only one incident in the market for all firms to jump on the LEI bandwagon. The problem here, however, is the lack of LEI hierarchies, which means it is not yet possible to see a full picture of related entities.

Jordan added: “This is a journey and a global process. Firms are interested in the LEI partially because it is the first time a data set this valuable has been free – it is paid for by small surcharges on entities. The LEI suggests a novel business model including data that the industry is interested in. We have to walk before we can run, but once we have half a million or so entities registered, we can add immediate parents and push towards maintaining hierarchies. The LEI is already a big win for systemic risk regulators around the world.”
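If immediate-parent links were published alongside LEIs, as Jordan anticipates, walking the chain from an entity to its ultimate parent would become a simple lookup. The sketch below assumes a hypothetical map from each LEI to its immediate parent’s LEI; no such consolidated map existed at the time of the discussion.

    # Hypothetical walk from an entity to its ultimate parent via immediate-parent links.
    def ultimate_parent(lei: str, immediate_parent: dict[str, str]) -> str:
        seen: set[str] = set()
        while lei in immediate_parent and lei not in seen:
            seen.add(lei)          # guard against cycles in bad data
            lei = immediate_parent[lei]
        return lei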

On the final question of what advice panel members would give to data practitioners integrating the LEI, both Goldstein and Alvarez said they should outsource the problem to a data vendor. Preiss concluded: “Become a leader within your organisation and understand how the LEI will need to be used. For some financial firms the LEI makes no sense at the moment, but all firms need a coherent view of the LEI and everyone needs to understand that view.”
