The leading knowledge platform for the financial technology industry

Data Management Summit: Managing Entity Data with the Global LEI

The global Legal Entity Identifier (LEI) system is growing in terms of the number of operational pre-Local Operating Units (pre-LOUs) within the system and the number of LEIs they have issued, but it remains incomplete and continues to pose data management problems without a Central Operating Unit (COU).

The status of the global LEI system and the use of LEIs to manage entity data was discussed during a panel session at last week’s A-Team Group Data Management Summit in New York. Contributing to the session, Beyond Dodd-Frank: Managing Entity Data with the Global LEI, were experts Marc Alvarez, senior director of reference data infrastructure at Interactive Data Corporation; Steve Goldstein, co-founder, chairman and CEO at Alacra; Scott Preiss, vice president and chief operating officer at CUSIP Global Services; and Ron Jordan, managing director and chief data officer of data services at DTCC.

Moderator Sarah Underwood, editor at A-Team Group, set the scene with a quick review of the development of the global LEI system before asking panel members to comment on its status and efficacy. Jordan noted that 265,000 LEIs have been issued globally, with 65% covering European entities and 27% covering US entities. Preiss added statistics on pre-LOUs, stating that 14 are in action and more are planning to join the system. He also mentioned a letter from SIFMA encouraging greater adoption of the LEI in the US.

Turning to how the system is working, Alvarez said: “I think we are off to a good start. The LEI is not just about a standard for issuing 20-character codes, but also about metadata that describes information such as entity location. LEIs are also different because they work in a federated peer-to-peer model, which means there is a lot to reconcile, but if we get reconciliation right, we will introduce multiple levels of accuracy. We are broadly optimistic about the LEI and would like to go further with it, but these are early days.”
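The 20-character code Alvarez refers to is defined by ISO 17442: 18 alphanumeric characters followed by two check digits computed with the ISO 7064 MOD 97-10 scheme, the same check used in IBANs. A minimal structural validity check can be sketched as follows (this is an illustration of the checksum rule, not an official GLEIF tool):

```python
def is_valid_lei(lei: str) -> bool:
    """Check an LEI's basic structure and ISO 7064 MOD 97-10 check digits."""
    # An LEI is exactly 20 uppercase alphanumeric characters.
    if len(lei) != 20 or not lei.isalnum() or lei != lei.upper():
        return False
    # Map letters to two-digit numbers (A=10 ... Z=35), keep digits as-is,
    # then test the whole number modulo 97 — valid LEIs leave remainder 1.
    numeric = "".join(str(int(c, 36)) for c in lei)
    return int(numeric) % 97 == 1
```

A check like this only confirms that a code is well-formed; it says nothing about whether the entity data behind it has been validated, which is exactly the gap the panel goes on to discuss.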

Jordan added: “The Regulatory Oversight Committee of the global system includes 65 regulators and is overseeing developments until the COU is formed. It’s tough work as a federated model has pros and cons. Another challenge is politics, as pre-LOUs in different parts of the world operate under different jurisdictions. How do we balance the needs of regulators who want a global system with those of local data practitioners who want to format data according to their own local systems? A common data file format is being published for all pre-LOUs to use and it should help in creating a consolidated LEI file. I think it is the COU’s job to create the consolidated file, but the COU has not yet been formed so we don’t know if that will happen.”
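The consolidated LEI file Jordan describes would have to merge records issued by many pre-LOUs and surface any disagreements between them. A minimal sketch of that merge-and-reconcile step, using hypothetical field names since the common data file format is not detailed here:

```python
from typing import Iterable

def consolidate(pre_lou_files: Iterable[list[dict]]) -> tuple[dict, list]:
    """Merge LEI records from several pre-LOU files into one view keyed by
    LEI, collecting records that conflict across sources for review."""
    consolidated: dict[str, dict] = {}
    conflicts: list[tuple[str, dict, dict]] = []
    for records in pre_lou_files:
        for rec in records:
            lei = rec["lei"]
            seen = consolidated.get(lei)
            if seen is None:
                consolidated[lei] = rec
            elif seen != rec:
                # The same LEI appears with different details in two
                # sources — it must be reconciled, not silently overwritten.
                conflicts.append((lei, seen, rec))
    return consolidated, conflicts
```

The design point is the one the panel raises: in a federated model, consolidation is only as trustworthy as the reconciliation of the conflicts it turns up.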

Picking up on the federated element of the system, Preiss commented: “There is history showing how a federated model can work. The Association of National Numbering Agencies (ANNA) works on a local and global level generating ISINs and related data. This holds great promise for how the LEI could work.”

A quick show of hands in the audience demonstrated few firms using the LEI as yet and those few struggling with poor data quality. On this issue, Goldstein said: “The problem is caused by setting up the COU after the fact. This is like playing basketball with no referee and then calling out the fouls after the game. There are challenges around the accuracy of some of the data and some records don’t seem to be validated at all. Funds, sub-funds, master funds and so on are each handled differently by different pre-LOUs.”

Jordan pointed out that part of the data quality problem is that 20% to 50% of entities get the details of their own information wrong in the first place. He commented: “Self-registration is fine, but we need validation.” Goldstein quipped: “25% of the entities that have registered an LEI are in the top 10 list of tax havens, so finding the correct documentation for them is really difficult.” On how the validation problem could be resolved, he said: “I think the COU should be required to make sure that somebody is validating the data, refreshing it and keeping it up to speed.”

While panel members agreed that the use case for LEIs is pretty limited right now, except in Europe where the LEI must be used for reporting under the European Market Infrastructure Regulation (EMIR), Jordan suggested that the COU, once in place, may work with regulators to promote use of the entity identifier.

Alvarez suggested that while some firms are taking a tactical approach to the LEI and are integrating it in their regulatory reporting workflows, it would only take one incident in the market for all firms to jump on the LEI bandwagon. The problem here, however, is a lack of LEI hierarchies, which means it is not yet possible to see a full picture of related entities.

Jordan added: “This is a journey and a global process. Firms are interested in the LEI partially because it is the first time a data set this valuable has been free – it is paid for by small surcharges on entities. The LEI suggests a novel business model including data that the industry is interested in. We have to walk before we can run, but once we have half a million or so entities registered, we can add immediate parents and push towards maintaining hierarchies. The LEI is already a big win for systemic risk regulators around the world.”

On the final question of what advice panel members would give to data practitioners integrating the LEI, both Goldstein and Alvarez said they should outsource the problem to a data vendor. Preiss concluded: “Become a leader within your organisation and understand how the LEI will need to be used. For some financial firms the LEI makes no sense at the moment, but all firms need a coherent view of the LEI and everyone needs to understand that view.”
