Progress in the development of a global legal entity identifier (LEI) system, including the issuance of 83,000 pre-LEIs by two active pre-Local Operating Units (pre-LOUs), is acting as a catalyst for change, with many financial institutions investing heavily to get their entity data in order. The process should prove beneficial in terms of improved data accuracy, better customer relationships and a clearer picture of risk, but it is not without challenges as firms consider how to manage existing data, add LEI entity data, and achieve and sustain data quality. Then there is the catch that final standards for the global LEI system have yet to be detailed by the LEI Regulatory Oversight Committee (ROC).
These issues and more were debated this week during a lively A-Team Group webinar, The Global LEI and Entity Data Quality, hosted by A-Team Group Editor-in-Chief Andrew Delaney, which followed closely on the heels of two statements from the ROC. The first notes the addition of two pre-LOUs – the London Stock Exchange, sponsored by the UK Financial Conduct Authority, and the Dutch Chamber of Commerce, sponsored by the Netherlands Authority for the Financial Markets – bringing the total number of pre-LOUs recognised by the ROC to nine. While only DTCC/Swift, working on behalf of the US Commodity Futures Trading Commission (CFTC), and WM Datenservice, working on behalf of Germany’s Bundesanstalt für Finanzdienstleistungsaufsicht, are already issuing pre-LEIs, others are expected to follow shortly, with the London Stock Exchange likely to come onstream within weeks.
The ROC’s second statement reports agreement on how the LEI Foundation, which will support the Central Operating Unit (COU), should be governed, and notes processes for endorsement and conditional endorsement of pre-LOUs and pre-LEIs by the ROC. The ROC will release details on how to achieve endorsement by the end of June and expects to make its first endorsement decisions this summer.
Delaney was joined in the webinar by Peter Warms, head of product development for global data and symbology at Bloomberg Data Solutions; Stuart Harvey, director at Datactics; and Stephen Engdahl, senior vice president of product strategy at GoldenSource. His initial question covered the state of play in the development of the global LEI system. Warms noted progress made by the ROC and suggested the COU initially planned for this year is more likely to emerge early next year and will be critical to data quality, as one of its roles will be to manage links between LOUs. He added: “Many of our clients have carved out a place for LEIs in their databases and some are developing entity databases for the first time. Spending on data management next year will focus on the LEI and regulatory requirements.”
Harvey commented: “We are spending time with clients at the coal face, taking in new data from source. As it is new data it is good quality, but firms need to understand where entity data lives and breathes in client systems and reconsider data in geographic and asset class silos. This is an opportunity to re-engineer data management systems and get ready for the LEI as well as regulatory reporting requirements.”
Turning to data quality, Delaney asked the webinar participants whether problems were being posed by pre-LOUs and pre-LEIs. Harvey cited reports of 600 duplicate entities in the DTCC/Swift database of pre-LEI CFTC Interim Compliant Identifiers, or CICIs, but said the duplicate entries make up less than 1% of the total registered entities and are not a major challenge to data quality.
Engdahl dissected the problem of a disconnect between multiple LOUs, explaining: “The process for LOUs should make sure duplicate LEIs are not issued, but without central coordination there is a risk that a single entity could be given LEIs from multiple LOUs. With a small number of LOUs, this can be coordinated, but it will be difficult when more LOUs issue LEIs.” Engdahl also suggested it would be difficult for financial institutions to build their own LEI datasets and instead expects them to source datasets from data vendors that map LEIs to existing identifiers. This will leave firms with the task of data maintenance, about which he commented: “This is not a one-time effort to get data into a database, the need is for all LEI information to be kept up to date.”
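The duplicate-registration risk the panellists describe can be illustrated with a rough sketch of the kind of screening a consolidator might run across LOU files. Assuming each pre-LOU's records can be reduced to (identifier, legal name) pairs, a crude name normalisation flags obvious candidate duplicates for manual review; the suffix list and sample names below are purely illustrative, not part of any LOU specification.

```python
import re
from collections import defaultdict

# Common corporate suffixes to ignore when matching (illustrative list).
CORPORATE_SUFFIXES = {"ltd", "limited", "inc", "plc", "gmbh", "ag", "llc", "corp"}

def normalise_name(name: str) -> str:
    """Crude matching key: lowercase, strip punctuation and common
    corporate suffixes. Real entity matching is far more involved."""
    cleaned = re.sub(r"[^\w\s]", " ", name.lower())
    tokens = [t for t in cleaned.split() if t not in CORPORATE_SUFFIXES]
    return " ".join(tokens)

def potential_duplicates(records):
    """records: iterable of (identifier, legal_name) pairs, possibly
    drawn from several LOU files. Returns groups of identifiers that
    share a normalised name and so warrant manual review."""
    groups = defaultdict(list)
    for identifier, name in records:
        groups[normalise_name(name)].append(identifier)
    return {name: ids for name, ids in groups.items() if len(ids) > 1}
```

For example, records such as ("ID1", "Acme Ltd.") and ("ID2", "ACME Limited") from two different files would be grouped together, while "Example Bank plc" would pass through unflagged. In practice this kind of matching is only a first pass; it is exactly the sort of coordination work the panellists expect the COU to take on centrally.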
Taking one of a number of questions from the webinar audience, Delaney asked the panel to forecast the number of entities that will be covered by pre-LEIs by the end of this year. Warms responded that, with about 83,000 pre-LEIs already issued, he expects the total to reach about 100,000 by the end of the year.
Returning to the issue of data quality, Harvey outlined the need to update systems and tackle challenges including data formats, normalisation, and multiple languages and character sets. Warms picked up Engdahl’s point on the potential problems that could be caused by multiple LOUs, saying: “The establishment of the COU is critical to LOU coordination. We are putting LEI data into our databases, but the more active LOUs there are with their own local needs the harder it becomes to extrapolate across all of them. Our fear is that if there is a flawed data model with different fields and no way to consolidate LEI data then it may be necessary for us to map to pre-LEIs from all nine pre-LOUs.”
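One basic quality check firms can apply locally comes from the LEI format itself: under ISO 17442, an LEI is a 20-character alphanumeric code whose final two characters are check digits computed with the ISO 7064 MOD 97-10 scheme (the same scheme used for IBANs). A minimal validation sketch:

```python
def lei_check_digits(base18: str) -> str:
    """Compute the two ISO 7064 MOD 97-10 check digits for an
    18-character LEI prefix. Each character is mapped to digits
    (0-9 map to themselves, A=10 ... Z=35), '00' is appended, and
    the check value is 98 minus the remainder mod 97."""
    numeric = "".join(str(int(c, 36)) for c in base18.upper() + "00")
    return f"{98 - int(numeric) % 97:02d}"

def is_valid_lei(lei: str) -> bool:
    """True if lei is 20 ASCII alphanumeric characters and the full
    code, converted to digits as above, is congruent to 1 mod 97."""
    lei = lei.strip().upper()
    if len(lei) != 20 or not (lei.isascii() and lei.isalnum()):
        return False
    numeric = "".join(str(int(c, 36)) for c in lei)
    return int(numeric) % 97 == 1
```

A check like this catches transcription errors and malformed codes at the point of entry, but it says nothing about whether the LEI actually refers to the right entity; that is the reference-data maintenance problem the panellists discuss.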
While the ROC has yet to provide guidance on data consolidation, which is likely to be within the remit of the COU, Warms noted that the DTCC/Swift and WM Datenservice pre-LOUs are working on consolidation.
Turning to the initial concept of the LEI as a means of managing risk, Delaney raised the issue of the LEI not being a solution in itself, with the solution being in how it is used. Engdahl said: “Firms are investing in cleaning data and regulatory compliance, but how will they get a return on investment from standardising and aggregating data? The need is to use the LEI to support better risk and exposure practices, and deliver more timely and accurate information. If data is aggregated in firms and across the industry all firms will have a better picture of customer relationships and will be able to be more competitive.”
Harvey added: “The difficulty and cost of implementing the LEI depends on how robust client systems are. Some firms will be able to just add the additional data and others will have to do a lot of work to reengineer data systems, but it will be worthwhile as it will be possible to aggregate data to deliver better intra-day risk reports and improve analytics.”
Summing up progress in the development of the global LEI system and its prospects, Warms concluded: “My forecast is that critical entities will be registered and issued with LEIs this year. The system will then expand beyond today’s identifiers that are neither global nor comprehensive, but until the LEI is global and widespread it can’t be a key identifier. This will take time, but much has been done since the LEI initiative started 18 months ago.”
A-Team Group will be back with more reports on the global LEI system as it is rolled out. We are also working on an LEI KnowledgeHub that will be up and running in September with answers to all your questions on how to implement and use LEIs. Meantime, you can download special reports on this hot topic here.