GoldenSource Sets out LEI Store and Suggests an Entity Data Utility to Commoditise Compliance

GoldenSource is addressing the arrival of legal entity identifiers (LEIs) with its existing enterprise data management (EDM) software and the suggestion of a data utility that could commoditise compliance with regulations based on the LEI.

The company has made its software LEI ready by allowing users to add LEI codes as additional or primary entity identifiers and by extending its Connections managed service to include LEI information from market data vendors. Existing GoldenSource users will not need to upgrade their software and new customers will have immediate access to LEI support.

Neill Vanlint, managing director of GoldenSource EMEA and Asia, explains: “Cross-referencing identifiers to one identifier code has always been one of our core value propositions, so for us, the LEI is another cross-referencing exercise. Momentum is building behind the LEI and we can provide a bridge from the old fragmented world to the new world of the LEI by mapping existing identifiers to the LEI.”
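
To make the cross-referencing exercise concrete, the sketch below shows several legacy identifiers for the same counterparty resolving to a single LEI. This is a minimal illustration, not GoldenSource's implementation; the identifier schemes, sample codes and the 20-character LEI value are hypothetical placeholders.

```python
# Illustrative cross-reference table: each legacy identifier (scheme, value)
# maps to one LEI. All codes below are made-up placeholders.
XREF = {
    ("BIC", "ACMEGB2L"): "5299001EXAMPLELEI001",
    ("INTERNAL", "CPTY-000123"): "5299001EXAMPLELEI001",
    ("VENDOR_ID", "456789"): "5299001EXAMPLELEI001",
}

def resolve_lei(scheme: str, value: str) -> str | None:
    """Return the LEI mapped to a legacy identifier, or None if not yet mapped."""
    return XREF.get((scheme, value))

print(resolve_lei("BIC", "ACMEGB2L"))       # 5299001EXAMPLELEI001
print(resolve_lei("INTERNAL", "CPTY-999"))  # None: not yet bridged to an LEI
```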

Vanlint notes an uptick in demand for counterparty data consolidation and on-boarding solutions as a result of the emerging LEI, and says broker-dealers are embracing the LEI standard to build Know Your Customer (KYC) systems. Universal banks, meanwhile, are developing LEI programmes not only for compliance reasons, but also to increase profitability and margins through improved risk and capital management.

“In universal banks with multiple businesses, it tends to be the capital markets business that is leading the charge, selling the idea of a shared entity data service internally in exchange for funding. A centralised, shared data service allows all business lines to use the same entity data and shares costs,” he says.

Getting rid of duplicated data can be the biggest effort involved in building a shared entity data service, and there are no shortcuts. Vanlint’s experience is that customers take a hybrid approach to solving the problem, tackling a couple of iterations of data cleansing themselves before bringing in a company such as Avox to complete the task. The LEI, he suggests, will ease the de-duplication problem caused by the many free-text fields associated with previous identifiers, as it will be based on a standardised code with mandated hierarchy and relationship data.
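
To illustrate why a standardised code eases the problem, consider the hypothetical records below: three entries that differ only in free-text name fields collapse to a single entity once they are keyed on a shared LEI. Field names and values are assumptions for illustration only.

```python
# Three records for the same counterparty, as they might appear after
# sourcing from different systems. The LEI value is a made-up placeholder.
records = [
    {"name": "ACME Holdings PLC",   "lei": "5299001EXAMPLELEI001", "country": "GB"},
    {"name": "Acme Holdings plc.",  "lei": "5299001EXAMPLELEI001", "country": "GB"},
    {"name": "Acme Hldgs (London)", "lei": "5299001EXAMPLELEI001", "country": "GB"},
]

# Keying on the LEI, rather than fuzzy-matching textual fields, reduces
# three apparent counterparties to one golden record.
deduped = {}
for rec in records:
    deduped.setdefault(rec["lei"], rec)

print(len(records), "->", len(deduped))  # 3 -> 1
```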

GoldenSource does not promote a big bang approach to implementing the LEI, preferring a transitional approach from internal proprietary identifiers, to pre-LEIs – such as the German pre-LEI issued by WM Datenservice on behalf of the German regulator Bundesanstalt für Finanzdienstleistungsaufsicht (BaFin) and the CFTC Interim Compliant Identifier (CICI) issued by DTCC on behalf of the US Commodity Futures Trading Commission – and on to the LEI itself. Proprietary identifiers and pre-LEIs will be mapped to LEIs as they emerge, eventually tipping the balance of identifiers in favour of LEIs.
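
A minimal sketch of that transitional logic is shown below: prefer a mapped LEI where one exists, fall back to a pre-LEI such as a CICI, and only then use a proprietary code. The scheme names, precedence order and sample values are illustrative assumptions, not a prescribed GoldenSource workflow.

```python
# Precedence used when choosing which identifier to publish for an entity
# during the transition period. All names and codes are hypothetical.
PRECEDENCE = ("LEI", "PRE_LEI", "PROPRIETARY")

def best_identifier(identifiers: dict[str, str]) -> tuple[str, str]:
    """Return the highest-precedence identifier currently held for an entity."""
    for scheme in PRECEDENCE:
        if scheme in identifiers:
            return scheme, identifiers[scheme]
    raise ValueError("entity has no identifiers at all")

entity = {"PROPRIETARY": "CPTY-000123", "PRE_LEI": "ZZZZCICIEXAMPLE00042"}
print(best_identifier(entity))  # ('PRE_LEI', ...) until a full LEI is mapped
```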

Turning to the proposition of commoditised compliance, Vanlint says: “Many firms do the same things with data and could integrate with a shared data utility once the LEI arrives. I expect some large players to get together to share, de-duplicate and cleanse data. The cost of managing the data will be shared and it will be much lower than it is today.”

He suggests a utility could be agreed in principle by a group of large banks and that it would then need a disinterested operator, a platform and an authority to oversee its operation. The operator could be an exchange, clearing house or joint venture, and the platform could come from GoldenSource. Regional utilities could be a good start as local companies have the best data around their clients, but Vanlint does not rule out the ultimate possibility of a global utility.

He says: “The data is public, but in many places, and all firms are dealing with it, so there is a compelling case for a utility. Anyone trying to save costs must come to the conclusion that it is a good idea.” A utility would need to support an open and end-to-end workflow for client on-boarding and KYC, and would take in entity data from market data vendors as well as data such as credit risk and credit ratings to enrich the entity data. “A utility could house all the identifiers known to participants and map all of them to LEIs. The objective would be to move towards all identifiers being LEIs, although that will take some years,” says Vanlint.
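
As a rough illustration of the enrichment step Vanlint describes, the sketch below merges credit attributes onto a vendor-sourced entity record keyed on its LEI. The sources, field names and values are invented for the example.

```python
# A vendor-sourced entity record and a separate credit dataset, both keyed
# on the same (made-up) LEI, merged into one enriched record.
vendor_entity = {"lei": "5299001EXAMPLELEI001", "name": "Acme Group Holding",
                 "country": "GB"}
credit_data = {"5299001EXAMPLELEI001": {"rating": "A-", "outlook": "stable"}}

def enrich(entity: dict, ratings: dict) -> dict:
    """Merge rating attributes onto the entity record, matching on LEI."""
    enriched = dict(entity)
    enriched.update(ratings.get(entity["lei"], {}))
    return enriched

print(enrich(vendor_entity, credit_data))
```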

GoldenSource is bullish about its ability to underpin the platform of an LEI data utility and is keen to talk to firms that are interested in the concept. Vanlint explains: “GoldenSource has evolved a data model that can correlate fragmented data and complex hierarchies, and produce consistent data. The software also has a workflow component, the capability to map to many content providers and an application programming interface to the outside world. It automates data publishing to specific systems and tailors workflow to make data available quickly to consuming systems.”
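
The hierarchy handling Vanlint refers to can be pictured as a simple parent-link structure, as in the sketch below. The class and field names are hypothetical and do not represent the GoldenSource data model.

```python
# Each legal entity carries a link to its parent, so exposure or reporting
# can be rolled up to the ultimate parent of a group. Codes are placeholders.
from dataclasses import dataclass

@dataclass
class Entity:
    lei: str
    name: str
    parent_lei: str | None = None  # None marks the ultimate parent

entities = {
    "5299001EXAMPLELEI001": Entity("5299001EXAMPLELEI001", "Acme Group Holding"),
    "5299001EXAMPLELEI002": Entity("5299001EXAMPLELEI002", "Acme Bank Ltd",
                                   parent_lei="5299001EXAMPLELEI001"),
    "5299001EXAMPLELEI003": Entity("5299001EXAMPLELEI003", "Acme Securities Inc",
                                   parent_lei="5299001EXAMPLELEI001"),
}

def ultimate_parent(lei: str) -> Entity:
    """Walk parent links until the top of the hierarchy is reached."""
    entity = entities[lei]
    while entity.parent_lei is not None:
        entity = entities[entity.parent_lei]
    return entity

print(ultimate_parent("5299001EXAMPLELEI003").name)  # Acme Group Holding
```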

While the concept of a data utility for commoditised compliance remains on the table, the reality of regulation that uses the LEI, such as the European Market Infrastructure Regulation (EMIR), which comes into force in September 2013, is expected to boost business at GoldenSource.

“The LEI will change operational processes. In the past, both risk and compliance departments have owned and operated systems for functions such as client on-boarding. With a shared data service they will both be shareholders in the data management process. From an operational point of view this is a major shift and from a business perspective data management efficiency will help achieve cost savings. As firms understand that a shared service is the way to go, they will beat a path to our door,” concludes Vanlint.
