Despite Lack of Standards, Legal Entity Data Will Become Centre of Data Operations

Although standards for legal entity data are likely to evolve and become more rigorous over time, no standard directory of identifiers for legal entities across global jurisdictions exists today, said James Redfern, head of sales and marketing at CounterpartyLink. But with an average of 27% of the company records held at financial institutions deemed inaccurate, firms need to work out how to fix these problems and then maintain the database in the absence of any industry standard, particularly in current conditions, he suggested.

Redfern said, “The entity is the key element in the middle; it will become the centre of data operations.” Key to managing entity data is getting the linkages right, he said, referring both to the linking of entity data, which can quickly become complex, and to the linking of the disparate sources used to gather that information, be they registration authorities, regulators, exchanges or other sources. Said Redfern, “But the linkages are rendered worthless if the data it is linked to is inaccurate or not fit for purpose.”
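To make the point concrete, the sketch below shows one possible shape for an entity record that carries both kinds of linkage: ownership links to other entities and provenance links to the sources the data was drawn from. The field names, Python structure and example source types are illustrative assumptions, not CounterpartyLink’s actual data model.

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class SourceReference:
    """Provenance linkage: where a data point came from and when it was last checked."""
    source_name: str    # e.g. a registration authority, regulator or exchange (illustrative)
    source_type: str    # "registration authority" | "regulator" | "exchange" | "other"
    last_verified: str  # ISO date of the most recent verification

@dataclass
class LegalEntity:
    """A legal entity record with ownership and provenance linkages (hypothetical model)."""
    internal_id: str                  # firm's own key; no global standard identifier assumed
    legal_name: str
    jurisdiction: str
    registered_address: str
    regulator: Optional[str] = None
    parent_id: Optional[str] = None   # ownership linkage to another LegalEntity
    identifiers: dict = field(default_factory=dict)  # e.g. locally used codes, no standard assumed
    sources: list = field(default_factory=list)      # list of SourceReference

# The linkages only add value if the underlying fields are accurate: a wrong
# registered_address or a stale ownership record undermines the whole chain.
```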

He promoted CounterpartyLink’s Client Data Audit Report as an independent auditing service that can support business cases put to senior management. The audit can also be very useful for prioritising cleansing and maintenance work, for example by tackling higher-risk entities before those with lower risk or less exposure.

Through conducting such audits for clients, Redfern said that the most common areas for data impurities were: ownership (12%), company name (8%), registered address/headquarters (7%), regulator (6%), registration (5%), and identifiers (4%).
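One simple way to act on such audit findings, along the prioritisation lines mentioned above, is to rank records for remediation by weighting each flagged field and scaling by exposure to the counterparty. The sketch below is illustrative only: the weights loosely mirror the error rates quoted above, and the record structure, entity names and exposure figures are assumptions for the example.

```python
# Illustrative weights, loosely based on the audit error rates cited above.
FIELD_WEIGHTS = {
    "ownership": 0.12,
    "company_name": 0.08,
    "registered_address": 0.07,
    "regulator": 0.06,
    "registration": 0.05,
    "identifiers": 0.04,
}

def cleansing_priority(flagged_fields, exposure):
    """Higher score = fix sooner: weight the flagged fields, then scale by exposure."""
    severity = sum(FIELD_WEIGHTS.get(f, 0.0) for f in flagged_fields)
    return severity * exposure

# Hypothetical counterparty records flagged by an audit.
records = [
    {"entity": "Alpha Holdings Ltd", "flags": {"ownership", "identifiers"}, "exposure": 25_000_000},
    {"entity": "Beta Capital LLC", "flags": {"registered_address"}, "exposure": 2_000_000},
]

# Work the highest-priority records first.
for rec in sorted(records, key=lambda r: cleansing_priority(r["flags"], r["exposure"]), reverse=True):
    print(rec["entity"], cleansing_priority(rec["flags"], rec["exposure"]))
```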

At least one senior member of the US Federal Reserve had highlighted entity data as a key ‘broken’ factor in risk assessment, perhaps indicating a likelihood of further examination of the issue and potential regulation down the line. But as Redfern pointed out, “It is beneficial to have standards, but business will continue without them.”
