About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

A Single Standard for Legal Entity Data May Be a Long Way Off, Says Thomson Reuters’ Rice


The regulatory community may be keen for the industry to adopt standardised legal entity identifiers, but the adoption of a single suitable standard for these purposes may take some time. In the fourth in our series of talking heads on the challenges surrounding entity data management in the current market, Tim Rice, global head of pricing and reference data for data giant Thomson Reuters, explains why he thinks legal entity data mapping will persist for a long time to come.

Rice joined Thomson Reuters back in 2004 from Telerate to spearhead the development of Reuters’ DataScope reference data business segment. He now works with global head of Enterprise Content Roseann Palmieri in the recently restructured Enterprise division, which brings together the vendor’s market and reference data solutions. Rice has been largely focused on joining up the Enterprise platform with its various data content sets in the post-merger Thomson Reuters environment.

How has the regulatory attention directed at the legal entity identification challenge in the post-Lehman environment impacted financial institutions’ practices with regards to this data?

The majority of financial market participants are no strangers to the need to manage market data to address regulatory requirements. The increased focus on regulation post-2008 has brought the management of legal entity data to the forefront, with a sense of urgency and a complexity that have significantly surpassed previous requirements. We are in an environment where new regulations continue to emerge, lot sizes are declining, issuance is growing across many asset classes and instruments are becoming ever more complex; as a result, data, and the technology to normalise and manage that data, have become increasingly important.

In order to satisfy multiple regulatory requirements and have a clear understanding of the exact makeup of their downstream clients and portfolio holdings, investors are placing increased importance on standard naming and numbering conventions. This enables them to track complex, multiple holding company/subsidiary relationships, and create accurate linkages between legal entities and the securities held within their portfolios.

Which regulations and compliance requirements are having the biggest impact on this area?

It isn’t a single regulation or requirement, but rather the aggregate impact of all of them. The number of new regulations introduced as a result of recent legislation continues to multiply: know your customer (KYC), anti-money laundering, Basel II, UCITS III, MiFID and Dodd-Frank all impact our clients and create a need for transparency and an understanding of aggregated portfolio holdings.

Although some regulations are more geographically specific than others, it is clear that financial market participants must embrace several best practices in terms of how they store and manage market, reference and legal entity data. Firms that wish to compete in the emerging market will be well served by creating a single central database of record, through which they can maintain transparency and granularity across the portfolio, and understand the entire corporate structure and underlying holdings.

Given there is currently no industry standard legal entity identifier and the US regulator is looking at mandating its introduction as part of the OFR, what impact will this likely have on the US market? And the rest of the world?

When most institutions think about identifiers, they think well beyond legal entity data and look at understanding the exact makeup of their portfolio, including linking holdings and instruments to ultimate ownership. They go beyond the US market as well, since an examination of the corporate structure of most corporations shows that, even if they are not global, they have global holdings. While it might be easy to say that issuance or underwriting could be impacted by this tightening of regulatory disclosure and reporting requirements, the push for alpha and the continued need to prove shareholder value will remain the driving force.

Because the industry currently lacks a single legal entity identifier, financial institutions are increasingly building out their infrastructure to ensure that there is a mapping, or a dictionary, to create linkages across common identifiers of the same legal entity. This is not dissimilar to how the use of instrument identifiers has evolved, in that investors have put the technology and mappings in place to link multiple identifiers in a way that can be integrated into and effectively used by their workstreams.
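The cross-reference "dictionary" described above can be sketched as a simple lookup structure: each canonical entity record carries several identifier schemes, and an index resolves any (scheme, value) pair back to the same record. This is a minimal illustrative sketch only; the scheme names and all identifier values below are invented, and a production cross-reference would also handle hierarchy, history and data quality.

```python
# Hypothetical cross-reference table: each record links several identifiers
# that all refer to the same legal entity. All values are invented.
XREF_RECORDS = [
    {"bic": "EXAMGB2L", "cabre": "1000001", "internal": "LE-0001",
     "name": "Example Bank plc"},
    {"bic": "DEMOUS33", "cabre": "1000002", "internal": "LE-0002",
     "name": "Demo Securities Inc"},
]

def build_index(records):
    """Index every identifier so any scheme resolves to the same entity."""
    index = {}
    for rec in records:
        for scheme in ("bic", "cabre", "internal"):
            index[(scheme, rec[scheme])] = rec
    return index

def resolve(index, scheme, value):
    """Return the canonical record for an identifier, or None if unmapped."""
    return index.get((scheme, value))

index = build_index(XREF_RECORDS)
print(resolve(index, "cabre", "1000001")["name"])     # Example Bank plc
print(resolve(index, "bic", "DEMOUS33")["internal"])  # LE-0002
```

In this shape, downstream systems keep whatever identifier they already embed and translate at the boundary, which is the pragmatic alternative to replacing identifiers outright that Rice describes.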

A number of options are on the table for such an identifier – Swift’s BIC, the S&P/Avox Cabre, a version of ISO’s IGI etc – what is your feeling for which will be selected as the most appropriate option and why?

If we look at the history of symbology on the listed securities side, we’ve had 20 years of a standard called ISIN, yet many other proprietary identifiers (RIC, Sedol, Cusip, Valoren, Wertpapier) exist for the same body of instruments. Although increased regulation has put legal entity data into the spotlight, it is very early in its evolution. Legal entity data is far more fragmented than other, more established datasets, such as equities, so it is unrealistic to think that a single industry standard will emerge any time soon.

In my experience, it is highly improbable that a single system or identifier will emerge for any given business; each option has its advantages and drawbacks, and often they are embedded into the systems of various clients such that they cannot be replaced, but rather mapped to one another. Our approach has been to provide this mapping among the various identifying options, in order to facilitate ease of use for our clients.

How have counterparty risk management concerns impacted the underlying data management systems within systemically important financial institutions? What level of maturity is the market at with regards to the management of this data?

Since 2008 there has been a spike in recognition of the need to manage legal entity data. Prior to that point, the focus was more on alpha – understanding the underlying risk in the portfolio and the resultant returns.

The market for aligning and managing this data is still highly fragmented; not only does significant database management work have to occur, but the mappings among legal entities and instruments need to happen as well.

Are firms largely opting for a centralised approach towards dealing with this data or are the vertical silos across the different parts of an institution persisting?

The recommended framework for a large organisation is to go beyond the siloed approach and create a central repository for data; all of the downstream applications feed from this central database.

We often see our clients derive significant benefit from the creation of a single database of record for the organisation – all the downstream systems can then feed from that single database of record, the organisation authority (OA).

Is there a degree of disparity in these practices between the buy side and the sell side? Large and small firms?

The buy side and the sell side have very similar needs, in terms of having to know the exact makeup of their portfolios and the relationships in terms of corporate hierarchies. What is different are the workflows, and how they use these particular datasets. What also may be different are their relative sizes – the buy side players may often be smaller than some of the larger dealer/sell side players, which face significant integration issues. Sell side players, as well as smaller players on either side of the market, often seek out vertically integrated turnkey solutions, so as not to have to do all the mapping and data integration work and thereby incur the costs.

What trends do you expect to see over 2011 in terms of market practices in this space?

There is some question as to whether we will continue to see increased regulation or whether it will level off. It is fair to say that we’re all hoping for some return to normalcy, which is not unreasonable given market cycles and continued cost pressure. We have also seen more granularity in lot sizes, and increased transaction volume in certain asset classes, which suggests that our clients have returned to seeking returns for their shareholders, and begun to put the data management infrastructure in place to accomplish their objectives.
