A-Team Insight Blogs

ISO Standards for Legal Entity Data are More Important Than Those for Corporate Actions, Says PaceMetrics’ Giblin

The development of ISO standards for legal entity data should be prioritised over the push to get the industry to adopt ISO 20022 within the corporate actions world, according to Gerry Giblin, CEO of EDM vendor PaceMetrics. In the fifth in our series of talking heads on the challenges surrounding entity data management in the current market (see the previous one with Thomson Reuters’ Tim Rice), Giblin explains why the development of entity identification standards is critical for the market and why this must be done in a global context.

PaceMetrics has been targeting its own EDM solution, which initially focused on the pricing and valuations data management segment, at the wider reference data management space, including entity data management. Giblin, who became CEO in 2005, spoke to Reference Data Review back in May last year about why he reckons the combination of PaceMetrics’ lean data model and process-driven EDM approach sets the vendor apart from its competition. Here, he discusses the wider industry drivers for investment in managing entity data and his concerns around the development of a new entity identification standard.

How has the regulatory attention directed at the legal entity identification challenge in the post-Lehman environment impacted financial institutions’ practices with regards to this data?

In the post-crisis world, managing legal entity data has turned into one of the biggest challenges for financial institutions.

Most institutions run heavily siloed, purpose-built systems, each requiring its own entity data. There is no ISO standard for entity data, which in my opinion is far more important than the rather academic effort that went into ISO 15022 and ISO 20022; those standards gained traction simply because both DTCC and Swift promote them. Corporate actions can always be managed manually, which of course creates inefficiencies and cost issues. Legal entity data, however, bears directly on a financial institution’s performance as a ‘risk broker’: it affects the firm’s credit assessment and exposure, not just the drive for efficiency.

Many financial institutions are considering building an entity master, but the merits of that approach are debatable: adding one more silo to the rest, one that will mostly fit a single business purpose, is hardly an ideal solution. Providers such as Avox are also being considered (and wisely so), but again, what matters most is being in a position to monitor, centrally update and review the entity data.

Which regulations and compliance requirements are having the biggest impact on this area?

Regulations such as MiFID, the Transparency Directive, Basel III, Anti-Money Laundering (AML) and Know Your Customer (KYC) requirements are having the greatest impact, both in Europe and the US.

Given there is currently no industry standard legal entity identifier and the US regulator is looking at mandating its introduction as part of the Office of Financial Research, what impact will this likely have on the US market? And the rest of the world?

Regulation in the US and Europe is driving the adoption of a global identifier, and the OFR in particular is expediting these developments. Companies such as Avox are also pushing for an industry standard. The issue thus far, however, is that the various standards developed are suitable for domestic rather than international use. Unless the US regulators come up with a standard legal entity identifier that is accepted internationally, there will be no global adoption, and it will affect no one outside the US.

A number of options are on the table for such an identifier – Swift’s BIC, the S&P/Avox Cabre, a version of ISO’s IGI – what is your feeling for which will be selected as the most appropriate option and why?

I think none of the above is likely to be chosen as the standard. More probably, a new standard will be initiated, with the resulting complications for already complex systems. Having said that, the Avox Cabre seems better positioned than the others. Perhaps the next Basel accord will aim to establish a global identifier, since this impacts risk reporting.
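Of the candidate identifiers mentioned, Swift’s BIC is the one with a widely published structure (ISO 9362): a four-letter institution code, a two-letter country code, a two-character location code and an optional three-character branch code. As a rough illustration of what a purely structural check on such an identifier looks like, here is a minimal Python sketch; the function name and example codes are ours, and passing the check says nothing about whether a code is actually registered with Swift.

```python
import re

# ISO 9362 BIC structure: 4-letter institution code, 2-letter country
# code, 2-character alphanumeric location code, and an optional
# 3-character alphanumeric branch code (8 or 11 characters in total).
BIC_PATTERN = re.compile(r"^[A-Z]{4}[A-Z]{2}[A-Z0-9]{2}(?:[A-Z0-9]{3})?$")

def is_well_formed_bic(code: str) -> bool:
    """Check only the shape of the identifier, not its registration."""
    return bool(BIC_PATTERN.fullmatch(code.strip().upper()))
```

For example, the 8-character `DEUTDEFF` and the 11-character `DEUTDEFF500` both pass the shape test, while a 9-character string does not.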

How have counterparty risk management concerns impacted the underlying data management systems within systemically important financial institutions? What level of maturity is the market at with regards to the management of this data?

These concerns are typical of the wider concern about data quality and governance. Major investments are under way, or planned, at all the large institutions to deliver state-of-the-art, process-centric data management solutions. Counterparty data is seen as just another type of data that needs to be supported.

Are firms largely opting for a centralised approach towards dealing with this data or are the vertical silos across the different parts of an institution persisting?

What we see is that ‘just in case’ large, centralised databases of vendor data are giving way to ‘just in time’, process-driven data management. Under this approach, data delivery to the required processes is a key part of the overall data management strategy, and only the data required by downstream systems is collected and cleansed.
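The ‘just in time’ idea can be sketched in a few lines of Python. Everything here is hypothetical (the record, field names and process names are illustrative only): each downstream process declares the fields it needs, and only those fields are drawn from the vendor feed, rather than warehousing the full record ‘just in case’.

```python
# A hypothetical vendor record for one legal entity.
VENDOR_RECORD = {
    "entity_name": "Example Holdings Plc",
    "country_of_incorporation": "GB",
    "credit_rating": "A-",
    "swift_bic": "EXAMGB2L",
    "industry_code": "6419",
}

# Each downstream process declares the fields it actually consumes.
REQUIRED_FIELDS = {
    "credit_assessment": ["entity_name", "country_of_incorporation",
                          "credit_rating"],
    "settlement": ["entity_name", "swift_bic"],
}

def fetch_for_process(process: str) -> dict:
    """Return only the fields the named process has declared it needs."""
    return {f: VENDOR_RECORD[f] for f in REQUIRED_FIELDS[process]}
```

The point of the sketch is that the settlement process never sees (and the firm never has to cleanse) the credit rating field, because no declared requirement pulls it in.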

At the same time, if the challenge is handed to IT, localised warehouses are created with a so-called enterprise-wide function. In practice, however, these mostly fit only one business purpose.

Is there a degree of disparity in these practices between the buy side and the sell side? Large and small firms?

The large buy side firms are just behind the sell side, while smaller firms are lagging further behind. We also expect to see more disparity between IT-led and business-led organisations.

What trends do you expect to see over 2011 in terms of market practices in this space?

What we expect to see is firms increasingly accepting that data management should be defined by business process rather than being database-led. A process-driven approach will enable financial institutions to reduce costs through leaner infrastructure and also save time, as the solution is tailored to their unique data requirements.
