
ISO Standards for Legal Entity Data are More Important Than Those for Corporate Actions, Says PaceMetrics’ Giblin


The development of ISO standards for legal entity data should be prioritised over the push to get the industry to adopt ISO 20022 within the corporate actions world, according to Gerry Giblin, CEO of EDM vendor PaceMetrics. In the fifth in our series of talking heads on the challenges surrounding entity data management in the current market (see the previous one with Thomson Reuters’ Tim Rice), Giblin explains why the development of entity identification standards is critical for the market and why this must be done in a global context.

PaceMetrics has been targeting its own EDM solution, which initially focused on the pricing and valuations data management segment, at the wider reference data management space, including entity data management. Giblin, who became CEO in 2005, spoke to Reference Data Review in May last year about why he believes the combination of PaceMetrics’ lean data model and process-driven EDM approach sets the vendor apart from its competition. Here, he discusses the wider industry drivers for investment in managing entity data and his concerns about the development of a new entity identification standard.

How has the regulatory attention directed at the legal entity identification challenge in the post-Lehman environment impacted financial institutions’ practices with regards to this data?

In the post-crisis world, managing legal entity data has turned into one of the biggest challenges for financial institutions.

Most institutions have purpose-built, heavily siloed systems, each requiring its own entity data. There is no ISO format for entity data, a gap that in my opinion is far more important than the rather academic consideration that went into ISO 15022 and ISO 20022, which get attention simply because they are promoted by both DTCC and Swift. Corporate actions can always be managed manually, which of course creates inefficiencies and cost issues. Legal entity data, however, directly affects a financial institution’s performance as a ‘risk broker’: it bears on the company’s credit assessment and exposure, not just on the drive for efficiency.

Many financial institutions are considering building an entity master, but the merits of such a system are debatable. Adding one more silo to the rest, one that will mostly fit a single business purpose, does not constitute an ideal solution. Providers such as Avox are also being considered (and wisely so), but again, the most important thing is being in a position to monitor, centrally update and review the entity data.
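To make the silo problem concrete, here is a minimal sketch (all names and identifiers are hypothetical) of the kind of entity master Giblin alludes to: a single golden record that cross-references the local identifiers each siloed system uses for the same legal entity, so any silo’s ID resolves to one centrally maintained record.

```python
from dataclasses import dataclass, field

@dataclass
class EntityRecord:
    """One legal entity, with cross-references into each siloed system."""
    legal_name: str
    country: str
    # Maps a silo name (e.g. "kyc", "settlement") to that silo's local ID.
    silo_ids: dict[str, str] = field(default_factory=dict)

class EntityMaster:
    """Central store that resolves any silo's local ID to one golden record."""
    def __init__(self) -> None:
        self._index: dict[tuple[str, str], EntityRecord] = {}

    def add(self, record: EntityRecord) -> None:
        for silo, local_id in record.silo_ids.items():
            self._index[(silo, local_id)] = record

    def resolve(self, silo: str, local_id: str) -> EntityRecord | None:
        """Look up the golden record from any silo's identifier."""
        return self._index.get((silo, local_id))

master = EntityMaster()
master.add(EntityRecord(
    legal_name="Example Bank plc",
    country="GB",
    silo_ids={"kyc": "K-0042", "settlement": "S-9917"},
))
print(master.resolve("settlement", "S-9917").legal_name)  # Example Bank plc
```

Without the central index, each silo would hold its own copy of the record, which is exactly the duplication and drift problem described above.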

Which regulations and compliance requirements are having the biggest impact on this area?

Regulations such as MiFID, the Transparency Directive, Basel III, and Anti-Money Laundering (AML) and Know Your Customer (KYC) requirements are having the greatest impact, both in Europe and the US.

Given there is currently no industry standard legal entity identifier and the US regulator is looking at mandating its introduction as part of the Office of Financial Research, what impact will this likely have on the US market? And the rest of the world?

Regulation in the US and Europe is driving the adoption of a global identifier; the OFR in particular is expediting these developments. Companies such as Avox are also pushing for an industry standard. The issue thus far, however, is that the various standards developed are suitable for domestic, not international, use. Unless the US regulators come up with a standard legal entity identifier that is accepted internationally, there will be no global adoption, nor will it affect anyone outside the US.

A number of options are on the table for such an identifier – Swift’s BIC, the S&P/Avox Cabre, a version of ISO’s IGI – what is your feeling for which will be selected as the most appropriate option and why?

I think that none of the above is likely to be chosen as the standard. It is most likely that a new standard will be initiated, with resulting complications for already complex systems. Having said that, the Avox Cabre seems better positioned than the others. Perhaps the next Basel accord will aim to establish a global identifier, since this impacts risk reporting.
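For context on one of these candidates: Swift’s BIC (ISO 9362) is a fixed-layout code of 8 or 11 characters, comprising a four-letter institution code, a two-letter country code, a two-character alphanumeric location code and an optional three-character branch code. A minimal sketch of a structural check follows; it validates the layout only, not whether the code actually exists in the Swift directory.

```python
import re

# ISO 9362 layout: 4-letter institution code, 2-letter country code,
# 2-character alphanumeric location code, optional 3-character branch code.
BIC_PATTERN = re.compile(r"[A-Z]{4}[A-Z]{2}[A-Z0-9]{2}([A-Z0-9]{3})?")

def is_well_formed_bic(bic: str) -> bool:
    """Check the structure only; a real check needs the Swift directory."""
    return BIC_PATTERN.fullmatch(bic.upper()) is not None

print(is_well_formed_bic("DEUTDEFF"))     # True: 8-character BIC
print(is_well_formed_bic("DEUTDEFF500"))  # True: 11 characters with branch
print(is_well_formed_bic("XX1"))          # False: wrong layout
```

Part of the debate Giblin points to is that a messaging address like the BIC identifies organisations and branches rather than legal entities as such, which is one reason a new standard may be preferred.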

How have counterparty risk management concerns impacted the underlying data management systems within systemically important financial institutions? What level of maturity is the market at with regards to the management of this data?

These concerns reflect the wider worry about data quality and governance. Major investments are underway or planned at all the large institutions to deliver state-of-the-art, process-centric data management solutions. Counterparty data is seen as just another type of data that needs to be supported.

Are firms largely opting for a centralised approach towards dealing with this data or are the vertical silos across the different parts of an institution persisting?

What we see is that ‘just in case’ large, centralised databases of vendor data are giving way to ‘just in time’ process-driven data management. With this approach, data delivery to the required processes is a key part of the overall data management strategy, and only the data required by downstream systems is collected and cleansed.
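As a rough illustration of the ‘just in time’ idea (the feed, field names and process names below are all hypothetical), each downstream process declares the attributes it consumes, and only those attributes are fetched and cleansed on demand, rather than bulk-loading a full vendor feed ‘just in case’.

```python
# Hypothetical sketch: each downstream process declares the attributes it
# needs, and only those are pulled from the vendor feed and cleansed.

VENDOR_FEED = {  # stand-in for a vendor API returning many attributes
    "ENT-123": {"legal_name": " Example Bank plc ", "country": "gb",
                "rating": "A-", "sector": "Banking", "employees": "51000"},
}

PROCESS_REQUIREMENTS = {
    "credit_assessment": ["legal_name", "country", "rating"],
    "client_onboarding": ["legal_name", "country"],
}

def cleanse(field: str, value: str) -> str:
    """Trivial cleansing rules: trim whitespace, normalise country codes."""
    value = value.strip()
    return value.upper() if field == "country" else value

def fetch_for_process(process: str, entity_id: str) -> dict[str, str]:
    """Pull and cleanse only the fields this process actually consumes."""
    raw = VENDOR_FEED[entity_id]
    return {f: cleanse(f, raw[f]) for f in PROCESS_REQUIREMENTS[process]}

print(fetch_for_process("credit_assessment", "ENT-123"))
# {'legal_name': 'Example Bank plc', 'country': 'GB', 'rating': 'A-'}
```

The design choice is that the requirements map, not the vendor feed, drives what is stored: attributes no process consumes are never collected or cleansed at all.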

At the same time, if the challenge is handed to IT, localised warehouses will be created with a so-called enterprise-wide function. In practice, however, these are mostly fit for only one business purpose.

Is there a degree of disparity in these practices between the buy side and the sell side? Large and small firms?

The large buy side firms are just behind the sell side, while smaller firms are lagging further behind. We also expect to see more disparity between IT-led and business-led organisations.

What trends do you expect to see over 2011 in terms of market practices in this space?

What we expect to see is firms increasingly accepting that data management should be defined by business process rather than being database-led. A process-driven approach will enable financial institutions to reduce costs through a leaner infrastructure and also to save time, as the solution will be tailored to their unique data requirements.

