
Issuers Won’t Adopt Standards Without a Solid Business Case and Market-wide Agreement, Says Computershare’s Sarkar


The issuer community is wary of adopting new standards without a solid business case and before the rest of the industry has agreed on which standard is best, said Naz Sarkar, CEO of issuer agent firm Computershare Investor Services, at this month’s CorpActions 2011 conference. Sarkar defended the position of issuers, who are frequently criticised by the rest of the corporate actions community as a sticking point in the drive for greater straight through processing (STP) and market harmonisation, by arguing that if a standard has not reached a tipping point in adoption across the industry at large, any investment in it by an issuer could prove futile.

Sarkar, who frequently speaks on behalf of his issuer clients, highlighted a particular instance in which his own firm worked with a custodian to implement an automated vote confirmation process last year, only for the custodian to lose interest halfway through the project as costs mounted. “Any investment in automation or standards needs to be more than a fad,” he contended. “Where there’s real demand from the market for adoption, the issuer and issuer agent community will respond, but these projects need to bring efficiency and reduce risk for everyone in the market.”

Peter Swabey, company secretary at fellow issuer agent Equiniti, noted that issuer agents are the public face of the issuer community and, as such, must represent it in discussions such as these. “It can be hard to establish a clear business case that appeals to all issuers unless there are clear benefits for all,” he said.

John Clayton, director of product management at Euroclear, added: “Harmonisation and standardisation are both the right thing to do for the market but somebody has to pay for them. Cost is a massive stumbling block when it comes to building a business case.”

In response, Sarkar pointed to the high costs of certain standards-focused Euroclear projects that issuer agents have been required to swallow in recent years, without a significant cost benefit for the issuers or their agents. “The market needs to look at itself and decide upon the best course of action before it comes to the issuers,” he said.

He suggested that the industry should follow a four-step process in this regard: first, establish how much demand there is in the market for the standard and the appetite for adoption; second, assess the risks and benefits involved; third, articulate the efficiencies that could be gained; and finally, examine the costs and savings of the project.

Clayton noted, however, that the costs will not be the same for every issuer, as the adoption of ISO 20022 or XBRL, for example, will be much more costly for those that are not operating in an automated environment.

Sarkar suggested that the argument for XBRL has not been articulated properly to the market yet (an issue that Swift and the Depository Trust and Clearing Corporation are working on) and there is no consensus on the adoption of the new standard. He added: “Basic discussions around the additional level of risk in terms of the dissemination and reformatting of this data need to happen first.”

Swabey added that the manual rekeying of data into the XBRL format could therefore prove an extra risk for the issuer or its agent, as well as an extra cost. “This isn’t helped by the fact that not every domestic market wants to use message fields in the same way and market conventions vary across regions,” he said.

Sarkar suggested that this political agreement across markets on key data items and taxonomy needs to happen before any standard can have a chance of market-wide adoption. A data dictionary for corporate actions is therefore a first step towards achieving this market certainty and gaining issuer and issuer agent buy-in.
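For readers unfamiliar with the concept, the sketch below illustrates what a single entry in such a data dictionary might look like: each key data item paired with an agreed definition, format and permitted values. This is a minimal, purely illustrative example of our own; the class and field names (DataDictionaryEntry, item_name and so on) and the example codes are assumptions and are not drawn from any agreed industry taxonomy, ISO 20022 message definition or XBRL schema.

```python
# Purely illustrative sketch of one entry in a corporate actions data
# dictionary. All names, codes and values below are assumptions for
# illustration only, not an agreed industry taxonomy.

from dataclasses import dataclass, field


@dataclass
class DataDictionaryEntry:
    """One agreed data item: its name, meaning, format and permitted values."""
    item_name: str          # canonical name agreed across markets
    definition: str         # plain-language meaning of the item
    data_format: str        # expected representation (date, code, amount, ...)
    permitted_values: list = field(default_factory=list)  # closed code list, if any


# Hypothetical entry for an "event type" data item
event_type = DataDictionaryEntry(
    item_name="CorporateActionEventType",
    definition="The category of corporate action announced by the issuer.",
    data_format="4-character code",
    permitted_values=["DVCA", "RHTS", "MRGR"],  # illustrative codes only
)

if __name__ == "__main__":
    print(event_type)
```

The point of such an entry is that the definition, format and permitted values are agreed once, centrally, rather than each domestic market using the same fields in different ways, which is the divergence Swabey described.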

