T+2 = Data Standardisation on the Agenda for Europe

The gradual move towards T+2 settlement in Europe, ahead of the planned launch of the European Central Bank’s (ECB) Target2-Securities (T2S) platform in September 2014, is further raising the profile of data standards in areas such as standing settlement instructions (SSIs), legal entity identification and instrument identification. A recent webinar organised by post-trade specialist Omgeo highlighted buy side and sell side firms’ concerns about the move from T+3 to T+2 settlement, and data standardisation emerged as a key precondition for that move.

A-Team Insight has been keeping a close eye on developments around the T2S project over the last couple of years (see our roundup of its data management impacts from back in 2009), and although much has been said publicly about the impact of changes to the settlement environment on the corporate actions space, very few industry meetings have covered the impact on the wider reference data space. It was therefore reassuring to hear speakers including James Cunningham, vice president of European Market and Regulatory Initiatives at BNY Mellon, Terry van Praagh, senior vice president of Investment Operations Outsourcing for EMEA at Northern Trust, and John Gubert, chairman of the International Securities Market Advisory Group (ISMAG), discussing greater standardisation of reference data as a prerequisite for any changes to settlement cycles.

Obviously, given the wall of regulation facing the financial services industry as a whole at the moment, a move to T+2 over the next few years is not at the top of the investment priority list (as the panellists noted during the webinar). Northern Trust’s van Praagh highlighted the numerous drains on buy side firms’ resources caused by this barrage of regulatory requirements.

However, all panellists agreed that there is a valid argument behind the push towards T+2, even if it will put significant pressure on custodians and investment managers in the short term. After all, T2S will require a reduction in settlement cycles across Europe, and the move is in keeping with the regulatory drive to reduce counterparty risk in the market (although operational risk may rise in the short term, given the higher likelihood of settlement failures during the transition period).

Cunningham noted that a move to T+2 is possible, but a number of other items must be tackled first, including a move to T+1 settlement in the FX market. Another headline item is the standardisation of reference data. Data may not be a “sexy” word, as van Praagh noted, but it is an important one when attempting to move from old-style batch settlement to a more real-time environment. Logic dictates, for example, that for greater automation to be achieved, standardised formats must first be applied.

ISMAG’s Gubert listed a number of data standardisation items to be ticked off the ‘to do’ list before T+2 settlement can become a reality across Europe (a topic he has previously touched upon). As well as exploring the potential impact of increased data volumes on market infrastructures such as central securities depositories (CSDs), the industry will need to tackle SSI standards, client and counterparty identification, and standardised instrument identification, he contended. It is a list very much in keeping with developments in the US around the Office of Financial Research (OFR), and it should give the ECB’s data standards champion Francis Gross more ammunition for his campaign to keep a utility on the agenda.

As for the move to T+2 settlement itself, the European Commission’s regulatory proposals were initially due to be published in June, but publication is likely to slip to July or August. The proposals are unlikely to be finalised into a directive until mid-2012, at which point there should, hopefully, be some legislative certainty about deadlines. This gives the industry some breathing room, at least, to raise the standards issue during the consultation phase.
