Last week’s ISITC Europe meeting indicated that before any action can be taken to move to higher levels of electronic trade confirmation across the industry, a number of data standardisation challenges must be tackled. Swift’s Fabian Vandenreydt, who was promoted to the position of head of Securities and Treasury Markets earlier this year and is in charge of the network operator’s reference data initiatives, highlighted the data quality issues in the instrument and entity identification space, noting that if you “automate chaos, you just get faster chaos”.
Vandenreydt referred directly to the developments going on around the US Office of Financial Research (OFR) as something to watch with regards to tackling these underlying reference data challenges. Obviously, Swift is pitching itself as a European-friendly registration authority for the OFR, so Vandenreydt’s comments should be viewed in this light, but he championed the idea of Europe and the rest of the world taking coordinated action on data standardisation before moving on to tackle items such as trade confirmations further downstream.
The London Stock Exchange’s (LSE) head of post-trade services Kevin Milne added that in light of last year’s ‘flash crash’, participants in the financial markets are much more interested in finding out who is trading in which instruments. Legal entity identification is therefore very important in this environment due to the need for a forensic audit trail, which could be facilitated by increased standardisation of identifiers and greater automation of the trade confirmation process.
FIX Protocol’s global steering committee co-chair Jim Kaye, who is also product development manager for European execution services at Bank of America Merrill Lynch, noted that the Know Your Customer (KYC) regulations already in place mean that firms have to know who they are trading with, so internal codes for these counterparties are already available. It is only at a pan-industry level that this data causes a significant problem with regards to risk tracking, he indicated.
However, Vandenreydt and his fellow panellists noted that Europe is, in general (the UK being a notable exception), taking a far less prescriptive approach to regulation than the US and this may pose some problems with regards to global standardisation. Setting mandates at a global level is not easy if regulators do not have the teeth to follow up on non-compliance. Simon Bennett, senior securities consultant at HSBC, for example, pointed to the notable difference between the US Securities and Exchange Commission’s (SEC) more aggressive stance towards the market and that of most European regulators.
Vandenreydt aside, who was obviously in favour of the idea of a reference data utility, there was some scepticism about such a market infrastructure being established in a global context. Milne said that “for profit” operations tend to be more agile and able to keep up with industry developments, and Bennett seconded this notion with regards to “massive infrastructure”. Kaye added that “agile” standards are needed if any progress is to be made and this could be achieved by different providers of data talking to each other in a common language, or by allowing the same language to be used in multiple connections to these different providers.
Of course, this also indicates that Europe is somewhat out of step with the US in terms of the level of data standardisation discussions going on and regulatory action being taken. Dodd-Frank may contain prescriptive references to the introduction of legal entity identification standards and instrument identifiers for the purposes of tracking systemic risk, but European directives have yet to be translated into domestic regulation and are not nearly as prescriptive at the outset.
Turning back to the ISITC discussions, however, Omgeo’s executive director of global sales and head of EMEA Leigh Walters said that standardisation of the standing settlement instruction (SSI) space is also closely tied to the trade confirmation process, which is why his firm has been focused on a joined-up approach to these processes.
Milne noted that due to the “wafer thin” margins involved in trading, the dynamics of the cost/benefit analysis of carrying the costs of manual processes have changed significantly. This means that standardisation and automation are more important than ever before for those active in the markets if they are to continue in a similar vein in future.
As well as reference data items, panellists also noted that there are many other hurdles to electronic trade confirmation, such as cross-border trading within Europe and outside of it, increased volumes, keeping pace with product innovation and legal hurdles within certain jurisdictions. Walters noted that “volume and inertia” are the main reasons why electronic trade confirmation has not become the industry standard, an argument that could equally be applied to the adoption of data standards.
Vandenreydt championed the interoperability of standards as a hope for the future, but as noted by ISITC Europe CEO Graeme Austin, interoperability initiatives have a history of failure. It seems the operations community still needs convincing that reference data utilities and interoperability are the solution.