Yesterday’s panel session on a reference data utility was surprisingly well attended, given that it was the last session of the conference’s first day and drinks were already being served on the exhibition floor. In testament to the importance of the subject in an environment where everybody is waiting for the regulator’s hammer to fall, delegates listened attentively as the panellists debated the practicalities of establishing such a utility for instrument and entity data.
Reprising his usual argument for a “thin” utility, the European Central Bank’s head of external statistics Francis Gross championed the cause alongside EDM Council managing director Mike Atkin. Both discussed a potential role for Swift and the ISO process in the endeavour, as a market-neutral body that could represent its industry constituency. Perhaps next year, as well as having a main session on reference data, Swift will also have a speaker on the panel itself? Given its push into the legal entity identification space with the Bank Identifier Code (BIC), such representation would make sense.
Keen to distance these efforts from the failure of the International Business Entity Identifier (IBEI) several years ago, Atkin contended that a regulatory mandate would obviate the need for a business case to convince the industry to adopt entity identification standards.
However, Northern Trust’s senior vice president of operations and technology Kay Vicino warned that such regulation could also pose a threat to the industry if its impacts are not carefully considered beforehand. “Financial institutions really need to get engaged in the process in order to feed back to the regulators and ensure there is some level of clarity around exactly what is required in terms of reporting to the Office of Financial Research, for example,” she said.
Vicino stressed the potential costs of mapping to a whole host of new regulator-imposed entity and instrument identifiers. “We absolutely do not want to have lots of different reference data utilities to report to and pull data from; there should only be one. If there is more than one, the costs could be unsustainable,” she said.
But this also raises a contentious issue: where should such a utility be based? Would Europe be happy with a US-based utility? What are the legal implications of data transfer across jurisdictions?
Unsurprisingly, therefore, issues such as data privacy and liability cropped up during the debate. Meredith Gibson, Citi director and legal counsel, suggested that data sharing would open up a legislative can of worms if these issues were not dealt with at the outset.
Systemic risk itself, and the seemingly fluid definition of what constitutes it, is another point of concern for the industry, the practitioner panellists agreed. In many minds, the uncertainty over how regulators will monitor systemic risk is tied up with the concept of a utility, and firms are rightly concerned about the future. This adds the structure and operation of such a utility to the long list of regulatory unknowns in the pipeline.
Vicino set out a wish list for regulators to deliver on for such a utility: “It needs to provide global consistency and clarity. It needs to be pragmatic, so that the industry does not have to do a lot of duplicative work. Regulators also need to engage the industry at an early stage to understand the full impact and requirements.”
SIX Telekurs’ head of marketing Ivo Bieri added that a vendor could potentially be the facilitator of such a utility. Given that DTCC is already rumoured to be in talks with the powers that be about the Office of Financial Research, he may just be right.