The market fragmentation that has been a major by-product of the introduction of MiFID has resulted in a number of serious data-related issues, particularly in the OTC space, said Tom Davin, managing director of the Washington-based Financial Information Services Division (FISD) of the Software & Information Industry Association (SIIA), at the recent MiFID JWG meeting. Regulators are therefore tackling the post-trade space with a new focus on data quality and timeliness, but the current set of proposed data standards for instrument identification may not be sufficient, suggested Davin.
The regulatory focus on data quality should rightly fall on the OTC space rather than on regulated venues, said Davin. However, he has some reservations about the regulators' suggested method of identifying individual securities. Making ISO standards mandatory and using an instrument's ISIN, currency and trading venue data for identification may not be enough, he warned.
“FISD is supportive of the introduction of unique identifiers in order to be able to deal with trade cancellations and corrections, but there may be limitations to these metrics,” said Davin. He asked meeting attendees to consider any other metrics that could serve these purposes and to feed them back to industry associations and to the regulators themselves.
Davin also discussed the challenges inherent in consolidating European market data into a single tape. “Data consolidation already happens in this respect at a firm or vendor level, but the idea of establishing a consolidated tape across the industry needs to be carefully considered,” he said.
The Committee of European Securities Regulators (CESR) has two options for establishing such a source of consolidated trade data. It could create approved publishing arrangements and certify certain third-party vendors to publish data to the market. CESR would then need to maintain a list of these approved vendors. “It would also have to set certain standards for data quality and consistency, and it would have to ensure that each trade is only published once if it goes down this route,” explained Davin.
The cost and competitive implications of such a move should also be considered, as it would likely require the publication of some level of free data. Davin noted that it would also split pre- and post-trade data to some extent, but that the industry seems more receptive to the idea of approved publishing arrangements than before.
Alternatively, Europe could see the development of a US-style consolidated tape that would aim to be run at cost. However, Davin is concerned that there are a number of unanswered questions around who would run the facility, how it would operate and how much it would cost to set up. “It may be premature to introduce a consolidated tape, and the US market experience should be carefully examined in this respect. The latency issues and operational constraints of the US version should be explored. We need to be sure that we need it before we introduce it,” he warned.