The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

Barrage of New Regulation Could Cause Death by Data Drowning, Warns ISMAG’s Gubert

The slew of new regulations entering into force over the next few years, including a new iteration of MiFID, the multiple legislative changes resulting from the Dodd-Frank Act and the Securities Law Directive (to name just a few), will result in such an increase in reporting requirements that firms may find themselves in danger of death by data drowning, according to John Gubert, chairman of the International Securities Market Advisory Group (ISMAG). Speaking during an Omgeo-sponsored webinar, Gubert noted that even regulators will have difficulty sifting through and making sense of the data they are asking firms to provide for risk monitoring purposes.

The benefits of new regulation may be an increase in standardisation and transparency, but at what cost? Gubert highlighted three areas of particular concern within the regulatory arena that he believes may pose serious problems in the future: the increased complexity of the regulation itself; the volume of data that must be processed and reported to the regulator; and the increased potential for regulatory arbitrage.

The complexity of regulatory requirements is such that the reporting process and the underlying data management could become incredibly challenging. Regulators are becoming increasingly prescriptive about data requirements for items contained within regulatory reports, for example, and this is resulting in what Gubert calls a “huge change management process”; all at a time when pressures on cost are severe.

The data that must be produced to satisfy the regulators is held in multiple pools and multiple formats and is therefore not easy to standardise. However, this could also be said of the manner in which national regulators store and maintain this data. Gubert questioned the ability of the regulatory community to process all of this raw data in the manner required to accurately track risk across the market; regulators could also be at risk of death by data drowning.

The lack of a globally harmonised approach to the market will therefore further exacerbate this problem by raising the spectre of regulatory arbitrage. Conflicts between the new pan-European regulatory bodies and national regulators over their approach to data are also highly likely, given, for example, that transaction reporting under MiFID (mark one) is performed inconsistently across Europe.

Although it wasn’t discussed during the webinar, the Office of Financial Research is likely to become a proving ground for regulatory cooperation in the reference data space. It will be interesting to see how the rest of the world reacts to an initiative being led by one national regulator in this space. The fact that many aspects of the Dodd-Frank Act are not directly in sync with European legislation (see, for example, the different timescales for reporting to OTC derivatives repositories: real time versus one business day) does not bode well in the meantime.

Turning back to Gubert, he noted that many important aspects of risk in the market are, in fact, being largely ignored by the regulators, including the risks involved in corporate actions processing. He noted that the potential dangers of separating settlement from asset servicing via the introduction of the European Central Bank’s (ECB) Target2-Securities (T2S) settlement system have been overlooked by many in the regulatory community. Centralising settlement while retaining a fragmented environment for corporate actions processing could introduce more risk into the system in Europe, he warned.

Certainly, the impact of T2S on this space is a concern within the industry and the Corporate Actions Sub-group (CASG) for T2S is discussing some of these issues to feed back to the ECB as part of the overall project.

Gubert, however, is wary of the future as a result of the complex nature of this regulation. He indicated that he is expecting to see a range of potentially “very large fines” being meted out by the regulatory community over the next few years as a result. Given what has gone on over recent years, he may well be right.
