About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

How to Maximise Datasets Created by MiFID II

Subscribe to our newsletter

MiFID II generates about three trillion new data points, raising the question of how financial institutions will maximise their use of the new data sources created by the regulation. But how useful is the data six months into MiFID II, what challenges does it present, and will there be winners and losers among firms that can and can’t grasp the data and run with it?

The answers to these questions and more were discussed during a recent A-Team Group webinar that was moderated by A-Team editor, Sarah Underwood, and joined by Gaurav Bansal, director at RCloud Consulting and former MiFID II programme manager; Alex Wolcough, director at Appsbroker; and John Mason, Global Head of Regulatory & Market Structure Strategic Response and Propositions at Thomson Reuters.

Setting the scene for the webinar, an audience poll asking how organisations are using datasets created by MiFID II showed some 47% of respondents using the data to develop business opportunities, 38% to identify business opportunities, 34% to gain competitive edge and 28% purely for compliance. A further 28% said they are still considering how to use the data.

The webinar speakers noted that in their experience firms were moving beyond compliance to consider MiFID II data, particularly pre-trade data, for business purposes, although they also pointed out that these are early days in MiFID II implementation and scepticism remains about the quality of new data and how useful it is today.

Indeed, considering all the new data points, reference data fields, ISINs for OTC derivatives, and market data published by new trading venues and reporting mechanisms established by MiFID II, the data management challenges of using newly created data sources and datasets are many and varied. A second audience poll highlighted getting hold of the data and integrating it as the toughest tasks, ahead of poor data quality, poor data consistency and understanding the data.

Bansal noted problems of collecting, storing and managing the huge volumes of data generated by MiFID II, as well as reconciliation issues. Mason said challenges in the early days of fundamental change were not surprising and suggested firms struggling to source and manage new datasets could use aggregators such as Thomson Reuters.

Wolcough discussed the issues caused by Approved Publication Arrangements (APAs) charging fees at different rates for the data they publish during the 15 minutes before it is required to be made available free of charge. He noted that large firms with deep pockets can afford the data, but small firms may not be able to, a problem that could create winners and losers in financial markets.

With the challenges mastered, the speakers discussed how firms could maximise use of MiFID II datasets. Bansal talked about how combining more client and product data with data from trade execution venues could provide a powerful source of information for purposes such as risk modelling and better client outcomes. Mason noted the need to take data out of siloes and integrate it to maximise the potential of analytics across client, product and trade execution data, and link the data to other information such as news to develop more holistic trading strategies.

The benefits of MiFID II datasets? Significant for both business and operations according to a final audience poll. With the caveat of improved data quality, the speakers agreed, noting clear operational benefits, improved customer service, and the ability to apply emerging technologies such as robotic process automation and artificial intelligence to the data to achieve greater efficiencies and deliver deeper insight into customer behaviour and market activity.


Related content

WEBINAR

Recorded Webinar: Best practice approaches to data management for regulatory reporting

Effective regulatory reporting requires firms to manage vast amounts of data across multiple systems, regions, and regulatory jurisdictions. With increasing scrutiny from regulators and the rising complexity of financial instruments, the need for a streamlined and strategic approach to data management has never been greater. Financial institutions must ensure accuracy, consistency, and timeliness in their...

BLOG

Behavox’s Case for Explainable AI in Compliance – Trust, not Magic

When compliance teams hear “AI,” enthusiasm is often tempered by unease. The promise of automation runs up against the reality of black-box models, hallucinations, privacy risks, and the nagging question of whether the technology will meet regulators’ expectations. Behavox, a Montreal- and London-based RegTech, is seeking to bridge that trust gap with its latest release:...

EVENT

Eagle Alpha Alternative Data Conference, New York, hosted by A-Team Group

Now in its 8th year, the Eagle Alpha Alternative Data Conference, managed by A-Team Group, is the premier content forum and networking event for investment firms and hedge funds.

GUIDE

The DORA Implementation Playbook: A Practitioner’s Guide to Demonstrating Resilience Beyond the Deadline

The Digital Operational Resilience Act (DORA) has fundamentally reshaped the European Union’s financial regulatory landscape, with its full application beginning on January 17, 2025. This regulation goes beyond traditional risk management, explicitly acknowledging that digital incidents can threaten the stability of the entire financial system. As the deadline has passed, the focus is now shifting...