About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

How to Maximise Datasets Created by MiFID II

MiFID II generates around three trillion new data points, raising the question of how financial institutions will make the most of the data sources created by the regulation. Six months into MiFID II, how useful is the data, what challenges does it present, and will there be winners and losers among firms that can and cannot grasp the data and run with it?

The answers to these questions and more were discussed during a recent A-Team Group webinar that was moderated by A-Team editor, Sarah Underwood, and joined by Gaurav Bansal, director at RCloud Consulting and former MiFID II programme manager; Alex Wolcough, director at Appsbroker; and John Mason, Global Head of Regulatory & Market Structure Strategic Response and Propositions at Thomson Reuters.

Setting the scene for the webinar, an audience poll asking how organisations are using datasets created by MiFID II showed some 47% of respondents using the data to develop business opportunities, 38% to identify business opportunities, 34% to gain competitive edge and 28% purely for compliance. A further 28% said they are still considering how to use the data.

The webinar speakers noted that, in their experience, firms are moving beyond compliance and considering MiFID II data, particularly pre-trade data, for business purposes. They also pointed out that these are early days in MiFID II implementation, and scepticism remains about the quality of the new data and how useful it is today.

Indeed, considering all the new data points, reference data fields, ISINs for OTC derivatives, and market data published by new trading venues and reporting mechanisms established by MiFID II, the data management challenges of using newly created data sources and datasets are many and varied. A second audience poll highlighted getting hold of the data and integrating it as the toughest tasks, ahead of poor data quality, poor data consistency and understanding the data.

Bansal noted problems of collecting, storing and managing the huge volumes of data generated by MiFID II, as well as reconciliation issues. Mason said challenges in the early days of fundamental change were not surprising and suggested firms struggling to source and manage new datasets could use aggregators such as Thomson Reuters.

Wolcough discussed the issues caused by Approved Publication Arrangements (APAs) charging fees at different rates for the data they publish during the 15 minutes before it must be made available free of charge. He noted that large firms with deep pockets can afford the data, but small firms may not be able to, a problem that could create winners and losers in financial markets.

With the challenges mastered, the speakers discussed how firms could maximise use of MiFID II datasets. Bansal talked about how combining more client and product data with data from trade execution venues could provide a powerful source of information for purposes such as risk modelling and better client outcomes. Mason noted the need to take data out of siloes and integrate it to maximise the potential of analytics across client, product and trade execution data, and link the data to other information such as news to develop more holistic trading strategies.

The benefits of MiFID II datasets? Significant for both business and operations, according to a final audience poll. The speakers agreed, with the caveat that data quality must improve, noting clear operational benefits, improved customer service, and the ability to apply emerging technologies such as robotic process automation and artificial intelligence to the data to achieve greater efficiencies and deliver deeper insight into customer behaviour and market activity.
