About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Two Months to MiFID II: Are Financial Institutions at Risk of Eating Their Data Soup with a Fork?


By: Roy Kirby, senior product manager, SIX Financial Information

Markets in Financial Instruments Directive II (MiFID II) is just around the corner, and one issue that may have escaped firms’ attention is the practical question of how to exchange data with other market players. Come January, manufacturers will face the challenge of working out how to send data and documents to their retailers, and how to receive information on sales made outside their target market.

In facing this challenge, firms need to look at the whole data picture – from the regulator’s technical standards down to the business and IT requirements. Data needs to be structured, standardised across the entirety of the business and ready in time. Failure to achieve this could result in an inability to supply data to the market, which in turn could mean that manufacturers are unable to sell their products.

A problem for manufacturers

Why is this a problem for manufacturers? Simply put, they need to be able to get their products to market in compliance with MiFID II. To do this, connectivity, automated data exchange and automated data processing are required. If their distribution networks can’t easily and efficiently access the data they need, they risk compromising what they can offer to customers.

Unfortunately, many institutions may have underestimated the practical complexities of standardising data and documents in their distribution models. The industry has already made some moves towards standardisation, with several open templates designed to facilitate MiFID II information exchange materialising on the market over the summer. However, these are only the first steps on the road to distribution.

Firms that put one of these templates in place will then need to ‘translate’ it so it can speak the in-house IT language and fit into existing technical structures. This is especially important in terms of IT systems and data management rules. A template provides a useful outline; however, firms typically answer many basic data management questions completely differently. For example: are empty fields treated as empty or as zero values? Are decimal values separated by dots or commas? Is the in-house standard CSV or XML?

Once these questions are taken into account, some element of customisation becomes a key factor in getting a workable system in place – but as this means building and configuring a proprietary connection for every retailer who carries a manufacturer’s products, the volume of work required may escalate quickly.

The need for metadata

To complicate things further, once the data is mapped, it needs to be overlaid with a layer of metadata that follows the product to the point of sale, such as timestamps that can be archived for audit and, perhaps, prove a Key Information Document (KID) was provided before a sale took place. Target market and product suitability requirements also need to be aligned with entitlement metadata, so the right people receive the right data and documents in time, keeping product governance compliant.
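The metadata layer described above can be pictured as a small record that travels with each document: a delivery timestamp for the audit trail, plus an entitlement check against the target market. This is a hedged sketch under assumed names (`KidDelivery`, `entitled`, the placeholder ISIN and distributor ID are all illustrative, not a real schema).

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class KidDelivery:
    """Illustrative metadata attached to a KID on its way to the point of sale."""
    product_isin: str
    distributor_id: str
    target_market: set[str]  # e.g. {"retail", "professional"}
    # Timestamp captured at delivery, archived so the firm can later prove
    # the KID was provided before the sale took place.
    delivered_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc))

    def entitled(self, investor_type: str) -> bool:
        """Is this investor type inside the product's target market?"""
        return investor_type in self.target_market

# Placeholder identifiers for illustration only
record = KidDelivery("XS0000000000", "DIST-42", {"retail"})
print(record.entitled("retail"))  # → True
```

The point is not the schema itself but that every distributor in the chain must interpret fields like these identically for the audit trail and product governance to hold up.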

By approaching this problem separately for each distributor in their retail network, manufacturers may find themselves trying to eat a bowl of soup with a fork. The same applies to how retailers approach the data surrounding each product on their shelf. Finding ways to normalise the information at an industry scale, so it can be disseminated effectively without overwhelming cost burden and data management complexity, could prove to be the proverbial spoon. Manufacturers, distributors and their partners need to start talking to each other about how they will exchange data in a standardised, automated way.

It’s understandable that this has slipped under the radar for some manufacturers in the run up to MiFID II. They have been worrying about building systems to deal with generating KIDs for Packaged Retail Investment and Insurance-based Investment Products (PRIIPs) regulation or MiFID II data calculations. Firms should start thinking holistically about IT specifications and data mapping capabilities throughout their distribution chain, or face up to the opportunity cost of making it more difficult for retailers to sell their products.

If manufacturers can’t reach their markets with the necessary data and documents, it will be harder for the marketplace to get hold of their products. This means that, come MiFID II, consumer choice could end up being limited – a reduction in service that benefits neither firms nor investors.

However, in the midst of the scramble for regulatory preparedness, opportunities remain. Companies that address this issue in time may be able to enlarge their distribution footprint, giving them a competitive edge, while others struggle with the new requirements. Despite uncertainty, one thing remains clear: the MiFID II data challenge does not end when the January go-live begins.

