A-Team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

ECB’s Planned Reference Data Utility is Both Desirable and Plausible, Says EDM Council’s Atkin


Earlier this year, the European Central Bank (ECB) indicated it is seriously considering entering the reference data world via the establishment of a securities reference data utility. Since the announcement, the idea of a central bank-led utility has proved divisive. However, as Mike Atkin, managing director of the EDM Council, explains, the ECB is keen to act as a catalyst for change and will be working with, rather than against, the community to get the idea off the ground.

It was evident at last year’s FIMA conference in London in November that the seeds of the idea for a reference data utility were germinating. The ECB and its technology partner Finsoft explained the benefits of the bank’s Centralised Securities Database (CSDB) to the delegation and even suggested that it could be used as the basis for a business entity identification repository in the future. After all, the CSDB is broad in scope: it covers market data, instrument data, entity data and corporate actions, includes a comprehensive data dictionary, and encompasses 5.1 million individual records.

“A few years ago, the ECB recognised the problems of data integrity and comparability and built state-of-the-art securities reference and legal entity databases to support their objectives,” explains Atkin. The ECB (like all central banks) does a great deal of statistical analysis in support of its role in developing monetary policy, analysing event-driven policy requirements, providing near-term oversight of complexity and analysing systemic risk. In essence, financial information, or reference data, is needed to help the central bank drill down and link macro issues to specific micro issues, hence the development of the CSDB.

The ECB’s recognition of the importance of precise and comparable data thus led it to the concept of the reference data utility, adds Atkin. “They are acting as a catalyst because they understand the relationship. This is a simple notion to understand: trust and confidence in the underlying data starts with consistent semantics and unique legal entity identification. This is about standards. Once the standards are in place, the regulators, legislators and market authorities are in position to implement law to compel issuers to mark up their issuance documents using standard semantics and post them in a central repository,” he explains.

Consistent semantics, in Atkin’s view, relate to a clear agreed definition of the legal and contractual structure of financial instruments. Moreover, unique legal entity identification stands as the basis for linking issues to issuers to programmes, as well as the key to performing “single name exposure” analysis. The standardisation of this data via a repository means that the industry can have access to consistent data from the source, he contends. “Consistent data is an essential factor of input into complex analysis,” he adds.
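Atkin’s “single name exposure” point can be made concrete with a toy sketch (entirely illustrative; the identifiers and figures below are invented, not drawn from the CSDB): once every issue carries its issuer’s unique entity identifier, aggregating a firm’s exposure to a single legal entity reduces to a simple group-and-sum.

```python
# Hypothetical illustration of single-name exposure analysis.
# Without a shared identifier, the same issuer appears under several name
# variants and exposures cannot be reliably summed; keyed by a unique
# entity ID, the aggregation becomes trivial.
from collections import defaultdict

# Toy positions: (instrument, issuer_entity_id, market_value) -- invented data
positions = [
    ("BOND-A", "LEI-001", 5_000_000),
    ("BOND-B", "LEI-001", 3_000_000),   # same issuer, different issue
    ("EQUITY-C", "LEI-002", 2_000_000),
]

def single_name_exposure(positions):
    """Sum exposure per issuer, linking issues to issuers via the entity ID."""
    exposure = defaultdict(int)
    for _instrument, issuer_id, value in positions:
        exposure[issuer_id] += value
    return dict(exposure)

print(single_name_exposure(positions))
# {'LEI-001': 8000000, 'LEI-002': 2000000}
```

The point of the sketch is that the analytical step is trivial; the hard part, as Atkin argues, is the standardisation that makes the issuer key consistent in the first place.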

Atkin believes the creation of such a facility will benefit every corner of the market: “For financial institutions the benefit is higher quality data and reduced costs. For vendors the benefit is lower costs of collection and maintenance. For regulators the benefit is trust and confidence in the core factors of input into systemic analysis.”

The ECB’s decision to enter this space is largely driven by concerns over risk management and its ability to provide oversight of the complex financial services industry, says Atkin. In spite of the naysayers, he has been a vocal advocate of the plan in recent months and has focused on convincing the industry of the benefits of a central repository.

“It is absolutely plausible. The ECB is only acting as a catalyst in getting the utility concept understood and implemented. They, along with their colleagues in other major markets, are in position to push this concept forward. There are not that many obstacles to implementation. The standards are well along the path of development and just need a push to get them implemented. The central banks are going to be major participants in providing systemic oversight and are in position to support legislation (on issuers) and coordinate among those that can do implementation,” he explains.

Although it appeared at first glance that the ECB was planning to run the utility itself, Atkin assures that this is not the case. “The only ambition of the ECB is as a consumer of data. They understand the underlying relationships and are in position to advocate improvements in the chain of supply in this industry. The prime directive is to deliver quality data that users trust and have confidence in to be fit for purpose for modelling and analysis. That’s their aim,” he says.

Atkin reckons the regulators and legislators are precisely the right entities to move the objective forward, but indicates that the functions must be separated. “Legislators have the power to implement law. Without legal and regulatory compulsion, I fear the industry will not be able to overcome the difficulties unravelling and reconnecting systems, processes and organisational environments that are required to fix the data dilemma,” he contends.

The EDM Council has long campaigned for industry recognition of the value of the enterprise data management endeavour. In keeping with this repository plan, Atkin has blamed the industry’s lack of focus on the importance of data on what he calls “the evils of short term orientation, functionally myopic views and the very real challenges of organisational alignment”. The idea that regulatory compulsion is needed to force change has also been on his agenda for some time, but is unlikely to go down well with all corners of the market.

The second issue is implementation. “Neither the ECB nor any other regulator is positioned to manage implementation, nor do they want to – but they are in position to oversee implementation on behalf of the global industry,” contends Atkin. So although the ECB won’t be running the utility itself (a vendor partner will likely be chosen in the coming months, if one hasn’t been already), it will be in charge of overseeing progress.

This is another area where some corners of the securities industry may not quite see eye to eye with Atkin and the ECB. Exactly to what extent will the central bank be involved in the project? On what criteria will the vendor be chosen? Is a profit-driven model more likely to succeed? These are all questions likely to be posed to the central bank in the coming months and, hopefully, it will be able to come up with satisfactory answers…

In the meantime, Atkin is keen to stress that parties such as vendors should not feel threatened by the move. “Reference data is a necessary, mandatory, evil. It’s analogous to the tires on a car. No one buys a car for the tires, but you can’t sell cars without quality tires. Standardise the tires and let the industry compete on features and functionality. In my opinion this is a net gain for vendors. It reduces their costs of reference data collection and maintenance,” he says.

However, he is also aware of the bumps on the road ahead: “The biggest challenge is to educate legislators, regulators and market authorities on the importance of data management. Precise data content is the lifeblood of the financial industry. It is the raw material that feeds every trade, every client interaction, every business process and every form of analysis. As we all know, data content management is not an area that is well understood – so making the data connection to the lawmakers is the key.”

