The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

ECB’s Planned Reference Data Utility is Both Desirable and Plausible, Says EDM Council’s Atkin


Earlier this year, the European Central Bank (ECB) indicated it is seriously considering getting into the reference data world via the establishment of a securities reference data utility. Since the announcement, the idea of a central bank-led utility has proved a divisive subject for many. However, as Mike Atkin, managing director of the EDM Council, explains, the ECB is keen to act as a catalyst for change and will be working with, rather than against, the community to get the idea off the ground.

It was evident at last year’s FIMA conference in London in November that the seeds of the idea for a reference data utility were germinating. The ECB and its technology partner Finsoft explained the benefits of the bank’s Centralised Securities Database (CSDB) to delegates and even suggested that it could be used as the basis for a business entity identification repository in the future. After all, the CSDB is fairly comprehensive: it covers market data, instrument data, entity data, corporate actions and a comprehensive data dictionary, and encompasses 5.1 million individual records.

“A few years ago, the ECB recognised the problems of data integrity and comparability and built state of the art securities reference and legal entity databases to support their objectives,” explains Atkin. The ECB (like all central banks) does a lot of statistical analysis in support of its role in developing monetary policy, analysing event-driven policy requirements, providing near-term oversight of complexity and analysing systemic risk. In essence, financial information, or reference data, is needed to help the central bank drill down and link macro to specific micro issues, hence the development of the CSDB.

The ECB’s recognition of the importance of precise and comparable data thus led it to the concept of the reference data utility, adds Atkin. “They are acting as a catalyst because they understand the relationship. This is a simple notion to understand: trust and confidence in the underlying data starts with consistent semantics and unique legal entity identification. This is about standards. Once the standards are in place, the regulators, legislators and market authorities are in position to implement law to compel issuers to mark up their issuance documents using standard semantics and post them in a central repository,” he explains.

Consistent semantics, in Atkin’s view, relate to a clear agreed definition of the legal and contractual structure of financial instruments. Moreover, unique legal entity identification stands as the basis for linking issues to issuers to programmes, as well as the key to performing “single name exposure” analysis. The standardisation of this data via a repository means that the industry can have access to consistent data from the source, he contends. “Consistent data is an essential factor of input into complex analysis,” he adds.

Atkin believes the creation of such a facility will benefit every corner of the market: “For financial institutions the benefit is higher quality data and reduced costs. For vendors the benefit is lower costs of collection and maintenance. For regulators the benefit is trust and confidence in the core factors of input into systemic analysis.”

The decision by the ECB to enter this space is largely being driven by concerns over risk management and its ability to provide oversight over the complex financial services industry, says Atkin. In spite of the naysayers, he has been a vocal advocate of the plan in recent months and has been focused on convincing the industry of the benefits of a central repository.

“It is absolutely plausible. The ECB is only acting as a catalyst in getting the utility concept understood and implemented. They, along with their colleagues in other major markets, are in position to push this concept forward. There are not that many obstacles to implementation. The standards are well along the path of development and just need a push to get them implemented. The central banks are going to be major participants in providing systemic oversight and are in position to support legislation (on issuers) and coordinate among those that can do implementation,” he explains.

Although it appeared at first glance that the ECB was planning to run the utility itself, Atkin assures that this is not the case. “The only ambition of the ECB is as a consumer of data. They understand the underlying relationships and are in position to advocate improvements in the chain of supply in this industry. The prime directive is to deliver quality data that users trust and have confidence in to be fit for purpose for modelling and analysis. That’s their aim,” he says.

Atkin reckons the regulators and legislators are precisely the right entities to move the objective forward, but indicates that the functions must be separated. “Legislators have the power to implement law. Without legal and regulatory compulsion, I fear the industry will not be able to overcome the difficulties unravelling and reconnecting systems, processes and organisational environments that are required to fix the data dilemma,” he contends.

The EDM Council has long campaigned for industry recognition of the value of the enterprise data management endeavour. Much like this repository plan, Atkin has attributed the industry’s lack of focus on the importance of data to what he calls “the evils of short term orientation, functionally myopic views and the very real challenges of organisational alignment”. The idea that regulatory compulsion is needed to force change has also been on his agenda for some time, but is unlikely to go down well with all corners of the market.

The second issue is implementation. “Neither the ECB nor any other regulator is positioned to manage implementation, nor do they want to – but they are in position to oversee implementation on behalf of the global industry,” contends Atkin. Therefore although they won’t be running the utility themselves (a vendor partner will likely be chosen in the coming months, if they haven’t been already), the ECB will be in charge of overseeing progress.

This is another point on which some parts of the securities industry may not quite see eye to eye with Atkin and the ECB. Exactly to what extent will the central bank be involved in the project? On what criteria will the vendor be chosen? Is a profit-driven model more likely to succeed? These are all questions that are likely to be posed to the central bank in the coming months and, hopefully, it will be able to come up with satisfactory answers…

In the meantime, Atkin is keen to stress that parties such as vendors should not feel threatened by the move. “Reference data is a necessary, mandatory, evil. It’s analogous to the tires on a car. No one buys a car for the tires, but you can’t sell cars without quality tires. Standardise the tires and let the industry compete on features and functionality. In my opinion this is a net gain for vendors. It reduces their costs of reference data collection and maintenance,” he says.

However, he is also aware of the bumps on the road ahead: “The biggest challenge is to educate legislators, regulators and market authorities on the importance of data management. Precise data content is the lifeblood of the financial industry. It is the raw material that feeds every trade, every client interaction, every business process and every form of analysis. As we all know, data content management is not an area that is well understood – so making the data connection to the lawmakers is the key.”
