The knowledge platform for the financial technology industry

A-Team Insight Blogs

A Reference Data Review Reader Speaks Out About ECB’s Utility Proposals


The latter part of the FIMA 2009 conference in London this week was dominated by discussion among delegates of reference data standardisation and the European Central Bank’s (ECB) proposals for a data utility. A number of Reference Data Review readers talked to us about how the utility idea could be harnessed for the good of the industry. One such reader elaborates on why he thinks the proposals as they stand rest on a number of “myths” and recommends some alterations to ensure the “energy” behind them is put to good use.

The last day of FIMA featured a presentation by Francis Gross, head of the ECB’s external statistics division, on the vision for a “thin” reference data utility covering both instrument and entity data. This reader’s response refers directly to those proposals. The reader, a data practitioner at a bank who wishes to remain anonymous but is keen to provide useful feedback to the ECB, describes the utility initiative as “well intentioned”.

He understands the aim was to reduce costs across the financial services industry, improve quality and consistency, and provide reporting linkages, using an approach that has been successfully adopted by the automotive industry. However, he and a number of other readers have reservations about the research that has thus far gone into the initiative.

“The presentation was not well received at FIMA due to poor research and an approach based on myths and poorly informed assumptions,” says the reader. “The debate generated by this proposal could be a useful catalyst for standardisation where needed and appropriate to address genuine issues, however if implemented as proposed it will struggle to become useful and could very easily become another costly white elephant.”

The reader breaks his argument down into a list of myths and truths around the perceived problem that the utility is trying to solve:

Myth 1: Data vendors supply bad data and there is an endemic problem with data quality across the industry. This is based on evidence drawn from the ECB’s link-up with Fincore, which has built a multi-feed repository of instrument data for the central bank in recent years.

Truth 1: Different vendors have different strengths. Clients are attracted to the ones whose strengths suit their businesses. The ECB/Fincore reconciliation exercise incorrectly assumed that all vendors have equal service strengths and measured them accordingly.

Myth 2: Bad data is in the system. Banks are using ‘bad’ data without realising it and so the banking system is like a “central heating system about to explode”.

Truth 2: The banks have an acute understanding of data quality needs and have access to good quality, fit for purpose feeds and have systems and skilled staff to manage data quality levels. For example, if coupon rates were erroneous, banks would know this very well because their coupon accruals would need to be adjusted frequently.
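The reader’s coupon example works because accruals are recomputed daily from the stored rate, so a bad rate surfaces quickly as a mismatch against what is actually settled. A minimal sketch of that check (hypothetical bond figures and a simplified 30/360-style day count, not any bank’s actual process):

```python
def coupon_accrual(face_value, annual_coupon_rate, days_accrued, day_count_basis=360):
    """Accrued interest under a simple 30/360-style convention."""
    return face_value * annual_coupon_rate * days_accrued / day_count_basis

# A bond booked with an erroneous rate (5.2% instead of 5%) shows up
# quickly: the accrual drifts from the amount the paying agent settles.
expected = coupon_accrual(1_000_000, 0.05, 90)
booked = coupon_accrual(1_000_000, 0.052, 90)
print(round(booked - expected, 2))  # 500.0
```

The point of the illustration is the reader’s: because such reconciliations run constantly, erroneous terms-and-conditions data does not sit undetected in the system.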

Myth 3: All data needs to be standardised and the many variations that have proliferated are a result of poor control and governance.

Truth 3: Whilst this might hold true in some instances for certain types of data, in particular entity data, the proposed utility is instead aimed at instruments, which have very different drivers. Classifications are highly varied because they are used for different purposes for different investment strategies (this also holds true for regions). Common standards for classifications would be straightforward to define if all funds were index trackers. But investment managers can only differentiate their styles through the use of asymmetric data. To standardise this type of data would be detrimental to the industry and would fail; rather like trying to force all football teams to use identical team formations and tactics.

Myth 4: Instrument data quality issues are the underlying cause of trade processing STP issues.

Truth 4: Banks have been analysing STP ever since the T+1 push in 2001, and data quality is only very rarely the root cause of breaks. Instead, counterparty data is the primary data challenge in this area. For instruments, it is data availability and timeliness from issuers and numbering agencies that could be improved, but this relates to latency and turnaround times (which need to be very fast), not standards and quality.

In light of these myths and truths, the reader has a proposition to put to the ECB: “Putting ‘government’ command and control in place would add a further link in the chain that could result in delays, extra cost and inflexibility.

The aim of developing and adopting some basic core industry standards is absolutely the right problem to address, but this does not require building a new utility. If the ECB, JWG-IT and the EDM Council fully supported a standards-setting initiative, and used their combined network and influence to secure full participation by financial institutions, it could achieve the cost saving and risk reporting benefits over time much more economically.” He adds: “Alternatively, the vendors could be required to work together on defining, agreeing and supporting a number of core non-proprietary standards to facilitate the same outcome.”

The reader contends that the ECB has made a number of assumptions about the industry, which were evident from its reaction to feedback:

Assumption 1: Negative market resistance is due to vested interests.

Truth 1: Vendors would embrace common non-proprietary standards, as defined and demanded by data consumers, if they expected to remain in this business. The design and introduction of common standards does not require a new utility and the entire related infrastructure build.

Assumption 2: Data managers within banks are not reporting to chief operating officer (COO) level in enough organisations, so lack influence to escalate the issue.

Truth 2: If ‘the issue’ were well articulated, based on genuine business risks, and coupled with a compelling proposition, this would hit the C level very quickly and would be actioned appropriately.

Assumption 3: Nobody has a brain big enough to understand this, so it is best just to implement a utility and take it from there.

Truth 3: Many of the most experienced and capable industry infrastructure experts are highly focused in this area and are operating within data vendors, IT vendors and financial institutions themselves. The ECB does not appear to have made sufficient attempt to elicit their feedback to date.

The reader is therefore keen for a number of “true” problems to be addressed in the industry: “There is extensive overlap and duplication of data supply, content and process across the industry that is wasteful and is ripe for rationalisation. There are many different versions of asset description formats and different levels of shares in issue, P/E ratios and a multitude of different descriptions for dates, countries and currencies, for example.”
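The duplication the reader describes can be made concrete: each vendor feed encodes the same country, currency and date facts differently, so consumers must normalise before records can even be compared. A small sketch with hypothetical vendor values, assuming ISO 3166 and ISO 4217 codes as the common target (the alias tables are illustrative, not any vendor’s actual output):

```python
# Hypothetical examples of the vendor-to-vendor variation the reader
# describes: the same country or currency rendered several ways.
COUNTRY_ALIASES = {
    "UK": "GB", "GBR": "GB", "United Kingdom": "GB",
    "USA": "US", "U.S.": "US",
}

CURRENCY_ALIASES = {
    "Sterling": "GBP",
    "GBp": "GBX",  # pence-quoted prices, as some feeds supply them
}

def normalise(record):
    """Map assumed vendor variants onto ISO 3166 / ISO 4217 codes."""
    return {
        "country": COUNTRY_ALIASES.get(record["country"], record["country"]),
        "currency": CURRENCY_ALIASES.get(record["currency"], record["currency"]),
    }

vendor_a = {"country": "UK", "currency": "Sterling"}
vendor_b = {"country": "GB", "currency": "GBP"}

# After normalisation the two feeds describe the same instrument facts.
print(normalise(vendor_a) == normalise(vendor_b))  # True
```

Every consuming firm maintaining its own version of such mapping tables is precisely the wasteful duplication the reader argues is ripe for rationalisation.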

He also contends that proprietary vended data that is “baked in” to business processes creates inconsistencies and should be supplemented or replaced by compatible industry standards. Hence, the European Commission’s investigation of Thomson Reuters’ RIC codes is likely to represent a positive step forward in his eyes. “Providing an exit route from proprietary stickiness is a key area. Another pivotal area where regulators could help reduce noise and duplication would be to outlaw the licence restrictions imposed by some vendors over common data, which require data to be deleted from systems as a condition of contract termination,” he says.

“Defining core standards will be painful and slow but needs top level commitment and the best people,” he continues. “Implementation will be expensive, and the best chance of success is to piggyback on existing IT spending plans, rather than attempt to gain funding for a big bang.”

He also believes, as indicated by a number of FIMA audience members earlier this week, that the utility could only genuinely reduce the data quality validation burden and overhead by providing data quality liability guarantees.

The reader explains that data does not have the same focus in every institution: “Data is regarded as a problem by some organisations and as an asset by others. It is created and maintained for complex commercial reasons that need to be understood and considered.”

The biggest risk is therefore the misuse of data through ignorance and false assumptions. “The various ideas for data maturity models should address a deep understanding and awareness of the use of this data, as well as governance aspects,” he explains. The ECB can help to achieve a workable industry solution, but it needs to reassess its approach, according to the reader.

“The automotive industry analogy, whereby financial services is comparatively in the dark ages, does not hold true. It overlooks the diversity and complexity of the underlying business,” he says. The central bank therefore needs to come up with a different plan, and the reader reckons that leveraging and sharing the “extensive wealth of information” gained from the Fincore and ECB initiative would be a big step in the right direction.

