About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

EDM Council Releases Progress Update on Faulty Data Pilot and Details of Semantic Repository Project

It has been a busy month for the EDM Council. Not only has it released an interim report with IBM on the findings of its project to measure the performance implications of missing or faulty data on key functions within financial institutions, but it has also released details of plans for a new online semantic repository.

The report highlights the results of the data variance pilot, which has been designed to measure the data quality of 42 data elements needed for trade confirmation and has so far involved 51 fixed income securities across nine financial institutions.

“Our project with IBM is part of our core metrics initiative and it is one of the original objectives that the EDM Council was set up to achieve,” explains Mike Atkin, managing director of the EDM Council. “When the EDM Council was set up we talked about finding out the ROI and impact on an institution of investing in EDM. We have since organised metrics to measure the impact of EDM on institutions and the interim report is an update on that.”

The data elements studied in the report were drawn from interviews with 25 of the EDM Council’s member institutions, which were each asked to pinpoint the data attributes that were critical to the day to day running of their business.

“We came up with key performance indicators to measure the value of EDM. We conducted a data variance pilot with a number of firms to see where there was variance in terms of data across the institution. The data elements we looked at were supposed to be the same across the institution but we saw 20 to 30 per cent variance in those elements in our pilot. We saw the data gap that exists and the next step was to quantify the impact of the bad data on various business processes,” Atkin elaborates.

This pilot is based on the EDM Council’s analysis of data flows concerning nine core data attributes that feed calculations and may cause problems downstream: coupon rate, current factor, dated date, day count convention (method), first coupon date, maturity date, next call date, rating and valuation. Participants in the pilot were asked to provide values from their securities master database for analysis.

According to the report, the results confirmed a significant level of inconsistency, which varied as much as 30 per cent, on a number of critical data attributes. For example, there were 20 instances of coupon differences, 81 mismatches for issue date, plus many other elements where the discrepancy rate was initially expected to be much lower.
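The cross-firm comparison the pilot describes can be illustrated as a simple consistency check. The sketch below is hypothetical — the function name, the data shapes and the toy figures are illustrative assumptions, not the Council's actual methodology: each institution is assumed to supply its securities master values keyed by (security, attribute), and the mismatch rate per attribute is the share of securities on which institutions disagree.

```python
from collections import defaultdict

def variance_report(submissions):
    """Given {institution: {(security, attribute): value}}, return the
    fraction of securities, per attribute, where institutions disagree."""
    values_seen = defaultdict(set)
    for records in submissions.values():
        for key, value in records.items():
            values_seen[key].add(value)

    mismatches = defaultdict(int)
    totals = defaultdict(int)
    for (security, attribute), values in values_seen.items():
        totals[attribute] += 1
        if len(values) > 1:          # more than one distinct value = variance
            mismatches[attribute] += 1

    return {attr: mismatches[attr] / totals[attr] for attr in totals}

# Toy example: two firms, one bond, two of the pilot's attributes.
firm_a = {("BOND1", "coupon_rate"): 5.00, ("BOND1", "maturity_date"): "2030-06-01"}
firm_b = {("BOND1", "coupon_rate"): 5.25, ("BOND1", "maturity_date"): "2030-06-01"}
report = variance_report({"FirmA": firm_a, "FirmB": firm_b})
# The coupon rates disagree; the maturity dates match.
```

Run against real securities master extracts, a report of this shape would surface exactly the kind of 20–30 per cent discrepancy rates the pilot found.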

As a result of the findings, the group has decided to expand the pilot with regard to the number of firms involved, the instruments and types of security covered, and the number of data elements measured beyond the original 41. It is also looking at analysing the root causes of this variance and the potential costs of inconsistent data. “By using test scenarios you can actually calculate your value at risk and gain a quantified perspective on the impact of inaccurate data,” says Atkin.

As well as this ongoing pilot, the EDM Council has just released the details concerning the first draft of its semantic repository. The repository will identify all data attribute terms and definitions used by financial institutions for business processing, investment decisions, scenario modelling, risk management and portfolio valuation. The initial draft of the repository is due to be released at the end of June and it aims to tackle the problem of ambiguity within data terms used in the financial markets.

“We are talking about achieving consistency in the terms, meanings and definitions in the context of business reality for the data that sits in everybody’s master files. All we are talking about is really a dictionary of data terms in the market,” explains Atkin.

The EDM Council believes that the repository will reduce the amount of manual reconciliation required and streamline institutions’ business processes, as well as making it easier to create consistent benchmarks and perform risk analysis.

The project is under the direction of Mike Bennett, director of consultancy firm Hypercube, and has been designed as a collaborative publishing activity in the vein of internet encyclopaedia Wikipedia. It will therefore enable all interested participants to edit or add to an entry on their subjects of expertise.

It will be hosted on the EDM Council’s website and will be divided into two areas: material that is under review and material that has been validated by industry experts. The EDM Council hopes that the repository will be used as a common set of business terms by financial institutions, vendors and regulators to reduce the amount of mapping required during data transference.

“So far we have got the project funded and initiated, and we have used all the source material that we have access to in order to come up with a draft repository. We used about six or seven source documents, from which we compared and normalised the data to come up with our own definitions. We are now asking people to comment on and change the entries in the draft, with a full audit trail. We are also setting up subject matter expert groups to look through the amendments and get agreement on the definitions,” says Atkin.
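The workflow Atkin describes — editable entries, a full audit trail, and expert sign-off moving material from the under-review area to the validated one — can be sketched as a minimal data model. Everything here is hypothetical: the class, field names and example entries are illustrative assumptions about how such a repository might be structured, not the EDM Council's actual design.

```python
from dataclasses import dataclass, field

@dataclass
class TermEntry:
    """One entry in a hypothetical semantic repository: a business term,
    its working definition, a review status and a full audit trail."""
    term: str
    definition: str
    status: str = "under_review"              # or "validated"
    audit_trail: list = field(default_factory=list)

    def amend(self, editor, new_definition):
        # Record who changed what, preserving the prior definition.
        self.audit_trail.append((editor, self.definition, new_definition))
        self.definition = new_definition

    def validate(self, expert_group):
        # A subject matter expert group signs off; the entry moves
        # from the under-review area to the validated area.
        self.audit_trail.append((expert_group, "validated"))
        self.status = "validated"

entry = TermEntry("coupon rate", "Annual interest rate paid on a bond.")
entry.amend("contributor", "Stated annual rate of interest paid by a bond's issuer.")
entry.validate("fixed income expert group")
```

The Wikipedia-style openness lives in `amend` (anyone may propose a change, but nothing is overwritten silently), while `validate` captures the two-area split between draft and expert-approved material.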
