The knowledge platform for the financial technology industry

A-Team Insight Blogs

EDM Council Takes Baby Steps Towards Ontology and Metrics


The EDM Council has recently taken steps towards achieving two key requirements for industry-wide implementation of enterprise data management – a standardised data ontology and data quality metrics.

Members of the EDM Council are participating in a working group formed by the DTCC to define and standardise the data terms and definitions included in its New Issue Information Dissemination Service (NIIDS).

According to the council, standardisation of the nomenclature and definitions of financial attributes, along with understanding their practical business relationships, is a “foundational” requirement for EDM. It says its role in this project is to ensure alignment between the DTCC initiative and ISO’s Standard Data Model activity – ISO 19312 – under way for some time (Reference Data Review, October 2006).

Mike Atkin, managing director of the EDM Council, applauds the DTCC’s recognition of the value of standardising nomenclature, but says the council feels there is a “disconnection between the ISO proposal and the practical requirements of the financial institutions”.

“We don’t want to do any work that we don’t have to,” he says, “and the good news is that all the previous work has not been for nought: the work of ISO on the standard data model, and on MDDL, and the glossary initiative et cetera is all good raw material. But the industry is not looking for an XML schema or a UML data model. It is looking for a ‘data dictionary’ – of business terms, not tags, giving precision around data meaning – which can then be used as a common key internally, and from which XML schemas and data models and messaging can be generated.”

What the industry wants, Atkin says, is “a consistent data ontology – terms and definitions in the context of business use”. “All firms currently have, or are working to create, a consistent ‘data dictionary’ within their own organisations – common terminology and definitions. The goal is to ensure the definitions they are using internally are consistent with the terms used externally.”
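
The idea of a data dictionary as a common key, from which tags and schemas are derived rather than the other way round, can be sketched as follows. This is a minimal illustration, not the council's or ISO's actual model; the class, field names and sample term are all hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class DictionaryTerm:
    """One entry in a shared business data dictionary (illustrative only)."""
    term: str                # business-language name, e.g. "maturity date"
    definition: str          # precise, agreed meaning of the term
    internal_keys: dict = field(default_factory=dict)  # system -> local field name

    def xml_tag(self) -> str:
        # A schema tag can be generated from the business term, so the
        # dictionary, not the tag set, remains the point of agreement.
        return "".join(part.capitalize() for part in self.term.split())

maturity = DictionaryTerm(
    term="maturity date",
    definition="Date on which the principal of a debt instrument is repaid.",
    internal_keys={"settlement_system": "mat_dt", "risk_system": "maturity"},
)
print(maturity.xml_tag())  # MaturityDate
```

The point of the sketch is the direction of derivation: internal systems map their local field names onto the shared term, and message formats are generated from it, which is how a single dictionary can serve as the "common key" Atkin describes.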

Atkin accepts that what has been completed “is a specific, discrete activity with DTCC for the DTCC NIIDS feed”. “While this effort has industry-wide application its scope remains limited to the DTCC feed. That’s why we are anxious to leverage it more broadly. We are trying to figure out how to get ISO engaged in the taxonomy objectives – but if we have trouble making the case, there are other options worth pursuing. XBRL, for example, has the capability to become the repository and is interested in the activity.”

The council now intends to confirm data ontology objectives with its members, and then to present its findings to ISO, Atkin says. “We expect ISO to be receptive, but if they are unwilling to modify their 19312 approach we can and will look at other options.”

On the metrics front, the council has taken another “baby step” with the completion of its first data quality metrics pilot – designed to help firms create a fact-based data quality baseline to support the business case for EDM. The research – facilitated by council sponsor IBM Global Business Services – focused on 42 core security description and derived data elements for trade matching and confirmations from 13 firms.

According to Atkin, the level of data variance established was surprisingly high. Data discrepancies ranged between four and 30 per cent and included missing issues, missing data elements, inconsistent coding and mismatching data values.
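
The kinds of discrepancy the pilot counted, missing data elements, inconsistent coding and mismatching values, can be illustrated with a minimal comparison sketch. The field names, sample values and resulting rate below are hypothetical and are not taken from the pilot itself.

```python
def discrepancy_rate(records, reference):
    """Fraction of attribute checks that fail against a reference record,
    counting both missing elements and mismatching values (illustrative)."""
    checks = mismatches = 0
    for rec in records:
        for attr, ref_value in reference.items():
            checks += 1
            value = rec.get(attr)
            if value is None or value != ref_value:
                mismatches += 1
    return mismatches / checks

# Hypothetical security description held by three firms:
reference = {"isin": "XS0000000001", "coupon": 5.25, "currency": "EUR"}
firm_records = [
    {"isin": "XS0000000001", "coupon": 5.25, "currency": "EUR"},   # clean
    {"isin": "XS0000000001", "coupon": 5.25},                      # missing element
    {"isin": "XS0000000001", "coupon": 5.125, "currency": "EUR"},  # mismatching value
]
print(round(discrepancy_rate(firm_records, reference), 2))  # 0.22
```

A baseline of this shape is what lets a firm state its data quality as a number, which is the "fact-based" element of the business case the council describes.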

While he admits that what the council has achieved with the pilot “is not a panacea”, he says he and council members “are pretty pleased”.

“We have proved that we have a legitimate methodology, we have gained the confidence of all the different parties, we have created a process that looks extensible, in an area that everybody wants to pursue.” Metrics cannot be solved “in one fell swoop”, he continues – “it’s a lifetime activity” – but the council is now in a position to expand the activity.

“We have a number of options on where the members take this research next including expansion of the number of instruments, extension to other security types and the addition of more data attributes. We are also talking about root cause analysis and translating the findings into their performance implications to determine exactly how bad data affects downstream data processing.”

Atkin is also keen to move the discussion beyond “negative metrics”. “I also want us to talk about the positive metrics – what I call the golden ring of EDM – metrics to support better cross-selling, better product engineering, better revenue generation. Ultimately firms want to take advantage of clean data to grow their businesses,” he says.

