
EDM Council Takes Baby Steps Towards Ontology and Metrics


The EDM Council has recently taken steps towards achieving two key requirements for industry-wide implementation of enterprise data management – a standardised data ontology and data quality metrics.

Members of the EDM Council are participating in a working group formed by the DTCC to define and standardise the data terms and definitions included in its New Issue Information Dissemination Service (NIIDS).

According to the council, standardisation of the nomenclature and definitions of financial attributes, along with understanding their practical business relationships, is a “foundational” requirement for EDM. It says its role in this project is to ensure alignment between the DTCC initiative and ISO’s Standard Data Model activity – ISO 19312 – which has been under way for some time (Reference Data Review, October 2006).

Mike Atkin, managing director of the EDM Council, applauds the DTCC’s recognition of the value of standardising nomenclature, but says the council feels there is a “disconnection between the ISO proposal and the practical requirements of the financial institutions”.

“We don’t want to do any work that we don’t have to,” he says, “and the good news is that all the previous work has not been for nought: the work of ISO on the standard data model, and on MDDL, and the glossary initiative et cetera is all good raw material. But the industry is not looking for an XML schema or a UML data model. It is looking for a ‘data dictionary’ – of business terms, not tags, giving precision around data meaning – which can then be used as a common key internally, and from which XML schemas and data models and messaging can be generated.”
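The distinction Atkin draws – business terms first, with tags and schemas derived from them – can be illustrated with a short sketch. The dictionary entries, field names and type choices below are invented for illustration only and are not drawn from the DTCC or ISO work:

```python
from dataclasses import dataclass

@dataclass
class DictionaryEntry:
    """A business term: the meaning comes first; tags are derived from it."""
    term: str        # business name, e.g. "Maturity Date"
    definition: str  # precise business meaning
    datatype: str    # XML Schema built-in type to emit when generating a schema

# Hypothetical entries; a real dictionary would carry far more business context.
DICTIONARY = [
    DictionaryEntry("Maturity Date",
                    "Date on which the principal of a debt instrument is repaid",
                    "xs:date"),
    DictionaryEntry("Coupon Rate",
                    "Annual interest rate paid on a debt instrument's face value",
                    "xs:decimal"),
]

def to_xsd(entries) -> str:
    """Derive an XML schema fragment from the dictionary: tags follow terms."""
    lines = ['<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">']
    for e in entries:
        tag = e.term.replace(" ", "")  # "Maturity Date" -> "MaturityDate"
        lines.append(f'  <xs:element name="{tag}" type="{e.datatype}">')
        lines.append(f'    <xs:annotation><xs:documentation>{e.definition}'
                     f'</xs:documentation></xs:annotation>')
        lines.append('  </xs:element>')
    lines.append('</xs:schema>')
    return "\n".join(lines)

print(to_xsd(DICTIONARY))
```

The point of the ordering is that the schema is disposable output: regenerate it, or a UML model, or message definitions, from the same agreed terms.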

What the industry wants, Atkin says, is “a consistent data ontology – terms and definitions in the context of business use”. “All firms currently have, or are working to create, a consistent ‘data dictionary’ within their own organisations – common terminology and definitions. The goal is to ensure the definitions they are using internally are consistent with the terms used externally.”
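Keeping internal definitions consistent with external terms is, in practice, a mapping and reconciliation exercise. A minimal standalone sketch, with invented internal field names and dictionary terms:

```python
# Hypothetical mapping from one firm's internal field names to shared
# dictionary terms; each firm would maintain its own version of this.
INTERNAL_TO_STANDARD = {
    "mat_dt": "Maturity Date",
    "cpn_pct": "Coupon Rate",
}

def unaligned_fields(internal_fields, mapping, standard_terms):
    """Return internal fields with no agreed external definition."""
    return [f for f in internal_fields if mapping.get(f) not in standard_terms]

standard_terms = {"Maturity Date", "Coupon Rate"}
print(unaligned_fields(["mat_dt", "cpn_pct", "issr_nm"],
                       INTERNAL_TO_STANDARD, standard_terms))
# -> ['issr_nm']: a field still lacking a counterpart in the shared dictionary
```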

Atkin accepts that what has been completed “is a specific, discrete activity with DTCC for the DTCC NIIDS feed”. “While this effort has industry-wide application, its scope remains limited to the DTCC feed. That’s why we are anxious to leverage it more broadly. We are trying to figure out how to get ISO engaged in the taxonomy objectives – but if we have trouble making the case, there are other options worth pursuing. XBRL, for example, has the capability to become the repository and is interested in the activity.”

The council’s intention now is to confirm data ontology objectives with its members, and then to present its findings to ISO, Atkin says. “We expect ISO to be receptive, but if they are unwilling to modify their 19312 approach we can and will look at other options.”

On the metrics front, the council has taken another “baby step” with the completion of its first data quality metrics pilot – designed to help firms create a fact-based data quality baseline to support the business case for EDM. The research – facilitated by council sponsor IBM Global Business Services – focused on 42 core security description and derived data elements for trade matching and confirmations from 13 firms.

According to Atkin, the level of data variance established was surprisingly high. Data discrepancies ranged between four and 30 per cent and included missing issues, missing data elements, inconsistent coding and mismatching data values.
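The kind of cross-firm comparison behind those headline numbers can be reproduced in spirit with a simple discrepancy count. The securities, values and consensus rule below are invented for illustration; the article does not publish the council’s actual methodology:

```python
from collections import Counter

# Hypothetical submissions: each firm's value for one data element of one
# security, with None marking a missing data element.
submissions = {
    "US0000000001": {"firm_a": "4.25", "firm_b": "4.25", "firm_c": None},
    "US0000000002": {"firm_a": "5.00", "firm_b": "5.10", "firm_c": "5.00"},
}

def discrepancy_rate(subs) -> float:
    """Share of firm values that are missing or disagree with the modal value."""
    flagged = total = 0
    for values in subs.values():
        present = [v for v in values.values() if v is not None]
        consensus = Counter(present).most_common(1)[0][0] if present else None
        for v in values.values():
            total += 1
            if v is None or v != consensus:
                flagged += 1
    return flagged / total

print(f"{discrepancy_rate(submissions):.0%}")  # 33% for the toy data above
```

Run per data element, a check like this yields the per-attribute variance figures a baseline is built from.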

While he admits that what the council has achieved with the pilot “is not a panacea”, he says he and council members “are pretty pleased”.

“We have proved that we have a legitimate methodology, we have gained the confidence of all the different parties, we have created a process that looks extensible, in an area that everybody wants to pursue.” Metrics cannot be solved “in one fell swoop”, he continues – “it’s a lifetime activity” – but the council is now in a position to expand the activity.

“We have a number of options on where the members take this research next including expansion of the number of instruments, extension to other security types and the addition of more data attributes. We are also talking about root cause analysis and translating the findings into their performance implications to determine exactly how bad data affects downstream data processing.”

Atkin is also keen to move the discussion beyond “negative metrics”. “I also want us to talk about the positive metrics – what I call the golden ring of EDM – metrics to support better cross-selling, better product engineering, better revenue generation. Ultimately firms want to take advantage of clean data to grow their businesses,” he says.
