A-Team Insight Blogs

EDM Council Takes Baby Steps Towards Ontology and Metrics

The EDM Council has recently taken steps towards achieving two key requirements for industry-wide implementation of enterprise data management – a standardised data ontology and data quality metrics.

Members of the EDM Council are participating in a working group formed by the DTCC to define and standardise the data terms and definitions included in its New Issue Information Dissemination Service (NIIDS).

According to the council, standardisation of the nomenclature and definitions of financial attributes, along with understanding their practical business relationships, is a “foundational” requirement for EDM. It says its role in this project is to ensure alignment between the DTCC initiative and ISO’s Standard Data Model activity – ISO 19312 – which has been under way for some time (Reference Data Review, October 2006).

Mike Atkin, managing director of the EDM Council, applauds the DTCC’s recognition of the value of standardising nomenclature, but says the council feels there is a “disconnection between the ISO proposal and the practical requirements of the financial institutions”.

“We don’t want to do any work that we don’t have to,” he says, “and the good news is that all the previous work has not been for nought: the work of ISO on the standard data model, and on MDDL, and the glossary initiative et cetera is all good raw material. But the industry is not looking for an XML schema or a UML data model. It is looking for a ‘data dictionary’ – of business terms, not tags, giving precision around data meaning – which can then be used as a common key internally, and from which XML schemas and data models and messaging can be generated.”
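By way of illustration only, the sketch below shows one way such a business-term dictionary might be represented and an XML schema fragment generated from it. The terms, definitions and code are hypothetical and are not drawn from NIIDS, MDDL or ISO 19312.

```python
from xml.sax.saxutils import escape

# Hypothetical dictionary entries keyed by business term rather than message tag;
# the terms, definitions and types are invented for illustration.
data_dictionary = {
    "IssueDate": {
        "definition": "The date on which the security is first offered for sale.",
        "type": "xs:date",
    },
    "CouponRate": {
        "definition": "The annual interest rate paid on the security's face value.",
        "type": "xs:decimal",
    },
}

def to_xsd_element(term, entry):
    """Derive a simple XML schema element declaration from a dictionary entry."""
    return (
        '<xs:element name="{0}" type="{1}">\n'
        "  <xs:annotation>\n"
        "    <xs:documentation>{2}</xs:documentation>\n"
        "  </xs:annotation>\n"
        "</xs:element>"
    ).format(term, entry["type"], escape(entry["definition"]))

for term, entry in data_dictionary.items():
    print(to_xsd_element(term, entry))
```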

What the industry wants, Atkin says, is “a consistent data ontology – terms and definitions in the context of business use”. “All firms currently have, or are working to create, a consistent ‘data dictionary’ within their own organisations – common terminology and definitions. The goal is to ensure the definitions they are using internally are consistent with the terms used externally.”
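Purely as a hypothetical sketch of the internal-to-external alignment Atkin describes, the snippet below re-keys firm-specific field names onto shared business terms; the firm names, field names and terms are invented for illustration.

```python
# Each firm keeps its own internal field names but maps them onto a shared
# business term, so definitions line up across organisations (illustrative only).
internal_to_shared = {
    "firm_a": {"mat_dt": "MaturityDate", "cpn_pct": "CouponRate"},
    "firm_b": {"maturity": "MaturityDate", "coupon_rate": "CouponRate"},
}

def normalise(firm, record):
    """Re-key a firm's internal record onto the shared business terms."""
    mapping = internal_to_shared[firm]
    return {mapping[field]: value for field, value in record.items() if field in mapping}

# Both firms end up with {'MaturityDate': '2030-06-01', 'CouponRate': '5.25'}
print(normalise("firm_a", {"mat_dt": "2030-06-01", "cpn_pct": "5.25"}))
print(normalise("firm_b", {"maturity": "2030-06-01", "coupon_rate": "5.25"}))
```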

Atkin accepts that what has been completed “is a specific, discrete activity with DTCC for the DTCC NIIDS feed”. “While this effort has industry-wide application its scope remains limited to the DTCC feed. That’s why we are anxious to leverage it more broadly. We are trying to figure out how to get ISO engaged in the taxonomy objectives – but if we have trouble making the case, there are other options worth pursuing. XBRL, for example, has the capability to become the repository and is interested in the activity.”

The council’s intention now is to confirm data ontology objectives with its members, and then to present its findings to ISO, Atkin says. “We expect ISO to be receptive, but if they are unwilling to modify their 19312 approach we can and will look at other options.”

On the metrics front, the council has taken another “baby step” with the completion of its first data quality metrics pilot – designed to help firms create a fact-based data quality baseline to support the business case for EDM. The research – facilitated by council sponsor IBM Global Business Services – focused on 42 core security description and derived data elements for trade matching and confirmations from 13 firms.

According to Atkin, the level of data variance established was surprisingly high. Data discrepancies ranged between four and 30 per cent and included missing issues, missing data elements, inconsistent coding and mismatching data values.
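As a rough, hypothetical sketch of how such a per-attribute discrepancy rate might be computed across firms' records (not the pilot's actual data or methodology), the following counts an attribute as discrepant for a given security if any firm is missing the value or if firms disagree on it.

```python
from collections import defaultdict

# firm -> security identifier -> {attribute: value}; records invented for illustration
firm_records = {
    "firm_a": {"XS0001": {"coupon": "5.25", "maturity": "2030-06-01"}},
    "firm_b": {"XS0001": {"coupon": "5.25", "maturity": "2030-07-01"}},
    "firm_c": {"XS0001": {"coupon": "5.25"}},  # maturity element missing
}

def discrepancy_rates(firm_records):
    """Share of securities, per attribute, where firms' values are missing or disagree."""
    securities = set().union(*(recs.keys() for recs in firm_records.values()))
    attributes = set()
    for recs in firm_records.values():
        for fields in recs.values():
            attributes.update(fields)

    mismatches = defaultdict(int)
    for sec in securities:
        for attr in attributes:
            values = [recs.get(sec, {}).get(attr) for recs in firm_records.values()]
            # Missing issues, missing data elements and mismatching values all count.
            if None in values or len(set(values)) > 1:
                mismatches[attr] += 1
    return {attr: mismatches[attr] / len(securities) for attr in attributes}

print(discrepancy_rates(firm_records))  # e.g. {'coupon': 0.0, 'maturity': 1.0}
```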

While he admits that what the council has achieved with the pilot “is not a panacea”, he says he and council members “are pretty pleased”.

“We have proved that we have a legitimate methodology, we have gained the confidence of all the different parties, we have created a process that looks extensible, in an area that everybody wants to pursue.” Metrics cannot be solved “in one fell swoop”, he continues – “it’s a lifetime activity” – but the council is now in a position to expand the activity.

“We have a number of options on where the members take this research next, including expansion of the number of instruments, extension to other security types and the addition of more data attributes. We are also talking about root cause analysis and translating the findings into their performance implications to determine exactly how bad data affects downstream data processing.”

Atkin is also keen to move the discussion beyond “negative metrics”. “I also want us to talk about the positive metrics – what I call the golden ring of EDM – metrics to support better cross-selling, better product engineering, better revenue generation. Ultimately firms want to take advantage of clean data to grow their businesses,” he says.
