The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

EDM Council Takes Baby Steps Towards Ontology and Metrics


The EDM Council has recently taken steps towards achieving two key requirements for industry-wide implementation of enterprise data management – a standardised data ontology and data quality metrics.

Members of the EDM Council are participating in a working group formed by the DTCC to define and standardise the data terms and definitions included in its New Issue Information Dissemination Service (NIIDS).

According to the council, standardisation of the nomenclature and definitions of financial attributes, along with understanding their practical business relationships, is a “foundational” requirement for EDM. It says its role in this project is to ensure alignment between the DTCC initiative and ISO’s Standard Data Model activity – ISO 19312 – which has been under way for some time (Reference Data Review, October 2006).

Mike Atkin, managing director of the EDM Council, applauds the DTCC’s recognition of the value of standardising nomenclature, but says the council feels there is a “disconnection between the ISO proposal and the practical requirements of the financial institutions”.

“We don’t want to do any work that we don’t have to,” he says, “and the good news is that all the previous work has not been for nought: the work of ISO on the standard data model, and on MDDL, and the glossary initiative et cetera is all good raw material. But the industry is not looking for an XML schema or a UML data model. It is looking for a ‘data dictionary’ – of business terms, not tags, giving precision around data meaning – which can then be used as a common key internally, and from which XML schemas and data models and messaging can be generated.”

What the industry wants, Atkin says, is “a consistent data ontology – terms and definitions in the context of business use”. “All firms currently have, or are working to create, a consistent ‘data dictionary’ within their own organisations – common terminology and definitions. The goal is to ensure the definitions they are using internally are consistent with the terms used externally.”
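To make the idea concrete in loose terms – this is an illustrative sketch, not the council’s or ISO’s specification, and all term names are invented – a “data dictionary” of business terms and definitions can serve as the common key from which machine formats such as XML schemas are then generated:

```python
# Illustrative sketch: a hypothetical data dictionary of business terms
# (names and definitions invented for this example), used as the single
# source from which an XML fragment is derived.
from xml.etree import ElementTree as ET

DATA_DICTIONARY = {
    "maturity_date": "The date on which the principal of a debt instrument becomes due.",
    "coupon_rate": "The stated annual interest rate paid by a debt instrument.",
    "issuer_name": "The legal name of the entity issuing the security.",
}

def to_xml_stub(dictionary):
    """Generate a minimal XML fragment whose element names come from the
    business terms, showing how markup could flow from the dictionary."""
    root = ET.Element("security")
    for term, definition in dictionary.items():
        element = ET.SubElement(root, term)
        element.set("definition", definition)
    return ET.tostring(root, encoding="unicode")

print(to_xml_stub(DATA_DICTIONARY))
```

The point of the sketch is the direction of dependency: the business terms and their definitions come first, and tags, schemas and messages are derived outputs rather than the primary artefact.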

Atkin accepts that what has been completed “is a specific, discrete activity with DTCC for the DTCC NIIDS feed”. “While this effort has industry-wide application, its scope remains limited to the DTCC feed. That’s why we are anxious to leverage it more broadly. We are trying to figure out how to get ISO engaged in the taxonomy objectives – but if we have trouble making the case, there are other options worth pursuing. XBRL, for example, has the capability to become the repository and is interested in the activity.”

The council’s intention now is to confirm data ontology objectives with its members and then present its findings to ISO, Atkin says. “We expect ISO to be receptive, but if they are unwilling to modify their 19312 approach we can and will look at other options.”

On the metrics front, the council has taken another “baby step” with the completion of its first data quality metrics pilot – designed to help firms create a fact-based data quality baseline to support the business case for EDM. The research – facilitated by council sponsor IBM Global Business Services – focused on 42 core security description and derived data elements for trade matching and confirmations from 13 firms.

According to Atkin, the level of data variance established was surprisingly high. Data discrepancies ranged between four and 30 per cent and included missing issues, missing data elements, inconsistent coding and mismatching data values.
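One way to picture that kind of pilot – purely as an illustrative sketch, not the methodology IBM Global Business Services actually used, with all firm data invented – is to compare each firm’s value for a given attribute against the consensus value across firms, counting missing elements and mismatching values as discrepancies:

```python
# Illustrative sketch: a per-attribute discrepancy rate across firms'
# records for the same security. A value counts as discrepant if it is
# missing or differs from the most common value reported. The records
# below are invented, not from the pilot.
from collections import Counter

def discrepancy_rates(records, attributes):
    """records: one dict per firm; returns attribute -> discrepancy rate."""
    rates = {}
    for attr in attributes:
        values = [record.get(attr) for record in records]
        present = [v for v in values if v is not None]
        consensus = Counter(present).most_common(1)[0][0] if present else None
        mismatches = sum(1 for v in values if v != consensus)
        rates[attr] = mismatches / len(records)
    return rates

firms = [
    {"coupon_rate": 5.0, "maturity_date": "2030-06-01"},
    {"coupon_rate": 5.0, "maturity_date": "2030-06-15"},  # mismatching value
    {"coupon_rate": 5.0},                                  # missing element
]
print(discrepancy_rates(firms, ["coupon_rate", "maturity_date"]))
```

Here the coupon rate agrees across all three firms (0 per cent discrepancy) while the maturity date shows one mismatching value and one missing element (a 67 per cent discrepancy rate) – a miniature of the 4-to-30 per cent variance the pilot reported across its 42 elements.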

While he admits that what the council has achieved with the pilot “is not a panacea”, he says he and council members “are pretty pleased”.

“We have proved that we have a legitimate methodology, we have gained the confidence of all the different parties, we have created a process that looks extensible, in an area that everybody wants to pursue.” Metrics cannot be solved “in one fell swoop”, he continues – “it’s a lifetime activity” – but the council is now in a position to expand the activity.

“We have a number of options on where the members take this research next including expansion of the number of instruments, extension to other security types and the addition of more data attributes. We are also talking about root cause analysis and translating the findings into their performance implications to determine exactly how bad data affects downstream data processing.”

Atkin is also keen to move the discussion beyond “negative metrics”. “I also want us to talk about the positive metrics – what I call the golden ring of EDM – metrics to support better cross-selling, better product engineering, better revenue generation. Ultimately firms want to take advantage of clean data to grow their businesses,” he says.
