

EDM Council Takes Baby Steps Towards Ontology and Metrics

The EDM Council has recently taken steps towards achieving two key requirements for industry-wide implementation of enterprise data management – a standardised data ontology and data quality metrics.

Members of the EDM Council are participating in a working group formed by the DTCC to define and standardise the data terms and definitions included in its New Issue Information Dissemination Service (NIIDS).

According to the council, standardisation of the nomenclature and definitions of financial attributes, along with understanding their practical business relationships, is a “foundational” requirement for EDM. It says its role in this project is to ensure alignment between the DTCC initiative and ISO’s Standard Data Model activity – ISO 19312 – which has been under way for some time (Reference Data Review, October 2006).

Mike Atkin, managing director of the EDM Council, applauds the DTCC’s recognition of the value of standardising nomenclature, but says the council feels there is a “disconnection between the ISO proposal and the practical requirements of the financial institutions”.

“We don’t want to do any work that we don’t have to,” he says, “and the good news is that all the previous work has not been for nought: the work of ISO on the standard data model, and on MDDL, and the glossary initiative et cetera is all good raw material. But the industry is not looking for an XML schema or a UML data model. It is looking for a ‘data dictionary’ – of business terms, not tags, giving precision around data meaning – which can then be used as a common key internally, and from which XML schemas and data models and messaging can be generated.”

What the industry wants, Atkin says, is “a consistent data ontology – terms and definitions in the context of business use”. “All firms currently have, or are working to create, a consistent ‘data dictionary’ within their own organisations – common terminology and definitions. The goal is to ensure the definitions they are using internally are consistent with the terms used externally.”
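
To make the idea concrete, here is a minimal sketch, in Python, of a business-term data dictionary used as the common key from which a technical artefact such as an XML schema fragment is generated. The terms, definitions and generator are hypothetical and are not taken from the DTCC NIIDS or ISO 19312 vocabularies; the point is simply that the precise business meaning lives in the dictionary and the tags are derived from it.

```python
# Illustrative sketch only: hypothetical business terms and a toy generator,
# not the DTCC NIIDS or ISO 19312 vocabulary.
from dataclasses import dataclass

@dataclass
class DictionaryEntry:
    term: str        # business term agreed across firms
    definition: str  # precise business meaning
    datatype: str    # XSD type the generated schema should use

DATA_DICTIONARY = [
    DictionaryEntry("IssueDate", "Date on which the security is issued", "xs:date"),
    DictionaryEntry("CouponRate", "Annual interest rate paid on the security", "xs:decimal"),
    DictionaryEntry("MaturityDate", "Date on which the principal is repaid", "xs:date"),
]

def to_xsd(entries):
    """Generate a simple XML Schema fragment from the business-term dictionary."""
    lines = [
        '<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">',
        '  <xs:element name="Security">',
        '    <xs:complexType><xs:sequence>',
    ]
    for e in entries:
        lines.append(
            f'      <xs:element name="{e.term}" type="{e.datatype}"/>'
            f'  <!-- {e.definition} -->'
        )
    lines += ['    </xs:sequence></xs:complexType>', '  </xs:element>', '</xs:schema>']
    return "\n".join(lines)

if __name__ == "__main__":
    print(to_xsd(DATA_DICTIONARY))
```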

Atkin accepts that what has been completed “is a specific, discrete activity with DTCC for the DTCC NIIDS feed”. “While this effort has industry-wide application its scope remains limited to the DTCC feed. That’s why we are anxious to leverage it more broadly. We are trying to figure out how to get ISO engaged in the taxonomy objectives – but if we have trouble making the case, there are other options worth pursuing. XBRL, for example, has the capability to become the repository and is interested in the activity.”

The council now intends to confirm data ontology objectives with its members and then present its findings to ISO, Atkin says. “We expect ISO to be receptive, but if they are unwilling to modify their 19312 approach we can and will look at other options.”

On the metrics front, the council has taken another “baby step” with the completion of its first data quality metrics pilot – designed to help firms create a fact-based data quality baseline to support the business case for EDM. The research – facilitated by council sponsor IBM Global Business Services – focused on 42 core security description and derived data elements for trade matching and confirmations from 13 firms.

According to Atkin, the level of data variance established was surprisingly high. Data discrepancies ranged from four to 30 per cent and included missing issues, missing data elements, inconsistent coding and mismatching data values.
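
As a rough illustration of how a fact-based data quality baseline of this kind might be computed, the sketch below compares the value each firm reports for a handful of data elements against a consensus value and derives a discrepancy rate per element. The firms, elements and values are invented, and this is not the council's or IBM's actual methodology, which the article does not detail.

```python
# Illustrative sketch: hypothetical firm submissions for one security.
# None marks a missing data element.
from collections import Counter

submissions = {
    "FirmA": {"CouponRate": "5.25", "MaturityDate": "2027-06-30", "Currency": "USD"},
    "FirmB": {"CouponRate": "5.25", "MaturityDate": "2027-06-30", "Currency": None},
    "FirmC": {"CouponRate": "5.35", "MaturityDate": "2027-06-30", "Currency": "USD"},
}

def discrepancy_rates(submissions):
    """For each data element, return the share of firms whose value is
    missing or differs from the most common (consensus) value."""
    elements = {e for firm in submissions.values() for e in firm}
    rates = {}
    for element in elements:
        values = [firm.get(element) for firm in submissions.values()]
        present = [v for v in values if v is not None]
        # consensus is the most frequently reported value (arbitrary on ties)
        consensus = Counter(present).most_common(1)[0][0] if present else None
        bad = sum(1 for v in values if v is None or v != consensus)
        rates[element] = bad / len(values)
    return rates

for element, rate in sorted(discrepancy_rates(submissions).items()):
    print(f"{element}: {rate:.0%} of firms missing or mismatching")
```

Aggregated across instruments and a full set of data elements, per-element rates of this kind are the sort of figures behind the four to 30 per cent range the pilot reports.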

While he admits that what the council has achieved with the pilot “is not a panacea”, he says he and council members “are pretty pleased”.

“We have proved that we have a legitimate methodology, we have gained the confidence of all the different parties, we have created a process that looks extensible, in an area that everybody wants to pursue.” Metrics cannot be solved “in one fell swoop”, he continues – “it’s a lifetime activity” – but the council is now in a position to expand the activity.

“We have a number of options on where the members take this research next including expansion of the number of instruments, extension to other security types and the addition of more data attributes. We are also talking about root cause analysis and translating the findings into their performance implications to determine exactly how bad data affects downstream data processing.”

Atkin is also keen to move the discussion beyond “negative metrics”. “I also want us to talk about the positive metrics – what I call the golden ring of EDM – metrics to support better cross-selling, better product engineering, better revenue generation. Ultimately firms want to take advantage of clean data to grow their businesses,” he says.
