The leading knowledge platform for the financial technology industry


IMF Publishes Possible Revisions to its Data Quality Assessment Framework

Given the regulatory community’s crackdown on data quality across the financial services industry, the International Monetary Fund’s (IMF) recently published paper on the improvement of its data quality assessment framework indicators is judiciously timed. In the paper, the IMF’s statistics department suggests improvements to its current set of metrics against which to measure the quality, accuracy and reliability of data gathered during a supervisory endeavour.

Although the IMF’s data quality measurement focus is largely on macroeconomic data for a specific purpose, the lessons in data quality are applicable to much of the other work going on across the regulatory spectrum. Its data quality assessment framework was developed to support a uniform and standardised assessment of data quality and to improve data compilation and dissemination practices; something that many regulators are focusing on in the search for a better way to evaluate systemic risk.

For example, the European Systemic Risk Board (ESRB) and the US Office of Financial Research will need to regularly evaluate their own data quality checking practices, as well as measure those of the firms they are monitoring. After all, both are charged with collecting the data on which important judgements about systemic risk must be made.

The IMF’s framework currently examines the prerequisites of quality alongside five dimensions of data quality: assurance of integrity, methodological soundness, accuracy and reliability, serviceability, and accessibility. The paper, penned by Mico Mrkaic of the IMF’s statistics department, examines whether these are appropriate metrics to use, suggests other possible variables to consider, and offers various practical examples.
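To make the idea of dimension-based quality assessment concrete, the sketch below scores a small dataset against checks grouped by three of the dimensions named above. It is a hypothetical illustration only: the dimension names follow the IMF's framework, but the specific checks, field names and scoring are invented for this example and are not part of the IMF's methodology.

```python
# Hypothetical illustration of dimension-based data quality scoring.
# The dimension names echo the IMF's framework; the checks, record
# fields and weights are invented for illustration only.

def score_dataset(records):
    """Score a list of record dicts against simple per-dimension checks,
    returning the share of records passing each check (0.0 to 1.0)."""
    total = len(records)
    if total == 0:
        return {}

    # "Accuracy and reliability": key fields are present and populated
    complete = sum(1 for r in records
                   if r.get("value") is not None and r.get("period"))

    # "Serviceability": records carry a revision timestamp
    revised = sum(1 for r in records if r.get("last_revised"))

    # "Accessibility": records expose a published source reference
    sourced = sum(1 for r in records if r.get("source"))

    return {
        "accuracy_and_reliability": complete / total,
        "serviceability": revised / total,
        "accessibility": sourced / total,
    }

# Example: one clean record, one with a missing value and no revision date
sample = [
    {"value": 1.2, "period": "2010Q1",
     "last_revised": "2010-05-01", "source": "statistics dept"},
    {"value": None, "period": "2010Q2",
     "last_revised": None, "source": "statistics dept"},
]
print(score_dataset(sample))
```

In practice, an assessment framework of this kind would weight and aggregate many such checks per dimension; the point here is simply how a standardised set of metrics allows different datasets to be compared on a common scale.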
