Given the regulatory community’s crackdown on data quality across the financial services industry, the International Monetary Fund’s (IMF) recently published paper on improving its data quality assessment framework indicators is well timed. In the paper, the IMF’s statistics department suggests improvements to its current set of metrics for measuring the quality, accuracy and reliability of data gathered during a supervisory exercise.
Although the IMF’s data quality measurement focus is largely on macroeconomic data for a specific purpose, the lessons in data quality are applicable to much of the other work going on across the regulatory spectrum. Its data quality assessment framework has been developed to provide a uniform, standardised assessment of data quality and to improve data compilation and dissemination practices; something that many regulators are focusing on in the search for a better way to evaluate systemic risk.
For example, the European Systemic Risk Board (ESRB) and the US Office of Financial Research will need to regularly evaluate their own data quality checking practices, as well as measure those of the firms they are monitoring. After all, both are charged with collecting the data on which important judgements must be made with regard to systemic risk.
The IMF’s framework currently examines the prerequisites of quality along with five dimensions of data quality: assurance of integrity; methodological soundness; accuracy and reliability; serviceability; and accessibility. The paper, penned by Mico Mrkaic from the IMF’s statistics department, examines whether these are appropriate metrics to use, and suggests other possible variables to consider, along with various practical examples.