The Enterprise Data Management (EDM) Council’s 2017 Data Management Industry Benchmark Summary illustrates the highs and lows of data management across financial services firms. On the upside, firms have made progress in establishing data management programmes and implementing foundational governance. On the downside, data quality issues remain.
The Council’s 2017 benchmarking study was carried out in partnership with Sapient Consulting and Element-22/Pellustro. It consists of 22 questions derived from the Council’s data management capability model (DCAM) and makes comparisons to a previous study performed in 2015.
Headline results identify risk management and trust in data – or data quality – as key data management drivers across the industry. These drivers are reflected in data management priorities, which from a regulatory perspective include defining critical data elements (CDEs), improving data quality and implementing governance. From an operations perspective, top priorities are metrics and stakeholder commitment; from a sustainability perspective, they are ecosystem collaboration and technical integration.
The study shows some progress in data harmonisation across repositories, driven by BCBS 239, and similar progress on recognition of the importance of CDEs and the determination of CDE criteria.
While the study shows improvement across many aspects of data management, data quality remains a sticking point. Control mechanisms and checkpoints are being defined and implemented, but unevenly, with capability split between early and advanced levels. Adding to the data quality challenge, little progress has been made on identifying and addressing the root causes of data quality problems, although the industry is engaged in defining an approach to determining those root causes.
Michael Atkin, managing director of the EDM Council, sums up the study: “There are clearly some bright spots for the practice of data management. We have made progress in overcoming the inertia of organisational change management. But the underlying truth remains – we can’t respond to regulatory pressure, achieve automation or put data to work until we fix underlying data challenges.”