About A-Team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

How to Implement Data Quality Metrics that Deliver for the Business


Data quality metrics are essential to financial firms and can deliver significant operational benefits that are increasingly driven by emerging technologies such as machine learning. The business case for metrics, best approaches to implementation, and the potential of technology to improve both data quality and metrics were discussed during a recent A-Team Group webinar, Using Metrics to Measure Data Quality.

The webinar was moderated by A-Team editor Sarah Underwood and joined by Sue Geuens, president of DAMA and head of data standards and best practice adoption at Barclays; Mark Wilson, head of data quality UK at Handelsbanken; and John Randles, CEO at Bloomberg PolarLake. Setting the scene, an audience poll showed a large majority of respondents saying data quality metrics are very important across all data or across some data.

The speakers responded to the poll noting that metrics are key to the success of a data management programme and can also highlight poor data. Considering what is meant by data quality and the requirement for metrics, Randles commented: “Data quality is about whether a dataset is good enough for a particular application. Combining the definition of an appropriate dataset with metrics makes it possible to measure the tolerance of the data for the application.”

Looking at the business case for data quality metrics and a data quality programme, Wilson said: “There is the stick approach to keep regulators happy, but you are taking your eye off the ball if you follow that approach. You need to build a case around outcomes for the business, such as customer satisfaction and retention.” Geuens added: “If you align data quality and metrics with business strategy, and do it well, regulatory requirements will fall into place.”

On delivering data quality metrics that are relevant to the business, Randles explained: “It’s important when planning data metrics to make sure the business can understand the purpose of the data. Then you can add data quality scores. When you can expose that kind of metric, users can have confidence in the data before it is used or understand whether it needs to be improved. This keeps metrics fresh and alive.”
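The idea of attaching a quality score to a dataset before it is consumed can be sketched in a few lines. This is a minimal illustration only: the quality dimensions (completeness, validity), their weights, and the sample records are assumptions for the example, not details from the webinar.

```python
# Minimal sketch of a weighted data quality score for a dataset.
# Dimensions, weights and sample records are illustrative assumptions.

def quality_score(records, checks, weights):
    """Return a 0-100 score: the weighted share of records passing each check."""
    total = 0.0
    for name, check in checks.items():
        passed = sum(1 for record in records if check(record))
        rate = passed / len(records) if records else 0.0
        total += weights[name] * rate
    return round(100 * total / sum(weights.values()), 1)

records = [
    {"isin": "US0378331005", "price": 150.0},
    {"isin": None, "price": 151.2},          # incomplete: identifier missing
    {"isin": "US0378331005", "price": -3.0},  # invalid: negative price
]

checks = {
    "completeness": lambda r: r["isin"] is not None,
    "validity": lambda r: r["price"] is not None and r["price"] > 0,
}
weights = {"completeness": 2.0, "validity": 1.0}

print(quality_score(records, checks, weights))  # 66.7
```

Exposing a score like this alongside the data lets users judge, as Randles suggests, whether the dataset is good enough for their application or needs improvement first.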

On the question of what firms should measure, Geuens said: “It’s easy to measure data accuracy across fields. But this is the wrong answer. You need to measure how quality has got better in terms of what the data is used for. For example, is the data more useful to an application when it is cleaned and validated?”
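Geuens’ point can be sketched as measuring the share of records a downstream application can actually use, before and after cleaning, rather than raw field accuracy. The “usable” rule (the application needs a non-empty identifier and a positive price) and the sample data are illustrative assumptions, not from the webinar.

```python
# Sketch: measure usefulness to an application, not raw field accuracy.
# The usability rule and sample records are illustrative assumptions.

def usable_share(records):
    """Fraction of records the downstream application can consume."""
    if not records:
        return 0.0
    ok = [r for r in records if r.get("isin") and r.get("price", 0) > 0]
    return len(ok) / len(records)

raw = [
    {"isin": "US0378331005", "price": 150.0},
    {"isin": "", "price": 151.2},           # missing identifier
    {"isin": "US5949181045", "price": 0},    # invalid price
]

# After cleaning and validation the gaps are repaired (illustrative values).
cleaned = [
    {"isin": "US0378331005", "price": 150.0},
    {"isin": "US0231351067", "price": 151.2},
    {"isin": "US5949181045", "price": 310.5},
]

print(f"usable before: {usable_share(raw):.0%}, after: {usable_share(cleaned):.0%}")
```

Reporting the before/after change in usable records ties the metric directly to what the data is used for, which is the outcome-oriented framing the speakers advocate.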

Turning to technology, an audience poll showed 33% of respondents expecting to implement machine learning to improve data quality metrics, 28% third-party services, 23% vendor monitoring solutions and 19% artificial intelligence. Some 25% said they would implement no additional technology.

Final advice from the speakers on implementing data quality metrics included: talk to the business rather than IT; pick pain points affecting the business and deliver a successful solution to gain more interest; and publish metrics, statistics and reports to attract management attention.

Listen to the webinar to find out about:

  • Requirements for data quality metrics
  • Best practice implementation
  • Quick wins for the business
  • Emerging technology solutions
  • Benefits of metrics

Related content

WEBINAR

Recorded Webinar: How to optimise SaaS data management solutions

Software-as-a-Service (SaaS) data management solutions go hand-in-hand with cloud technology, delivering not only SaaS benefits of agility, a reduced on-premise footprint and access to third-party expertise, but also the fast data delivery, productivity and efficiency gains provided by the cloud. This webinar will focus on the essentials of SaaS data management, including practical guidance on...

BLOG

Duco Acquires Unstructured Data Management Specialist Metamaze

Duco, a provider of SaaS AI-powered data automation, has acquired Metamaze, an Antwerp, Belgium-based company offering an AI-driven intelligent document processing SaaS platform that automatically processes, extracts and interprets information from any type of unstructured document. By combining the Metamaze and Duco platforms, customers can ingest any type of data from any type of document...

EVENT

TradingTech Summit London

Now in its 13th year, the TradingTech Summit London brings together Europe’s capital markets trading technology community, examining the latest changes and innovations in trading technology and exploring how technology is being deployed to create an edge at sell-side and buy-side financial institutions.

GUIDE

ESG Data Handbook 2022

The ESG landscape is changing faster than anyone could have imagined even five years ago. With tens of trillions of dollars expected to have been committed to sustainable assets by the end of the decade, it’s never been more important for financial institutions of all sizes to stay abreast of changes in the ESG data...