How to Implement Data Quality Metrics that Deliver for the Business

Data quality metrics are essential to financial firms and can deliver significant operational benefits, increasingly with the help of emerging technologies such as machine learning. The business case for metrics, best approaches to implementation, and the potential of technology to improve both data quality and metrics were discussed during a recent A-Team Group webinar, Using Metrics to Measure Data Quality.

The webinar was moderated by A-Team editor Sarah Underwood and joined by Sue Geuens, president of DAMA and head of data standards and best practice adoption at Barclays; Mark Wilson, head of data quality UK at Handelsbanken; and John Randles, CEO at Bloomberg PolarLake. Setting the scene, an audience poll showed a large majority of respondents saying data quality metrics are very important, whether across all of their data or only some of it.

The speakers responded to the poll noting that metrics are key to the success of a data management programme and can also highlight poor data. Considering what is meant by data quality and the requirement for metrics, Randles commented: “Data quality is about whether a dataset is good enough for a particular application. Combining the definition of an appropriate dataset with metrics makes it possible to measure the tolerance of the data for the application.”
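
As a loose illustration of that fitness-for-purpose idea, the sketch below checks a dataset against the tolerances declared by each consuming application. The field names, applications and thresholds are invented for the example, not anything described in the webinar.

    # Minimal sketch: judge a dataset against the tolerances of the
    # application that will consume it, not against an absolute standard.
    # Field names, applications and thresholds are illustrative assumptions.

    def completeness(records, field):
        """Share of records where the field is populated."""
        filled = sum(1 for r in records if r.get(field) not in (None, ""))
        return filled / len(records) if records else 0.0

    # Each application declares what "good enough" means for its purpose.
    TOLERANCES = {
        "settlement": {"isin": 1.00, "settlement_date": 1.00},
        "analytics": {"isin": 0.50},
    }

    def fit_for_purpose(records, application):
        """Per-field scores plus whether each is within the app's tolerance."""
        report = {}
        for fld, required in TOLERANCES[application].items():
            score = completeness(records, fld)
            report[fld] = {"score": score, "required": required, "ok": score >= required}
        return report

    records = [
        {"isin": "US0378331005", "settlement_date": "2024-05-01"},
        {"isin": "", "settlement_date": "2024-05-02"},
    ]
    print(fit_for_purpose(records, "settlement"))

With the sample records, the same 50% ISIN completeness fails the settlement tolerance but would pass the looser analytics one, which captures the distinction between data being good in the abstract and good enough for a particular application.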

Looking at the business case for data quality metrics and a data quality programme, Wilson said: “There is the stick approach to keep regulators happy, but you are taking your eye off the ball if you follow that approach. You need to build a case around outcomes for the business, such as customer satisfaction and retention.” Geuens added: “If you align data quality and metrics with business strategy, and do it well, regulatory requirements will fall into place.”

On delivering data quality metrics that are relevant to the business, Randles explained: “It’s important when planning data metrics to make sure the business can understand the purpose of the data. Then you can add data quality scores. When you can expose that kind of metric, users can have confidence in the data before it is used or understand whether it needs to be improved. This keeps metrics fresh and alive.”
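
One way to picture exposing that kind of metric is to publish a composite score next to the dataset itself, as in the hypothetical sketch below; the quality dimensions and weights are assumptions for illustration, not a method attributed to the speakers.

    # Sketch: ship a composite quality score alongside the data so
    # consumers can judge its state before relying on it.
    # Dimensions and weights are illustrative assumptions.
    from dataclasses import dataclass, field

    @dataclass
    class ScoredDataset:
        records: list
        dimension_scores: dict  # e.g. {"completeness": 0.97, "timeliness": 0.88}
        weights: dict = field(default_factory=lambda: {"completeness": 0.6, "timeliness": 0.4})

        @property
        def quality_score(self):
            """Weighted composite of the individual dimension scores."""
            return sum(self.dimension_scores[d] * w for d, w in self.weights.items())

    ds = ScoredDataset(
        records=[{"isin": "US0378331005"}],
        dimension_scores={"completeness": 0.97, "timeliness": 0.88},
    )
    print(f"quality score: {ds.quality_score:.2f}")  # 0.93: use as-is, or flag for repair

A consumer seeing the score can decide to use the data as delivered or route it for remediation, which keeps the metric in daily view rather than buried in a report.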

On the question of what firms should measure, Geuens said: “It’s easy to measure data accuracy across fields. But this is the wrong answer. You need to measure how quality has got better in terms of what the data is used for. For example, is the data more useful to an application when it is cleaned and validated?”
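
A minimal sketch of that use-oriented measurement, under the assumption that “useful” can be proxied by the share of records the consuming application can process without manual repair (the validator rule and sample data below are hypothetical):

    # Sketch: measure improvement in fitness for use, not raw field accuracy.
    # The proxy metric is the share of records the consuming application
    # can process without manual repair; names and rules are illustrative.

    def usable_share(records, validator):
        """Fraction of records passing the consuming application's checks."""
        return sum(1 for r in records if validator(r)) / len(records)

    def settlement_ready(record):
        # Hypothetical rule: settlement needs both fields populated.
        return bool(record.get("isin")) and bool(record.get("settlement_date"))

    raw = [{"isin": "US0378331005", "settlement_date": None},
           {"isin": "", "settlement_date": "2024-05-02"}]
    cleaned = [{"isin": "US0378331005", "settlement_date": "2024-05-01"},
               {"isin": "US5949181045", "settlement_date": "2024-05-02"}]

    before = usable_share(raw, settlement_ready)
    after = usable_share(cleaned, settlement_ready)
    print(f"usable before: {before:.0%}, after: {after:.0%}")  # 0% -> 100%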

Turning to technology, an audience poll showed 33% of respondents expecting to implement machine learning to improve data quality metrics, 28% third-party services, 23% vendor monitoring solutions and 19% artificial intelligence. Some 25% said they would implement no additional technology.
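
The poll did not go into specifics, but a common pattern for applying machine learning to data quality is unsupervised anomaly detection over incoming values. The sketch below, which assumes scikit-learn is available and uses synthetic prices, flags outliers for human review; the flagged count can then feed a quality metric.

    # Sketch: flag suspect values with an unsupervised outlier detector.
    # Assumes scikit-learn; the price series is synthetic.
    import numpy as np
    from sklearn.ensemble import IsolationForest

    prices = np.array([[100.1], [100.3], [99.8], [100.2], [250.0], [100.0]])
    flags = IsolationForest(contamination=0.2, random_state=0).fit_predict(prices)

    # fit_predict returns -1 for records the model considers anomalous.
    suspect = prices[flags == -1].ravel()
    print(f"flagged for review: {suspect}")  # likely [250.]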

Final advice from the speakers on implementing data quality metrics included: talk to the business rather than IT; pick pain points affecting the business and deliver a successful solution to gain more interest; and publish metrics, statistics and reports to attract management attention.

Listen to the webinar to find out about:

  • Requirements for data quality metrics
  • Best practice implementation
  • Quick wins for the business
  • Emerging technology solutions
  • Benefits of metrics