
How to Implement Data Quality Metrics that Deliver for the Business

Data quality metrics are essential to financial firms and can deliver significant operational benefits, increasingly with the help of emerging technologies such as machine learning. The business case for metrics, best approaches to implementation, and the potential of technology to improve both data quality and metrics were discussed during a recent A-Team Group webinar, Using Metrics to Measure Data Quality.

The webinar was moderated by A-Team editor Sarah Underwood and joined by Sue Geuens, president of DAMA and head of data standards and best practice adoption at Barclays; Mark Wilson, head of data quality UK at Handelsbanken; and John Randles, CEO at Bloomberg PolarLake. Setting the scene, an audience poll showed a large majority of respondents saying data quality metrics are very important, whether across all their data or only some of it.

The speakers responded to the poll, noting that metrics are key to the success of a data management programme and can also highlight poor data. Considering what is meant by data quality and the requirement for metrics, Randles commented: “Data quality is about whether a dataset is good enough for a particular application. Combining the definition of an appropriate dataset with metrics makes it possible to measure the tolerance of the data for the application.”
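
Randles’ framing lends itself to a short illustration. The Python sketch below checks a dataset against per-application tolerances; the field names, quality dimensions and thresholds are illustrative assumptions rather than anything prescribed in the webinar.

```python
import re

# Hypothetical tolerance profiles: each consuming application states how
# much missing or invalid data it can absorb. Names and thresholds are
# illustrative assumptions, not figures from the webinar.
TOLERANCES = {
    "risk_reporting": {"completeness": 0.99, "validity": 0.995},
    "marketing_analytics": {"completeness": 0.90, "validity": 0.95},
}

ISIN_SHAPE = re.compile(r"^[A-Z]{2}[A-Z0-9]{9}[0-9]$")  # structural check only

def quality_metrics(records):
    """Completeness and validity rates for a list of dict records."""
    total = len(records)
    complete = sum(1 for r in records if r.get("isin") and r.get("price") is not None)
    valid = sum(1 for r in records if r.get("isin") and ISIN_SHAPE.match(r["isin"]))
    return {"completeness": complete / total, "validity": valid / total}

def within_tolerance(application, records):
    """True if the dataset is good enough for this particular application."""
    metrics = quality_metrics(records)
    limits = TOLERANCES[application]
    return all(metrics[dim] >= limits[dim] for dim in limits)
```

The same dataset can pass for one application and fail for another, which is the sense in which quality is fitness for purpose rather than an absolute property of the data.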

Looking at the business case for data quality metrics and a data quality programme, Wilson said: “There is the stick approach to keep regulators happy, but you are taking your eye off the ball if you follow that approach. You need to build a case around outcomes for the business, such as customer satisfaction and retention.” Geuens added: “If you align data quality and metrics with business strategy, and do it well, regulatory requirements will fall into place.”

On delivering data quality metrics that are relevant to the business, Randles explained: “It’s important when planning data metrics to make sure the business can understand the purpose of the data. Then you can add data quality scores. When you can expose that kind of metric, users can have confidence in the data before it is used or understand whether it needs to be improved. This keeps metrics fresh and alive.”
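
One way to make such scores visible, sketched below under assumed dimension names and a plain-average headline score, is to publish a small quality report alongside the dataset it describes, so consumers can inspect it before use.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class DataQualityReport:
    """Quality metadata published next to the data it describes."""
    dataset: str
    scores: dict  # e.g. {"completeness": 0.97, "validity": 0.93}
    computed_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

    @property
    def overall(self) -> float:
        """Headline score: here simply the average of the dimension scores."""
        return sum(self.scores.values()) / len(self.scores)

    def summary(self) -> str:
        dims = ", ".join(f"{k}={v:.1%}" for k, v in self.scores.items())
        return f"{self.dataset}: overall {self.overall:.1%} ({dims})"

# Example: expose the score so users can judge the data before relying on it.
report = DataQualityReport("counterparty_master", {"completeness": 0.97, "validity": 0.93})
print(report.summary())  # counterparty_master: overall 95.0% (completeness=97.0%, validity=93.0%)
```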

On the question of what firms should measure, Geuens said: “It’s easy to measure data accuracy across fields. But this is the wrong answer. You need to measure how quality has got better in terms of what the data is used for. For example, is the data more useful to an application when it is cleaned and validated?”
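
Geuens’ point can be made concrete with a rough before-and-after measurement: rather than counting accurate fields, count the records a consuming application can actually use. The downstream check and the cleansing step below are illustrative assumptions.

```python
import re

ISIN_SHAPE = re.compile(r"^[A-Z]{2}[A-Z0-9]{9}[0-9]$")

def usable_by_app(record):
    """Hypothetical downstream check: the app needs a well-formed ISIN and a positive price."""
    isin = record.get("isin")
    return isinstance(isin, str) and bool(ISIN_SHAPE.match(isin)) and record.get("price", 0) > 0

def clean(record):
    """Hypothetical cleansing step: trim whitespace and normalise case on the identifier."""
    fixed = dict(record)
    if isinstance(fixed.get("isin"), str):
        fixed["isin"] = fixed["isin"].strip().upper() or None
    return fixed

def usability_rate(records):
    """Share of records the application can consume: quality in use, not field accuracy."""
    return sum(usable_by_app(r) for r in records) / len(records)

raw = [{"isin": " us0378331005 ", "price": 189.5}, {"isin": "", "price": 12.0}]
cleaned = [clean(r) for r in raw]
print(f"usable before: {usability_rate(raw):.0%}, after: {usability_rate(cleaned):.0%}")
# usable before: 0%, after: 50%
```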

Turning to technology, an audience poll showed 33% of respondents expecting to implement machine learning to improve data quality metrics, 28% third-party services, 23% vendor monitoring solutions and 19% artificial intelligence. Some 25% said they would implement no additional technology.
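
For the machine learning option, here is a minimal sketch of one common approach, assuming scikit-learn is available: flag statistically anomalous records for review as a complement to hand-written rules. The features and contamination setting are illustrative; the webinar did not prescribe a specific technique.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Hypothetical numeric features per record, e.g. price, quantity, spread.
records = np.array([
    [100.1, 5_000, 0.02],
    [ 99.8, 4_800, 0.03],
    [101.0, 5_200, 0.02],
    [999.0,    10, 0.90],   # likely a fat-finger or mapping error
])

# Fit an anomaly detector and flag outliers (-1 = anomaly, 1 = normal).
model = IsolationForest(contamination=0.25, random_state=0).fit(records)
flags = model.predict(records)
anomaly_rate = float(np.mean(flags == -1))

# The anomaly rate itself can feed a data quality metric over time.
print(f"flagged {anomaly_rate:.0%} of records for data-quality review")
```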

Final advice from the speakers on implementing data quality metrics included: talk to the business rather than IT; pick pain points affecting the business and deliver a successful solution to gain more interest; and publish metrics, statistics and reports to attract management attention.

Listen to the webinar to find out about:

  • Requirements for data quality metrics
  • Best practice implementation
  • Quick wins for the business
  • Emerging technology solutions
  • Benefits of metrics
