About A-Team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

How to Implement Data Quality Metrics that Deliver for the Business


Data quality metrics are essential to financial firms and can deliver significant operational benefits that are increasingly driven by emerging technologies such as machine learning. The business case for metrics, best approaches to implementation, and the potential of technology to improve both data quality and metrics were discussed during a recent A-Team Group webinar, Using Metrics to Measure Data Quality.

The webinar was moderated by A-Team editor Sarah Underwood and joined by Sue Geuens, president of DAMA and head of data standards and best practice adoption at Barclays; Mark Wilson, head of data quality UK at Handelsbanken; and John Randles, CEO at Bloomberg PolarLake. Setting the scene, an audience poll showed that a large majority of respondents consider data quality metrics very important, either across all data or across some data.

The speakers responded to the poll noting that metrics are key to the success of a data management programme and can also highlight poor data. Considering what is meant by data quality and the requirement for metrics, Randles commented: “Data quality is about whether a dataset is good enough for a particular application. Combining the definition of an appropriate dataset with metrics makes it possible to measure the tolerance of the data for the application.”

Looking at the business case for data quality metrics and a data quality programme, Wilson said: “There is the stick approach to keep regulators happy, but you are taking your eye off the ball if you follow that approach. You need to build a case around outcomes for the business, such as customer satisfaction and retention.” Geuens added: “If you align data quality and metrics with business strategy, and do it well, regulatory requirements will fall into place.”

On delivering data quality metrics that are relevant to the business, Randles explained: “It’s important when planning data metrics to make sure the business can understand the purpose of the data. Then you can add data quality scores. When you can expose that kind of metric, users can have confidence in the data before it is used or understand whether it needs to be improved. This keeps metrics fresh and alive.”

On the question of what firms should measure, Geuens said: “It’s easy to measure data accuracy across fields. But this is the wrong answer. You need to measure how quality has got better in terms of what the data is used for. For example, is the data more useful to an application when it is cleaned and validated?”
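Geuens’ point — measure fitness for use rather than raw field accuracy — combined with Randles’ idea of a tolerance per application, can be sketched as field-level checks rolled up into a score and compared against the minimum an application will accept. This is a minimal illustration only; the function names, checks and thresholds are assumptions, not any vendor’s API:

```python
# Sketch: per-field quality checks rolled up into a score that is
# compared against an application's declared tolerance. All names and
# thresholds here are illustrative assumptions.

def completeness(values):
    """Fraction of values in a field that are present (non-missing)."""
    if not values:
        return 0.0
    return sum(v is not None and v != "" for v in values) / len(values)

def validity(values, is_valid):
    """Fraction of present values that pass a business rule."""
    present = [v for v in values if v is not None and v != ""]
    if not present:
        return 0.0
    return sum(is_valid(v) for v in present) / len(present)

def quality_score(field_scores):
    """Simple average of per-field scores; a real programme would
    likely weight fields by their importance to the consuming app."""
    return sum(field_scores.values()) / len(field_scores)

def fit_for_purpose(score, tolerance):
    """Each application declares the minimum score it can tolerate."""
    return score >= tolerance

# Example: an ISIN field on a small securities dataset
isins = ["US0378331005", "GB0002634946", None, "BAD"]
scores = {
    "isin_completeness": completeness(isins),                    # 3 of 4 present
    "isin_validity": validity(isins, lambda v: len(v) == 12),    # 2 of 3 valid
}
overall = quality_score(scores)
# A regulatory report might demand a higher tolerance than an
# internal analytics application consuming the same dataset.
print(fit_for_purpose(overall, tolerance=0.9))
```

The same dataset can pass for one application and fail for another simply by varying the tolerance, which is what makes the metric meaningful to the business rather than an abstract accuracy figure.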

Turning to technology, an audience poll showed 33% of respondents expecting to implement machine learning to improve data quality metrics, 28% third-party services, 23% vendor monitoring solutions and 19% artificial intelligence. Some 25% said they would implement no additional technology.

Final advice from the speakers on implementing data quality metrics included: talk to the business rather than IT; pick pain points affecting the business and deliver a successful solution to gain more interest; and publish metrics, statistics and reports to attract management attention.

Listen to the webinar to find out about:

  • Requirements for data quality metrics
  • Best practice implementation
  • Quick wins for the business
  • Emerging technology solutions
  • Benefits of metrics

Related content

WEBINAR

Recorded Webinar: End-to-End Lineage for Financial Services: The Missing Link for Both Compliance and AI Readiness

The importance of complete, robust end-to-end data lineage in financial services and capital markets cannot be overstated. Without the ability to trace and verify data across its lifecycle, many critical workflows – from trade reconciliation to risk management – cannot be executed effectively. At the top of the list is regulatory compliance. Regulators demand a...

BLOG

Data Quality Still Troubling Private Market Investors: Webinar Review

Obtaining and managing data remains a sticking point for investors in private and alternative assets as financial institutions sink more of their capital into the markets. In a poll of viewers during a recent A-Team LIVE Data Management Insight webinar, respondents said the single-biggest challenge to managing private markets data was a lack of transparency...

EVENT

TradingTech Summit New York

Our TradingTech Briefing in New York is aimed at senior-level decision makers in trading technology, electronic execution and trading architecture, and offers a day packed with insight from practitioners and from innovative suppliers happy to share their experiences in dealing with the enterprise challenges facing our marketplace.

GUIDE

FRTB Special Report

FRTB is one of the most sweeping and transformative pieces of regulation to hit the financial markets in the last two decades. With the deadline confirmed as January 2022, this Special Report provides a detailed insight into exactly what the data requirements are for FRTB in its latest (and final) incarnation, and explores what needs to be done in order to meet these needs on a cost-effective and company-wide basis.