
How to Implement Data Quality Metrics that Deliver for the Business


Data quality metrics are essential to financial firms and can deliver significant operational benefits, increasingly with the help of emerging technologies such as machine learning. The business case for metrics, best approaches to implementation, and the potential of technology to improve both data quality and metrics were discussed during a recent A-Team Group webinar, Using Metrics to Measure Data Quality.

The webinar was moderated by A-Team editor Sarah Underwood and joined by Sue Geuens, president of DAMA and head of data standards and best practice adoption at Barclays; Mark Wilson, head of data quality UK at Handelsbanken; and John Randles, CEO at Bloomberg PolarLake. Setting the scene, an audience poll showed a large majority of respondents saying data quality metrics are very important, whether across all their data or only some of it.

The speakers responded to the poll noting that metrics are key to the success of a data management programme and can also highlight poor data. Considering what is meant by data quality and the requirement for metrics, Randles commented: “Data quality is about whether a dataset is good enough for a particular application. Combining the definition of an appropriate dataset with metrics makes it possible to measure the tolerance of the data for the application.”
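
To make that idea concrete, here is a minimal sketch, not from the webinar and with hypothetical field names and tolerances, of a fit-for-purpose quality metric: each application declares the rules its data must satisfy, and the metric is simply the share of records that pass, compared against the tolerance the application can accept.

```python
# Minimal sketch of a fit-for-purpose data quality metric.
# Field names, rules and the 2% tolerance are illustrative assumptions.
from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class QualityRule:
    field: str
    check: Callable[[object], bool]   # returns True when the value is acceptable


def quality_score(records: List[Dict], rules: List[QualityRule]) -> float:
    """Fraction of records passing every rule one application requires."""
    if not records:
        return 0.0
    passing = sum(
        all(rule.check(rec.get(rule.field)) for rule in rules) for rec in records
    )
    return passing / len(records)


# Hypothetical pricing application: it needs a 12-character ISIN and a
# positive price, and can tolerate up to 2% of records failing.
pricing_rules = [
    QualityRule("isin", lambda v: isinstance(v, str) and len(v) == 12),
    QualityRule("price", lambda v: isinstance(v, (int, float)) and v > 0),
]
records = [{"isin": "US0378331005", "price": 187.2}, {"isin": None, "price": 10.0}]
score = quality_score(records, pricing_rules)
print(f"score={score:.2%}, within tolerance: {score >= 0.98}")
```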

Looking at the business case for data quality metrics and a data quality programme, Wilson said: “There is the stick approach to keep regulators happy, but you are taking your eye off the ball if you follow that approach. You need to build a case around outcomes for the business, such as customer satisfaction and retention.” Geuens added: “If you align data quality and metrics with business strategy, and do it well, regulatory requirements will fall into place.”

On delivering data quality metrics that are relevant to the business, Randles explained: “It’s important when planning data metrics to make sure the business can understand the purpose of the data. Then you can add data quality scores. When you can expose that kind of metric, users can have confidence in the data before it is used or understand whether it needs to be improved. This keeps metrics fresh and alive.”
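
One simple way to expose such scores is sketched below; the report layout and the function name publish_quality_report are illustrative assumptions rather than anything described by the panel. The idea is to publish a small machine-readable quality report alongside each dataset so consumers can check its scores before deciding whether the data is fit for use.

```python
# Sketch of publishing per-dataset quality scores as a report consumers can
# inspect before using the data. Names and score categories are hypothetical.
import json
from datetime import datetime, timezone


def publish_quality_report(dataset_name: str, scores: dict, path: str) -> None:
    """Write a quality report for one dataset to a JSON file."""
    report = {
        "dataset": dataset_name,
        "generated_at": datetime.now(timezone.utc).isoformat(),
        "scores": scores,   # e.g. {"completeness": 0.99, "validity": 0.97}
    }
    with open(path, "w") as fh:
        json.dump(report, fh, indent=2)


publish_quality_report(
    "counterparty_master",
    {"completeness": 0.99, "validity": 0.97, "timeliness": 0.95},
    "counterparty_master_quality.json",
)
```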

On the question of what firms should measure, Geuens said: “It’s easy to measure data accuracy across fields. But this is the wrong answer. You need to measure how quality has got better in terms of what the data is used for. For example, is the data more useful to an application when it is cleaned and validated?”
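
As a rough illustration of that point, the sketch below measures usefulness rather than raw field accuracy: the same application-level checks are run before and after a hypothetical cleansing step, and the metric reported is the improvement in the share of records the application can actually use. The checks and sample records are assumptions for illustration only.

```python
# Sketch: measure fitness for use before and after cleansing, rather than
# raw field-level accuracy. Checks and sample data are hypothetical.
def fit_for_use(records, required_checks):
    """Share of records that satisfy every check the application requires."""
    ok = sum(all(chk(rec) for chk in required_checks) for rec in records)
    return ok / len(records) if records else 0.0


checks = [
    lambda r: isinstance(r.get("isin"), str) and len(r["isin"]) == 12,
    lambda r: isinstance(r.get("price"), (int, float)) and r["price"] > 0,
]

raw = [{"isin": " us0378331005 ", "price": 187.2}, {"isin": "GB0002634946", "price": -1.0}]
cleaned = [
    {"isin": r["isin"].strip().upper() if isinstance(r.get("isin"), str) else None,
     "price": r["price"] if isinstance(r.get("price"), (int, float)) and r["price"] > 0 else None}
    for r in raw
]

print(f"fit for use before cleansing: {fit_for_use(raw, checks):.0%}")
print(f"fit for use after cleansing:  {fit_for_use(cleaned, checks):.0%}")
```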

Turning to technology, an audience poll showed 33% of respondents expecting to implement machine learning to improve data quality metrics, 28% third-party services, 23% vendor monitoring solutions and 19% artificial intelligence. Some 25% said they would implement no additional technology.
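
For readers weighing the machine learning option, the sketch below shows one common pattern it can take in practice; this particular approach is an assumption, not something the panellists specified. An unsupervised anomaly detector, here scikit-learn’s IsolationForest, flags records whose values look out of line with the rest of the feed, and the flag rate becomes an additional, learned quality metric.

```python
# Sketch of a machine-learning-based quality check: flag anomalous records
# with an unsupervised detector. The data and contamination rate are illustrative.
import numpy as np
from sklearn.ensemble import IsolationForest

# Hypothetical end-of-day prices; the last value is an obvious fat-finger outlier.
prices = np.array([[101.2], [100.9], [101.5], [100.7], [1015.0]])

model = IsolationForest(contamination=0.2, random_state=0)
labels = model.fit_predict(prices)          # -1 marks records flagged as anomalous

anomaly_rate = float(np.mean(labels == -1))
print(f"flagged {anomaly_rate:.0%} of records for review")
```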

Final advice from the speakers on implementing data quality metrics included: talk to the business rather than IT; pick pain points affecting the business and deliver a successful solution to gain more interest; and publish metrics, statistics and reports to attract management attention.

Listen to the webinar to find out about:

  • Requirements for data quality metrics
  • Best practice implementation
  • Quick wins for the business
  • Emerging technology solutions
  • Benefits of metrics
