
Data Management Summit: Techniques and Technologies to Improve Data Quality


Data quality has become an imperative of data management. While it is often difficult to achieve, it can be improved to meet both business and regulatory requirements by implementing data governance policies and by applying big data and artificial intelligence technologies.

Andrew Delaney, chief content officer at A-Team Group, led a discussion on data quality at A-Team’s recent Data Management Summit in London. He was joined by Colin Gibson, group architecture director at Willis Group; Neville Homer, head of RWA reference data and data services at RBS; Ewan McAllan, EMEA head of data quality at HSBC; and Bob Mudhar, associate partner at Citihub Consulting.

Delaney opened the discussion by asking what data quality is and why it is important. Gibson answered: “Data quality is a meaningful outcome of data management and it is a top priority for many people. It is whatever you need it to be. If data is a problem, there is a data quality issue.” On why data quality is important, Mudhar added: “Poor data has to be fixed manually, which is expensive. Getting data right frees up money and people.” On the question of whether quality is being driven by regulators or industry participants, Homer commented: “The focus on data quality is getting broader, but the regulatory stick of BCBS 239 is driving change.”

Large volumes of data make quality improvements complex, but there are solutions. Gibson explained: “Be specific, a broad statement won’t support progress. Understand the pain point, focus on data that you care about and then tackle the issue of whether it is complete, correct and consistent.” McAllan added: “Data quality improvement initiatives must be applied across an organisation and must drive business benefits.”
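Gibson’s “complete, correct and consistent” framing can be made concrete with a few simple profiling checks. The sketch below is illustrative only: the trade dataset, its column names (isin, currency), the ISIN pattern and the currency code list are assumptions for the example, not anything specified by the panel.

# Minimal sketch of the "complete, correct, consistent" checks described above.
# Column names, the ISIN pattern and the currency list are assumed for illustration.
import pandas as pd

def profile_quality(trades: pd.DataFrame) -> dict:
    # Completeness: share of rows where the identifier is populated.
    completeness = 1 - trades["isin"].isna().mean()
    # Correctness: share of identifiers that match the ISIN format.
    correctness = (trades["isin"]
                   .str.fullmatch(r"[A-Z]{2}[A-Z0-9]{9}\d")
                   .fillna(False).astype(bool).mean())
    # Consistency: share of currency codes drawn from an agreed reference list.
    consistency = trades["currency"].isin({"GBP", "USD", "EUR"}).mean()
    return {"completeness": completeness,
            "correctness": correctness,
            "consistency": consistency}

trades = pd.DataFrame({
    "isin": ["GB0002634946", None, "XX123"],
    "currency": ["GBP", "usd", "EUR"],
})
print(profile_quality(trades))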

To achieve improvement, Mudhar first identified three obstacles to data quality: securing investment, eliminating dirty data, and working with others to agree differences in data. He then advised: “Work with operational teams to figure out what you are spending the most time on and use tools and techniques that may not fix all the data, but will allow it to be used for something valuable. Also, look at data sources and how many golden copies of data you have across the business.”
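Mudhar’s point about counting golden copies can be approximated with a simple profile of how many source systems master each record. The example below is a hedged sketch over a hypothetical instrument master extract; the source and field names are invented for illustration.

import pandas as pd

# Hypothetical extract listing which system masters each instrument.
masters = pd.DataFrame({
    "source": ["securities_db", "risk_system", "trading_system", "risk_system"],
    "isin":   ["GB0002634946", "GB0002634946", "GB0002634946", "US0378331005"],
})

# Count how many systems hold their own 'golden copy' of each instrument.
copies_per_instrument = masters.groupby("isin")["source"].nunique()
print(copies_per_instrument[copies_per_instrument > 1])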

If data quality is an imperative of reference data management, Delaney asked how data managers can know when they have hit targets. McAllan said: “If the business can fulfil what it wants to achieve, data quality has reached its target.” On the role of data governance, Homer said: “Governance helps to manage data effectively. The focus of governance should be on priorities, outcomes and raising awareness of data quality. The danger is death by governance, so the right balance is needed.” Gibson added: “The upside of governance is clarity. Knowing who sets targets and who to shout at when things go wrong.”
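McAllan’s test, that quality is sufficient when the business can fulfil what it wants to achieve, is often operationalised as agreed thresholds per quality dimension. The snippet below is a minimal sketch of that idea; the target and measured values are assumptions, not figures cited by the panel.

# Compare measured quality scores against business-agreed targets (values assumed).
targets = {"completeness": 0.99, "correctness": 0.98, "consistency": 0.95}
measured = {"completeness": 0.997, "correctness": 0.96, "consistency": 0.99}

# Report any dimension that falls short of its target.
breaches = {dim: (score, targets[dim])
            for dim, score in measured.items()
            if score < targets[dim]}
print(breaches if breaches else "all data quality targets met")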

Considering whether shared data management services can improve data quality, panel members suggested shared services could reduce data costs but would not necessarily improve quality, as improvement depends on data sources and controls. Big data and artificial intelligence technologies found more favour, with panellists noting their capacity to manage huge amounts of data and deliver high-quality data quickly.

