The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

Data Management Summit: Techniques and Technologies to Improve Data Quality

Data quality has become an imperative of data management, and while it is often difficult to achieve, improvements can be made to meet both business and regulatory requirements by implementing data governance policies as well as big data and artificial intelligence technologies.

Andrew Delaney, chief content officer at A-Team Group, led discussion on data quality at A-Team’s recent Data Management Summit in London. He was joined by Colin Gibson, group architecture director at Willis Group; Neville Homer, head of RWA reference data and data services at RBS; Ewan McAllan, EMEA head of data quality at HSBC; and Bob Mudhar, associate partner at Citihub Consulting.

Delaney started the discussion with questions about why data quality is important and what it is. Gibson answered: “Data quality is a meaningful outcome of data management and it is a top priority for many people. It is whatever you need it to be. If data is a problem, there is a data quality issue.” On why data quality is important, Mudhar added: “Poor data has to be fixed manually, which is expensive. Getting data right frees up money and people.” On the question of whether quality is being driven by regulators or industry participants, Homer commented: “The focus on data quality is getting broader, but the regulatory stick of BCBS 239 is driving change.”

Large volumes of data make quality improvements complex, but there are solutions. Gibson explained: “Be specific, a broad statement won’t support progress. Understand the pain point, focus on data that you care about and then tackle the issue of whether it is complete, correct and consistent.” McAllan added: “Data quality improvement initiatives must be applied across an organisation and must drive business benefits.”
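Gibson's three tests can be made concrete. The sketch below, a minimal illustration rather than anything described by the panel, applies completeness, correctness and consistency checks to a small set of hypothetical reference-data records (the field names, currency list and rules are invented for the example):

```python
# Illustrative data quality checks: complete, correct, consistent.
# Field names and validation rules below are hypothetical examples.

VALID_CURRENCIES = {"GBP", "USD", "EUR"}  # hypothetical domain rule

def check_record(record):
    """Return the completeness and correctness issues found in one record."""
    issues = []
    # Completeness: every required field must be present and non-empty.
    for field in ("isin", "currency", "price"):
        if not record.get(field):
            issues.append(f"incomplete: missing {field}")
    # Correctness: values must satisfy domain rules.
    if record.get("currency") and record["currency"] not in VALID_CURRENCIES:
        issues.append(f"incorrect: unknown currency {record['currency']}")
    if record.get("price") is not None and record["price"] < 0:
        issues.append("incorrect: negative price")
    return issues

def check_consistency(records):
    """Consistency: the same ISIN should not map to different currencies."""
    seen, issues = {}, []
    for r in records:
        isin, ccy = r.get("isin"), r.get("currency")
        if isin and ccy:
            if isin in seen and seen[isin] != ccy:
                issues.append(f"inconsistent: {isin} seen as {seen[isin]} and {ccy}")
            seen.setdefault(isin, ccy)
    return issues

records = [
    {"isin": "GB0000000001", "currency": "GBP", "price": 101.5},
    {"isin": "GB0000000001", "currency": "USD", "price": 101.5},  # inconsistent
    {"isin": "GB0000000002", "currency": "XXX", "price": -1.0},   # incorrect
    {"isin": "", "currency": "EUR", "price": 50.0},               # incomplete
]

per_record = {i: check_record(r) for i, r in enumerate(records)}
cross_record = check_consistency(records)
```

In line with Gibson's advice, the checks are scoped to specific fields the business cares about rather than attempting a blanket rule over all data.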

To achieve improvement, Mudhar first identified three obstacles to data quality: securing investment, eliminating dirty data, and working with others to reconcile differences in data. Then, he advised: “Work with operational teams to figure out what you are spending the most time on and use tools and techniques that may not fix all the data, but will allow it to be used for something valuable. Also, look at data sources and how many golden copies of data you have across the business.”
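Mudhar's suggestion to count golden copies across the business can be sketched as a simple census: group records by business key across source systems and flag keys mastered in more than one place. This is an assumed illustration, not a method described by the panel; the system and field names are invented:

```python
# Hypothetical "golden copy census": find business keys (here ISINs)
# that are mastered in more than one source system.

from collections import defaultdict

def golden_copy_census(records):
    """Map each business key to the set of systems that hold it."""
    holdings = defaultdict(set)
    for r in records:
        holdings[r["isin"]].add(r["system"])
    return holdings

def duplicated_masters(holdings):
    """Keys held in more than one system -- candidates for consolidation."""
    return {k: sorted(v) for k, v in holdings.items() if len(v) > 1}

records = [
    {"system": "risk_db", "isin": "GB0000000001"},
    {"system": "trading_db", "isin": "GB0000000001"},
    {"system": "risk_db", "isin": "GB0000000002"},
]

dupes = duplicated_masters(golden_copy_census(records))
# GB0000000001 is mastered in both risk_db and trading_db
```

The output gives a starting point for the cost argument Mudhar makes: each duplicated master is data that has to be maintained, and manually fixed, more than once.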

If data quality is an imperative of reference data management, Delaney asked how data managers can know when they have hit targets. McAllan said: “If the business can fulfil what it wants to achieve, data quality has reached its target.” On the role of data governance, Homer said: “Governance helps to manage data effectively. The focus of governance should be on priorities, outcomes and raising awareness of data quality. The danger is death by governance, so the right balance is needed.” Gibson added: “The upside of governance is clarity. Knowing who sets targets and who to shout at when things go wrong.”

Considering whether shared data management services can improve data quality, panel members suggested shared services could reduce data costs, but would not necessarily improve data quality as improvement depends on data sources and controls. Big data and artificial intelligence technologies found more favour, with panel members noting their capacity and capability to manage huge amounts of data and deliver high quality data quickly.
