
Data Management Summit: Techniques and Technologies to Improve Data Quality


Data quality has become an imperative of data management. While it is often difficult to achieve, improvements can be made to meet both business and regulatory requirements by implementing data governance policies and deploying big data and artificial intelligence technologies.

Andrew Delaney, chief content officer at A-Team Group, led discussion on data quality at A-Team’s recent Data Management Summit in London. He was joined by Colin Gibson, group architecture director at Willis Group; Neville Homer, head of RWA reference data and data services at RBS; Ewan McAllan, EMEA head of data quality at HSBC; and Bob Mudhar, associate partner at Citihub Consulting.

Delaney started the discussion with questions about why data quality is important and what it is. Gibson answered: “Data quality is a meaningful outcome of data management and it is a top priority for many people. It is whatever you need it to be. If data is a problem, there is a data quality issue.” On why data quality is important, Mudhar added: “Poor data has to be fixed manually, which is expensive. Getting data right frees up money and people.” On the question of whether quality is being driven by regulators or industry participants, Homer commented: “The focus on data quality is getting broader, but the regulatory stick of BCBS 239 is driving change.”

Large volumes of data make quality improvements complex, but there are solutions. Gibson explained: “Be specific, a broad statement won’t support progress. Understand the pain point, focus on data that you care about and then tackle the issue of whether it is complete, correct and consistent.” McAllan added: “Data quality improvement initiatives must be applied across an organisation and must drive business benefits.”
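Gibson’s three tests translate naturally into automated checks. The sketch below is illustrative only, not something presented at the summit: a minimal set of completeness, correctness and consistency checks in Python over hypothetical security reference records (the field names isin, currency and price, and the code lists, are assumptions).

    # Illustrative checks for the three qualities Gibson names:
    # completeness, correctness and consistency. All field names,
    # records and reference lists are hypothetical.
    REQUIRED_FIELDS = ("isin", "currency", "price")
    VALID_CURRENCIES = {"GBP", "USD", "EUR"}

    def check_record(record: dict) -> list[str]:
        """Completeness and correctness checks on a single record."""
        issues = []
        for field in REQUIRED_FIELDS:  # completeness: required fields populated
            if not record.get(field):
                issues.append(f"missing {field}")
        # correctness: values drawn from known-good code lists and ranges
        if record.get("currency") and record["currency"] not in VALID_CURRENCIES:
            issues.append(f"unknown currency {record['currency']}")
        if record.get("price") is not None and record["price"] <= 0:
            issues.append("non-positive price")
        return issues

    def check_consistency(records: list[dict]) -> list[str]:
        """Consistency: the same ISIN should not carry conflicting currencies."""
        seen, issues = {}, []
        for r in records:
            isin, ccy = r.get("isin"), r.get("currency")
            if isin and ccy:
                if isin in seen and seen[isin] != ccy:
                    issues.append(f"{isin}: currency conflict {seen[isin]} vs {ccy}")
                seen.setdefault(isin, ccy)
        return issues

Run against a focused slice of data, as Gibson suggests, such checks surface specific, fixable defects rather than a broad statement that quality is poor.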

To achieve improvement, Mudhar first identified three obstacles to data quality: securing investment, eliminating dirty data and working with others to agree on differences in data. He then advised: “Work with operational teams to figure out what you are spending the most time on and use tools and techniques that may not fix all the data, but will allow it to be used for something valuable. Also, look at data sources and how many golden copies of data you have across the business.”
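Mudhar’s closing point, counting golden copies across the business, can be made concrete with a simple audit: group master records for the same entity across source systems and flag entities mastered in more than one place. Again a hedged sketch; the system names and identifiers are invented for illustration.

    from collections import defaultdict

    # Hypothetical extract: (source_system, entity_id) pairs for records
    # each system regards as its master or golden copy.
    golden_records = [
        ("crm", "LEI123"), ("risk", "LEI123"), ("crm", "LEI456"),
        ("finance", "LEI123"), ("risk", "LEI789"),
    ]

    copies = defaultdict(set)
    for system, entity in golden_records:
        copies[entity].add(system)

    # Entities mastered in more than one system are candidates for
    # consolidation into a single golden copy.
    for entity, systems in sorted(copies.items()):
        if len(systems) > 1:
            print(f"{entity}: {len(systems)} golden copies ({', '.join(sorted(systems))})")

For LEI123 above, the audit reports three golden copies, exactly the kind of duplication Mudhar suggests quantifying before deciding where to invest.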

If data quality is an imperative of reference data management, Delaney asked how data managers can know when they have hit targets. McAllan said: “If the business can fulfil what it wants to achieve, data quality has reached its target.” On the role of data governance, Homer said: “Governance helps to manage data effectively. The focus of governance should be on priorities, outcomes and raising awareness of data quality. The danger is death by governance, so the right balance is needed.” Gibson added: “The upside of governance is clarity. Knowing who sets targets and who to shout at when things go wrong.”

Considering whether shared data management services can improve data quality, panel members suggested shared services could reduce data costs but would not necessarily raise quality, as improvement depends on data sources and controls. Big data and artificial intelligence technologies found more favour, with panel members noting their capacity to manage huge amounts of data and deliver high-quality data quickly.
