

Data Management Summit: Techniques and Technologies to Improve Data Quality


Data quality has become an imperative of data management. While it is often difficult to achieve, improvements that meet both business and regulatory requirements can be made by implementing data governance policies and by applying big data and artificial intelligence technologies.

Andrew Delaney, chief content officer at A-Team Group, led discussion on data quality at A-Team’s recent Data Management Summit in London. He was joined by Colin Gibson, group architecture director at Willis Group; Neville Homer, head of RWA reference data and data services at RBS; Ewan McAllan, EMEA head of data quality at HSBC; and Bob Mudhar, associate partner at Citihub Consulting.

Delaney started the discussion with questions about why data quality is important and what it is. Gibson answered: “Data quality is a meaningful outcome of data management and it is a top priority for many people. It is whatever you need it to be. If data is a problem, there is a data quality issue.” On why data quality is important, Mudhar added: “Poor data has to be fixed manually, which is expensive. Getting data right frees up money and people.” On the question of whether quality is being driven by regulators or industry participants, Homer commented: “The focus on data quality is getting broader, but the regulatory stick of BCBS 239 is driving change.”

Large volumes of data make quality improvements complex, but there are solutions. Gibson explained: “Be specific, a broad statement won’t support progress. Understand the pain point, focus on data that you care about and then tackle the issue of whether it is complete, correct and consistent.” McAllan added: “Data quality improvement initiatives must be applied across an organisation and must drive business benefits.”
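To make the panel's 'complete, correct and consistent' framing more concrete, the sketch below shows one way such checks might be expressed as simple rules over a security reference dataset. The field names, ISIN rule and pandas-based approach are illustrative assumptions, not anything prescribed by the panellists.

```python
# Minimal, illustrative data quality checks (assumed field names and rules).
import pandas as pd

def run_quality_checks(securities: pd.DataFrame) -> dict:
    """Return simple completeness, correctness and consistency metrics."""
    results = {}

    # Completeness: share of records with a populated ISIN and issuer name.
    required = ["isin", "issuer_name"]
    results["completeness"] = securities[required].notna().all(axis=1).mean()

    # Correctness: ISINs must match the 12-character ISIN pattern.
    valid_isin = securities["isin"].str.fullmatch(
        r"[A-Z]{2}[A-Z0-9]{9}[0-9]", na=False
    )
    results["correctness"] = valid_isin.mean()

    # Consistency: each ISIN should map to exactly one issuer name across sources.
    issuers_per_isin = securities.groupby("isin")["issuer_name"].nunique()
    results["consistency"] = (issuers_per_isin == 1).mean()

    return results

if __name__ == "__main__":
    df = pd.DataFrame({
        "isin": ["GB0002634946", "US0378331005", None, "US0378331005"],
        "issuer_name": ["BAE Systems", "Apple Inc", "Unknown", "Apple Inc."],
        "source": ["vendor_a", "vendor_a", "vendor_b", "vendor_b"],
    })
    print(run_quality_checks(df))
```

In practice, rules like these would be scoped to the specific pain points the panel describes, rather than applied as blanket checks across every dataset.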

To achieve improvement, Mudhar first identified three obstacles to data quality: securing investment, eliminating dirty data and working with others to resolve differences in data. Then, he advised: “Work with operational teams to figure out what you are spending the most time on and use tools and techniques that may not fix all the data, but will allow it to be used for something valuable. Also, look at data sources and how many golden copies of data you have across the business.”
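Counting how many 'golden copies' exist can start with a simple profiling exercise: for each entity, count how many divergent versions appear across contributing systems. The sketch below is a hypothetical illustration with made-up system and column names, not a description of any firm's actual tooling.

```python
# Illustrative profiling of how many divergent "copies" of each entity exist
# across source systems (hypothetical column names).
import pandas as pd

def copies_per_entity(records: pd.DataFrame, key: str = "lei") -> pd.Series:
    """For each entity key, count the distinct attribute combinations
    contributed by the different source systems."""
    attribute_cols = [c for c in records.columns if c not in (key, "source_system")]
    return (
        records.drop_duplicates(subset=[key] + attribute_cols)
               .groupby(key)
               .size()
               .sort_values(ascending=False)
    )

if __name__ == "__main__":
    df = pd.DataFrame({
        "lei": ["ABC123", "ABC123", "ABC123", "XYZ789"],
        "legal_name": ["Acme Ltd", "Acme Limited", "Acme Ltd", "Globex Corp"],
        "country": ["GB", "GB", "GB", "US"],
        "source_system": ["crm", "risk", "settlements", "crm"],
    })
    # Entities with a count above 1 have multiple competing "golden copies".
    print(copies_per_entity(df))
```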

If data quality is an imperative of reference data management, Delaney asked how data managers can know when they have hit targets. McAllan said: “If the business can fulfil what it wants to achieve, data quality has reached its target.” On the role of data governance, Homer said: “Governance helps to manage data effectively. The focus of governance should be on priorities, outcomes and raising awareness of data quality. The danger is death by governance, so the right balance is needed.” Gibson added: “The upside of governance is clarity. Knowing who sets targets and who to shout at when things go wrong.”

Considering whether shared data management services can improve data quality, panel members suggested shared services could reduce data costs, but would not necessarily improve data quality as improvement depends on data sources and controls. Big data and artificial intelligence technologies found more favour, with panel members noting their capacity and capability to manage huge amounts of data and deliver high quality data quickly.

