Improving and sustaining data quality has become essential to meeting business and regulatory compliance requirements across capital markets. Challenges remain, however, with firms facing data silos, disparate data sources, large data volumes, a lack of standardisation and poor understanding of data quality across the organisation.
Addressing these issues, a recent A-Team Group webinar looked at approaches to data quality, taking into account challenges, best practices, supporting technologies, metrics, and rules and standards. The webinar was hosted by A-Team editor Sarah Underwood and joined by Sue Geuens, responsible for data standards and best practice adoption at Barclays and president of the global data management community DAMA International; Matthew Rawlings, head of middle office and operations at Bloomberg; and Dominique Tanner, head of business development at SIX Financial Information.
An audience poll questioning firms’ progress on implementing data quality set the scene for discussion: 42% of respondents said they are implementing a data quality programme, 21% are maintaining data quality as part of business as usual, 16% have implemented a data quality programme, 12% plan to implement a programme, and just 9% have no data quality plan or programme.
Looking at what we mean by data quality, the panel noted that data should be timely, correct, complete and consistent, and that people with data ownership should continually ensure the data is fit for purpose, identify glitches and make fixes without recourse to IT.
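The four dimensions the panel named can be made concrete as simple validation rules. The sketch below is purely illustrative and not taken from the webinar: the field names, reference data and freshness threshold are all hypothetical assumptions.

```python
from datetime import datetime, timedelta

# Illustrative checks for the four data quality dimensions the panel named:
# timeliness, correctness, completeness and consistency. Field names and
# thresholds are hypothetical examples, not drawn from the webinar.

REQUIRED_FIELDS = {"isin", "price", "currency", "last_updated"}
VALID_CURRENCIES = {"USD", "EUR", "GBP", "CHF"}  # assumed reference data
MAX_AGE = timedelta(days=1)                      # assumed freshness threshold

def check_record(record, now=None):
    """Return a list of data quality issues found in a single record."""
    now = now or datetime.utcnow()
    issues = []

    # Completeness: every required field must be present and non-empty.
    present = {k for k, v in record.items() if v not in (None, "")}
    missing = REQUIRED_FIELDS - present
    if missing:
        issues.append(f"incomplete: missing {sorted(missing)}")

    # Correctness: simple domain checks on individual values.
    if record.get("price") is not None and record["price"] <= 0:
        issues.append("incorrect: price must be positive")

    # Consistency: values must agree with reference data.
    if record.get("currency") and record["currency"] not in VALID_CURRENCIES:
        issues.append(f"inconsistent: unknown currency {record['currency']}")

    # Timeliness: data must be fresher than the agreed threshold.
    last_updated = record.get("last_updated")
    if last_updated and now - last_updated > MAX_AGE:
        issues.append("stale: last update older than threshold")

    return issues
```

Rules like these could sit with data owners rather than IT, so that glitches are identified and fixed close to the business.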
The drivers behind improving data quality include, as in so many cases, regulatory compliance, but also the need for high quality data in the front office to support decision making and the ability to cut costs by identifying quality issues and remediating them.
The challenges of achieving a desired level of quality can be significant, but best practice approaches and emerging technologies can help. The panel pointed to a process covering data governance, quality and management as a means of moving towards success, and advised firms not to try to boil the ocean, but rather to choose and clean the data that is most important to the business. Technology tools touched on included automation and cognitive processing.
Data quality metrics were also a matter of some discussion, with panel members noting the need to embed metrics in the data quality process so that patterns can be seen and fixes prioritised. The outcome of data quality programmes? According to our audience, significant business and operational benefits.
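One way to embed metrics in the process, as the panel suggested, is to aggregate the failures raised by quality rules so that recurring patterns stand out and fixes can be prioritised. The sketch below is a minimal, hypothetical illustration; the issue-label format is an assumption, not something specified in the webinar.

```python
from collections import Counter

# Illustrative sketch: turn per-record lists of issue labels (assumed to be
# strings of the form "rule: detail") into a simple metric ranking rules by
# failure count, so the most frequent problems can be prioritised for fixing.

def issue_metrics(issue_lists):
    """Count failures per rule across a batch of checked records."""
    counts = Counter()
    for issues in issue_lists:
        for issue in issues:
            rule = issue.split(":", 1)[0]  # rule name before the colon
            counts[rule] += 1
    return counts.most_common()  # highest-frequency rules first
```

Tracking these counts over time would reveal whether fixes are taking hold or the same rules keep failing.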
To find out more about:
- Definitions of data quality
- Drivers of improvement
- Challenges of data quality
- Best practice implementation
- Metrics to manage quality
- Business and operational benefits
Listen to the webinar here.