
Data Management Experts Discuss the Dilemmas of Data Quality

Data quality has become an imperative for financial institutions as they face increasing regulation and look to data for business benefits and opportunities – but it is not always easy to achieve and requires significant investment in time and resources.

For many institutions, a definition of data quality is based on some or all of the data characteristics set out in the BCBS 239 principles, including accuracy and integrity, completeness and timeliness. Defining data quality can be a good start to improvement projects, but how good does it need to be, how can it be measured and demonstrated, and how can it be geared to different business processes?
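One way to make the measurement question concrete is to score individual dimensions as simple ratios over a dataset. The sketch below does this for two of the BCBS 239 characteristics mentioned above, completeness and timeliness; the record layout, field names, dummy values and 24-hour reporting window are illustrative assumptions, not anything prescribed by BCBS 239 or described by the panellists.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone
from typing import List, Optional

# Illustrative sketch only: the record layout, field names and the
# 24-hour window are assumptions, not taken from the article.

@dataclass
class TradeRecord:
    trade_id: str
    counterparty_lei: Optional[str]    # completeness target
    notional: Optional[float]          # completeness target
    reported_at: Optional[datetime]    # timeliness target

def completeness(records: List[TradeRecord], field: str) -> float:
    """Share of records with a non-null value for `field`."""
    if not records:
        return 0.0
    return sum(1 for r in records if getattr(r, field) is not None) / len(records)

def timeliness(records: List[TradeRecord], now: datetime, max_age: timedelta) -> float:
    """Share of records reported within `max_age` of `now`."""
    if not records:
        return 0.0
    timely = sum(
        1 for r in records
        if r.reported_at is not None and now - r.reported_at <= max_age
    )
    return timely / len(records)

if __name__ == "__main__":
    now = datetime.now(timezone.utc)
    records = [  # dummy values for illustration
        TradeRecord("T1", "LEIEXAMPLE0000000001", 1_000_000.0, now - timedelta(hours=2)),
        TradeRecord("T2", None, 250_000.0, now - timedelta(hours=30)),
        TradeRecord("T3", "LEIEXAMPLE0000000002", None, None),
    ]
    print(f"LEI completeness:      {completeness(records, 'counterparty_lei'):.0%}")
    print(f"Notional completeness: {completeness(records, 'notional'):.0%}")
    print(f"Timeliness (24h):      {timeliness(records, now, timedelta(hours=24)):.0%}")
```

Accuracy and integrity are harder to reduce to a single ratio of this kind, as they typically require comparison against a trusted reference source.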

These are just some of the issues that will be discussed during a panel session on data quality at next week’s A-Team Group Data Management Summit in London.

Fiona Grierson, enterprise data strategy manager at Clydesdale Bank and a member of the panel, has been developing the bank’s approach to data quality for about three years. The bank defines quality data as data that is complete, appropriate and accurate, and uses the Enterprise Data Management Council’s Data Management Maturity Model to score data quality and drive improvement. It also applies a data management framework to projects to ensure they are implemented using best practice around data quality.

Grierson explains: “We look at the business case for particular strategies and consider the data quality requirement. For example, we look at regulations and the extent of their data quality requirements and at customer initiatives and their need for data quality to ensure seamless customer service.”

Grierson will be joined on the data quality panel by practitioners including Jon Deighton, head of global efficiency and strategy for UK data management at BNP Paribas Securities Services; James Longstaff, vice president, chief data office, at Deutsche Bank; and Neville Homer, head of RWA reference data, regulatory reporting, at RBS.

To find out more about:

  • Regulations driving data quality
  • Approaches to improvement
  • Data quality metrics
  • Technology solutions
  • Practitioner experience

Register for next week’s A-Team Group Data Management Summit in London.

Related content

WEBINAR

Upcoming Webinar: Improving data integrity to address regulatory requirements

Date: 6 May 2021
Time: 10:00am ET / 3:00pm London / 4:00pm CET
Duration: 50 minutes

Financial institutions today face a global regulatory landscape characterised by rigorous and varied reporting requirements across their businesses. Reporting challenges include completing more data fields across more lines of business with greater frequency, adding complexity and cost. At the...

BLOG

GLEIF Details Technologies Underlying Digital Version of LEI, the Verifiable LEI

The Global Legal Entity Identifier Foundation (GLEIF) has published issuance and technical infrastructure models for the verifiable LEI (vLEI) system it introduced back in December 2020. The vLEI is a secure digital attestation of a conventional LEI and is designed to extend the use of the identifier and, ultimately, enable instant and automated identity verification...

EVENT

TradingTech Summit Virtual

TradingTech Summit (TTS) Virtual will look at how trading technology operations can capitalise on recent disruption and leverage technology to find efficiencies in the new normal environment. The crisis has highlighted that the future is digital and cloud based, and the ability to innovate faster and at scale has become critical. As we move into recovery and ‘business as usual’, what changes and technology innovations should the industry adopt to simplify operations and to support speed, agility and flexibility in trading operations?

GUIDE

Entity Data Management Handbook – Seventh Edition

Sourcing entity data and ensuring efficient and effective entity data management is a challenge for many financial institutions as volumes of data rise, more regulations require entity data in reporting, and the fight against financial crime is escalated by bad actors using increasingly sophisticated techniques to attack processes and systems. That said, based on best...