The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

Data Management Experts Discuss the Dilemmas of Data Quality

Data quality has become an imperative for financial institutions as they face increasing regulation and look to data for business benefits and opportunities – but it is not always easy to achieve and requires significant investment in time and resources.

For many institutions, a definition of data quality is based on some or all of the data characteristics set out in BCBS 239, including accuracy and integrity, completeness, and timeliness. Defining data quality can be a good start to improvement projects, but how good should data quality be, how can it be measured and demonstrated, and how can data quality be geared to different business processes?
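The article does not prescribe how these characteristics are measured in practice; a minimal illustrative sketch, assuming records are held as dictionaries with a timezone-aware `updated_at` timestamp, might compute completeness and timeliness as simple percentages:

```python
from datetime import datetime, timedelta, timezone

def completeness(records, required_fields):
    """Share of records in which every required field is populated."""
    if not records:
        return 0.0
    populated = sum(
        all(r.get(f) not in (None, "") for f in required_fields)
        for r in records
    )
    return populated / len(records)

def timeliness(records, max_age, now=None):
    """Share of records updated within the allowed age window."""
    if not records:
        return 0.0
    now = now or datetime.now(timezone.utc)
    fresh = sum(now - r["updated_at"] <= max_age for r in records)
    return fresh / len(records)
```

For example, a counterparty record missing its legal entity identifier would fail the completeness check, and one not refreshed within the agreed window would fail the timeliness check; the field names and thresholds here are hypothetical and would be set per business process.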

These are just some of the issues that will be discussed during a panel session on data quality at next week’s A-Team Group Data Management Summit in London.

Fiona Grierson, enterprise data strategy manager at Clydesdale Bank and a member of the panel, has been developing data quality at the bank for about three years. The bank defines data quality as data that is complete, appropriate and accurate, and uses the Enterprise Data Management Council’s Data Management Maturity Model to score data quality and drive improvement. It also has a data management framework for projects to ensure they are implemented using best practice around data quality.
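The bank's actual scoring method is not described; as a hedged illustration of the general idea of a maturity-model score, per-dimension capability ratings (say, 1 to 5) could be aggregated into a single figure with optional business weights, all names here being assumptions for the sketch:

```python
def maturity_score(dimension_scores, weights=None):
    """Weighted average of per-dimension capability ratings (e.g. 1-5)."""
    if weights is None:
        weights = {d: 1.0 for d in dimension_scores}  # equal weighting
    total_weight = sum(weights[d] for d in dimension_scores)
    weighted = sum(dimension_scores[d] * weights[d] for d in dimension_scores)
    return weighted / total_weight
```

Tracking such a score over successive assessments is one simple way to demonstrate the improvement that frameworks like this aim to drive.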

Grierson explains: “We look at the business case for particular strategies and consider the data quality requirement. For example, we look at regulations and the extent of their data quality requirements and at customer initiatives and their need for data quality to ensure seamless customer service.”

Grierson will be joined on the data quality panel by practitioners including Jon Deighton, head of global efficiency and strategy for UK data management at BNP Paribas Securities Services; James Longstaff, vice president, chief data office, at Deutsche Bank; and Neville Homer, head of RWA reference data, regulatory reporting, at RBS.

To find out more about:

  • Regulations driving data quality
  • Approaches to improvement
  • Data quality metrics
  • Technology solutions
  • Practitioner experience

Register for next week’s A-Team Group Data Management Summit in London.
