

New York Data Management Summit: Data Governance for Improved Data Quality


Regulatory requirements and internal needs for improved data quality are pushing data governance up the agenda at many financial institutions. In response, data governance policies are being implemented and firms are considering tools and techniques that could support and sustain better data quality.

Approaches to data governance were discussed at A-Team Group’s recent New York Data Management Summit during a session entitled ‘Data Governance for Improved Data Quality’. A-Team chief content officer Andrew Delaney led the discussion and was joined by Randall Gordon, head of data governance at Moody’s Corporation; Thomas Mavroudis, Americas regional data manager at HSBC Technology and Services; and John Yelle, vice president of data services at DTCC.

Starting with the question of what data governance is and why it is important, Delaney turned to Yelle, who responded: “There are many drivers behind data governance including regulation, which makes it important to achieve consistency across data. We are making progress on this, but data governance is foundational to delivery.” Gordon said forthcoming Basel regulations that carry data management and data quality requirements, as well as regulators’ rights to scrutinise the data firms disclose, are encouraging a focus on data governance.

If sound data governance is a must-have, finding the balance between too much and too little is a key task. Mavroudis said: “Without effective data governance, you cannot have a data management strategy. The need is to cover the right amount of data, typically data that represents attributes of critical business processes.” Yelle commented on the need for policies, standards and rules to support data governance, but also the requirement for cultural change to understand its importance.

Looking at the possibility of standardisation to improve data quality, Yelle described the Enterprise Data Management Council’s work on a Data Quality Index that provides a standardised approach to reporting on data quality across different data domains. He said: “The index provides snapshots of data quality that can support comparisons of data, perhaps at different times or across different datasets. It includes metrics that allow anyone responsible for the execution of processes around data to see the data quality and whether it is improving. The index can also be correlated against other things happening in a firm, such as a reduction in operational risk incidents.”
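
To make the idea of an index concrete, here is a minimal sketch, in Python, of how dimension scores per data domain might be rolled up into a composite figure and compared across snapshots. The dimensions, weights, domain names and figures are illustrative assumptions for the example only and do not represent the EDM Council’s actual Data Quality Index methodology.

# Hypothetical composite data quality index: weighted roll-up of dimension
# scores per data domain, compared across two points in time.
from dataclasses import dataclass

@dataclass
class DomainScores:
    """Dimension scores (0-100) for one data domain at one point in time."""
    domain: str
    completeness: float
    accuracy: float
    timeliness: float

# Illustrative weights per quality dimension (assumed, not prescribed).
WEIGHTS = {"completeness": 0.4, "accuracy": 0.4, "timeliness": 0.2}

def composite_index(s: DomainScores) -> float:
    """Weighted average of the dimension scores for a single domain."""
    return (WEIGHTS["completeness"] * s.completeness
            + WEIGHTS["accuracy"] * s.accuracy
            + WEIGHTS["timeliness"] * s.timeliness)

# Two snapshots of the same domains (e.g. last quarter vs this quarter),
# supporting the over-time comparisons described in the panel.
last_quarter = [DomainScores("counterparty", 92.0, 88.0, 95.0),
                DomainScores("instrument", 85.0, 90.0, 80.0)]
this_quarter = [DomainScores("counterparty", 95.0, 91.0, 96.0),
                DomainScores("instrument", 88.0, 92.0, 85.0)]

for before, after in zip(last_quarter, this_quarter):
    delta = composite_index(after) - composite_index(before)
    trend = "improving" if delta > 0 else "flat or declining"
    print(f"{before.domain}: {composite_index(before):.1f} -> "
          f"{composite_index(after):.1f} ({trend})")

A snapshot of this kind could also be lined up against other indicators, such as the count of operational risk incidents Yelle mentions, to test whether quality improvements correlate with fewer downstream problems.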

Beyond standardisation, Delaney questioned whether it is possible to operationalise data quality. Gordon replied: “Accountability for data and definitions of accountability are important here. Setting these up is a good first step, then it is helpful to work through the discipline of identifying critical data elements from a business perspective and improving their quality.” Yelle added: “Some would argue that data quality is already operationalised. The need is to formalise accountability and then drive change.”
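
As a purely illustrative sketch of what formalised accountability could look like in practice, the fragment below maps hypothetical critical data elements to an accountable owner and a simple quality rule. The element names, owners and rules are assumptions for the example, not the panellists’ or any firm’s actual approach.

# Hypothetical register of critical data elements with an accountable owner
# and a validation rule for each element.
from typing import Callable, Dict, Tuple

CRITICAL_DATA_ELEMENTS: Dict[str, Tuple[str, Callable[[dict], bool]]] = {
    "lei": ("Counterparty Data Office", lambda rec: len(rec.get("lei", "")) == 20),
    "settlement_date": ("Operations", lambda rec: bool(rec.get("settlement_date"))),
}

def check_record(record: dict) -> list:
    """Return failed elements together with the owner accountable for the fix."""
    failures = []
    for element, (owner, rule) in CRITICAL_DATA_ELEMENTS.items():
        if not rule(record):
            failures.append(f"{element}: failed quality rule, escalate to {owner}")
    return failures

print(check_record({"lei": "SHORT", "settlement_date": "2025-06-30"}))
# Prints: ['lei: failed quality rule, escalate to Counterparty Data Office']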

Turning to the practical issues of implementing data governance and getting buy-in, the panel members agreed that spreading the word and gaining credibility are key to success. In terms of practical plans, Mavroudis said: “Operational teams need to make a current state analysis and decide where they want to get to. Doing this, they will discover data issues and issues that have become themes. They can then prioritise requirements and build a roadmap. You never arrive at where you want to get to, so the operational cycle is continuous to make data fit for purpose.”

Acknowledging this never-ending operational cycle, Gordon concluded: “You know you have achieved quite a bit when you go into a meeting about processes or pain points and the first thing people bring up is no longer data.”

