The knowledge platform for the financial technology industry


New York Data Management Summit: Data Governance for Improved Data Quality


Regulatory requirements and internal needs for improved data quality are pushing data governance up the agenda at many financial institutions. In response, data governance policies are being implemented and firms are considering tools and techniques that could support and sustain better data quality.

Approaches to data governance were discussed at A-Team Group’s recent New York Data Management Summit during a session entitled ‘Data Governance for Improved Data Quality’. A-Team chief content officer, Andrew Delaney, led the discussion and was joined by Randall Gordon, head of data governance at Moody’s Corporation; Thomas Mavroudis, Americas regional data manager at HSBC Technology and Services; and John Yelle, vice president of data services at DTCC.

Starting with the question of what data governance is and why it is important, Delaney turned to Yelle, who responded: “There are many drivers behind data governance including regulation, which makes it important to achieve consistency across data. We are making progress on this, but data governance is foundational to delivery.” Gordon said forthcoming Basel regulations, which include data management and data quality requirements, as well as regulators’ rights to scrutinise the data firms disclose, are encouraging a focus on data governance.

If sound data governance is a must-have, finding the balance between too much and too little is a key task. Mavroudis said: “Without effective data governance, you cannot have a data management strategy. The need is to cover the right amount of data, typically data that represents attributes of critical business processes.” Yelle commented on the need for policies, standards and rules to support data governance, but also the requirement for cultural change to understand its importance.

Looking at the possibility of standardisation to improve data quality, Yelle described the Enterprise Data Management Council’s work on a Data Quality Index that provides a standardised approach to reporting on data quality across different data domains. He said: “The index provides snapshots of data quality that can support comparisons of data, perhaps at different times or across different datasets. It includes metrics that allow anyone responsible for the execution of processes around data to see the data quality and whether it is improving. The index can also be correlated against other things happening in a firm, such as a reduction in operational risk incidents.”
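The session does not detail the index’s actual methodology, but the core idea of a standardised score that supports comparisons across domains and over time can be sketched as a weighted roll-up of per-domain quality metrics. The metric names, weights, and domains below are illustrative assumptions, not the EDM Council’s specification:

```python
from dataclasses import dataclass

@dataclass
class DomainScores:
    """Quality metric scores (0.0-1.0) for one data domain."""
    completeness: float  # share of required fields populated
    validity: float      # share of values passing format/range checks
    consistency: float   # share of records agreeing across systems

def quality_index(scores: DomainScores,
                  weights: tuple = (0.4, 0.3, 0.3)) -> float:
    """Weighted average of the metric scores, scaled to 0-100."""
    metrics = (scores.completeness, scores.validity, scores.consistency)
    return 100 * sum(w * m for w, m in zip(weights, metrics))

# A snapshot of two hypothetical domains at one point in time;
# re-running the same calculation later yields comparable snapshots
# that can show whether quality is improving.
snapshots = {
    "counterparty": DomainScores(0.98, 0.95, 0.90),
    "instrument":   DomainScores(0.92, 0.97, 0.88),
}
for domain, s in snapshots.items():
    print(f"{domain}: {quality_index(s):.1f}")
```

Because every domain is scored on the same 0-100 scale, the outputs can be compared side by side or tracked over time, which is what makes correlation against other indicators, such as operational risk incidents, possible.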

Beyond standardisation, Delaney questioned whether it is possible to operationalise data quality. Gordon replied: “Accountability for data and definitions of accountability are important here. Setting these up is a good first step, then it is helpful to work through the discipline of identifying critical data elements from a business perspective and improving their quality.” Yelle added: “Some would argue that data quality is already operationalised. The need is to formalise accountability and then drive change.”

Turning to the practical issues of implementing data governance and getting buy-in, the panel members agreed that spreading the word and gaining credibility are key to success. In terms of practical plans, Mavroudis said: “Operational teams need to make a current state analysis and decide where they want to get to. Doing this, they will discover data issues and issues that have become themes. They can then prioritise requirements and build a roadmap. You never arrive at where you want to get to, so the operational cycle is continuous to make data fit for purpose.”

Acknowledging this never-ending operational cycle, Gordon concluded: “You know you have achieved quite a bit when you go into a meeting about processes or pain points and the first thing people bring up is no longer data.”

