
New York Data Management Summit: Data Governance for Improved Data Quality

Regulatory requirements and internal needs for improved data quality are pushing data governance up the agenda at many financial institutions. In response, data governance policies are being implemented and firms are considering tools and techniques that could support and sustain better data quality.

Approaches to data governance were discussed at A-Team Group’s recent New York Data Management Summit during a session entitled ‘Data Governance for Improved Data Quality’. A-Team chief content officer, Andrew Delaney, led the discussion and was joined by Randall Gordon, head of data governance at Moody’s Corporation; Thomas Mavroudis, Americas regional data manager at HSBC Technology and Services; and John Yelle, vice president of data services at DTCC.

Starting with the question of what data governance is and why it is important, Delaney turned to Yelle, who responded: “There are many drivers behind data governance including regulation, which makes it important to achieve consistency across data. We are making progress on this, but data governance is foundational to delivery.” Gordon said forthcoming Basel regulations with data management and data quality requirements, as well as regulators’ rights to scrutinise the data that firms disclose, are encouraging a focus on data governance.

If sound data governance is a must-have, finding the balance between too much and too little is a key task. Mavroudis said: “Without effective data governance, you cannot have a data management strategy. The need is to cover the right amount of data, typically data that represents attributes of critical business processes.” Yelle commented on the need for policies, standards and rules to support data governance, but also the requirement for cultural change to understand its importance.

Looking at the possibility of standardisation to improve data quality, Yelle described the Enterprise Data Management Council’s work on a Data Quality Index that provides a standardised approach to reporting on data quality across different data domains. He said: “The index provides snapshots of data quality that can support comparisons of data, perhaps at different times or across different datasets. It includes metrics that allow anyone responsible for the execution of processes around data to see the data quality and whether it is improving. The index can also be correlated against other things happening in a firm, such as a reduction in operational risk incidents.”
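To make the idea of comparable data quality snapshots more concrete, the short Python sketch below aggregates a few per-domain metrics into a single index that can be tracked between reporting periods. The metric names, equal weighting and 0-100 scale are illustrative assumptions only; they are not the EDM Council’s actual Data Quality Index methodology.

from dataclasses import dataclass

@dataclass
class DomainScore:
    domain: str          # e.g. "counterparty", "instrument" (hypothetical domains)
    completeness: float  # share of mandatory fields populated (0-1)
    validity: float      # share of values passing format/range checks (0-1)
    timeliness: float    # share of records updated within SLA (0-1)

def index_score(scores: list[DomainScore]) -> float:
    """Aggregate per-domain metrics into a single 0-100 index for one snapshot."""
    if not scores:
        return 0.0
    per_domain = [(s.completeness + s.validity + s.timeliness) / 3 for s in scores]
    return 100 * sum(per_domain) / len(per_domain)

# Two snapshots support the kind of comparison described above:
# is data quality improving from one period to the next?
q1 = [DomainScore("counterparty", 0.92, 0.88, 0.80),
      DomainScore("instrument",   0.97, 0.95, 0.90)]
q2 = [DomainScore("counterparty", 0.95, 0.93, 0.86),
      DomainScore("instrument",   0.98, 0.96, 0.93)]

print(f"Q1 index: {index_score(q1):.1f}")  # 90.3
print(f"Q2 index: {index_score(q2):.1f}")  # 93.5

A rising index between snapshots could then be set against other indicators in the firm, such as the reduction in operational risk incidents mentioned by Yelle.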

Beyond standardisation, Delaney questioned whether it is possible to operationalise data quality. Gordon replied: “Accountability for data and definitions of accountability are important here. Setting these up is a good first step, then it is helpful to work through the discipline of identifying critical data elements from a business perspective and improving their quality.” Yelle added: “Some would argue that data quality is already operationalised. The need is to formalise accountability and then drive change.”

Turning to the practical issues of implementing data governance and getting buy-in, the panel members agreed that spreading the word and gaining credibility are key to success. In terms of practical plans, Mavroudis said: “Operational teams need to make a current state analysis and decide where they want to get to. Doing this, they will discover data issues and issues that have become themes. They can then prioritise requirements and build a roadmap. You never arrive at where you want to get to, so the operational cycle is continuous to make data fit for purpose.”

Acknowledging this never-ending operational cycle, Gordon concluded: “You know you have achieved quite a bit when you go into a meeting about processes or pain points and the first thing people bring up is no longer data.”
