About a-team Marketing Services
The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

New York Data Management Summit: Data Governance for Improved Data Quality

Regulatory requirements and internal needs for improved data quality are pushing data governance up the agenda at many financial institutions. In response, data governance policies are being implemented and firms are considering tools and techniques that could support and sustain better data quality.

Approaches to data governance were discussed at A-Team Group’s recent New York Data Management Summit during a session entitled ‘Data Governance for Improved Data Quality’. A-Team chief content officer, Andrew Delaney, led the discussion and was joined by Randall Gordon, head of data governance at Moody’s Corporation; Thomas Mavroudis, Americas regional data manager at HSBC Technology and Services; and John Yelle, vice president of data services at DTCC.

Starting with the question of what data governance is and why it is important, Delaney turned to Yelle, who responded: “There are many drivers behind data governance including regulation, which makes it important to achieve consistency across data. We are making progress on this, but data governance is foundational to delivery.” Gordon said forthcoming Basel regulations with data management and data quality requirements, as well as regulators’ rights to scrutinise data that firms disclose, are encouraging a focus on data governance.

If sound data governance is a must-have, finding the balance between too much and too little is a key task. Mavroudis said: “Without effective data governance, you cannot have a data management strategy. The need is to cover the right amount of data, typically data that represents attributes of critical business processes.” Yelle commented on the need for policies, standards and rules to support data governance, but also the requirement for cultural change to understand its importance.

Looking at the possibility of standardisation to improve data quality, Yelle described the Enterprise Data Management Council’s work on a Data Quality Index that provides a standardised approach to reporting on data quality across different data domains. He said: “The index provides snapshots of data quality that can support comparisons of data, perhaps at different times or across different datasets. It includes metrics that allow anyone responsible for the execution of processes around data to see the data quality and whether it is improving. The index can also be correlated against other things happening in a firm, such as a reduction in operational risk incidents.”

Beyond standardisation, Delaney questioned whether it is possible to operationalise data quality. Gordon replied: “Accountability for data and definitions of accountability are important here. Setting these up is a good first step, then it is helpful to work through the discipline of identifying critical data elements from a business perspective and improving their quality.” Yelle added: “Some would argue that data quality is already operationalised. The need is to formalise accountability and then drive change.”

Turning to the practical issues of implementing data governance and getting buy-in, the panel members agreed that spreading the word and gaining credibility are key to success. In terms of practical plans, Mavroudis said: “Operational teams need to make a current state analysis and decide where they want to get to. Doing this, they will discover data issues and issues that have become themes. They can then prioritise requirements and build a roadmap. You never arrive at where you want to get to, so the operational cycle is continuous to make data fit for purpose.”

Acknowledging this never-ending operational cycle, Gordon concluded: “You know you have achieved quite a bit when you go into a meeting about processes or pain points and the first thing people bring up is no longer data.”
