About a-team Marketing Services
The leading knowledge platform for the financial technology industry

A-Team Insight Blogs

A-Team Data Management Summit Outlines Elements of Optimal Data Management


Organising for optimal data management, transformation challenges, operational models, data quality, emerging technologies, data governance and, of course, regulation were top agenda items at A-Team Group’s recent Data Management Summit in New York City.

The first keynote speaker, David Luria, vice president, front office data intelligence programs at T. Rowe Price, set the scene for the day with a presentation entitled Un-Ready, Un-Willing and Un-Able: Why Most Data Projects Fail and How to Win Instead. Luria said most projects fail to deliver on their promise because they address the wrong problem, use the wrong resources or take the wrong approach. On the upside, he suggested successful data projects are built on checklists, people capabilities and communication, with checklists including enterprise objectives, programme strategy and scope.

With these issues in mind, the conference turned its attention to a panel session on organising for optimal data management. Chris Vickery, global head of enterprise data management at Nomura, noted the post-crisis focus on regulation and internal risk, and the subsequent push to centralise and govern data, a move that raises questions about where data governance fits into an organisation and how decisions are made.

Answering these questions, panel members said data governance should be a business function that drives value out of data assets, while big decisions on projects such as BCBS 239 compliance should be made at the top of the house. Contrary to some opinion, the panel agreed that regulation can help firms develop best practice and that BCBS 239 in particular is helping banks move towards optimal data management.

Reiterating the need for senior management engagement in big decisions and projects, a panel covering transformational challenges and new operational models discussed the trade-off between strategic and tactical projects, and noted the continual need to convey the value delivered by data management to the business.

Considering the strategic versus tactical approach, Tom Wise, head of enterprise data management at GMO, described a four-test strategic decision framework used by the firm to decide whether a project is strategic. The tests ask whether a proposed project is regulatory, will create a new application, will impact master data sources, or represents a convergence of numerous requests. If the project meets any one of these criteria, it is classified as strategic.
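The four-test framework described above amounts to a simple classification rule: a project is strategic if any test passes. A minimal sketch, assuming a project is represented as a set of boolean flags (the field names here are illustrative, not GMO's actual terminology):

```python
# Illustrative sketch of the four-test strategic decision framework.
# A project is "strategic" if it passes at least one of the four tests.

def is_strategic(project: dict) -> bool:
    """Return True if the project meets any of the four strategic tests."""
    tests = [
        project.get("regulatory", False),           # driven by regulation?
        project.get("creates_new_app", False),      # will create a new application?
        project.get("impacts_master_data", False),  # will impact master data sources?
        project.get("converges_requests", False),   # convergence of numerous requests?
    ]
    return any(tests)

# Example: a regulatory project is classified as strategic,
# while one passing no test remains tactical.
print(is_strategic({"regulatory": True}))  # True
print(is_strategic({"creates_new_app": False}))  # False
```

The point of encoding the tests explicitly is that the strategic/tactical decision becomes repeatable rather than a case-by-case judgment call.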

A panel session looking specifically at KYC and client onboarding noted that both are becoming more difficult as regulatory data requirements continue to escalate. With the industry past initial implementation, which poured resources into the problem, automation of KYC and onboarding processes has become essential, although as Bill Hauserman, senior director of compliance solutions at Bureau van Dijk, said, the intersection of technology and data can only be successful when data quality is high. Touching on the utility response to KYC, Matt Stauffer, CEO at DTCC Clarient, said adoption of the utility model has accelerated over recent months.

With change driven in great part by regulation, a panel of experts identified regulations that are proving particularly difficult to implement. Among them were Solvency II, Markets in Financial Instruments Directive II (MiFID II), Market Abuse Regulation (MAR) and the Fundamental Review of the Trading Book (FRTB), all of which are very demanding in terms of data and add complexity to data management.

BCBS 239 was also in the frame, with John Fleming, head of enterprise data governance at BNY Mellon, describing how the bank has implemented a data governance framework and data distribution hub to ensure compliance.

Spanning change, organisation and regulation, data quality and emerging technologies were subjects of robust discussion during panel sessions. The importance of data quality was emphasised time and time again, data quality initiatives were described, quality metrics considered and potential solutions such as data utilities, machine learning and big data management techniques examined.

A panel dedicated to emerging technologies and fintech focused on blockchain and found it to have significant potential, but only if data management is under control and security issues are resolved. Marc Alvarez, chief data officer at Mizuho, concluded: “Blockchain is just a better way to do what we do better. It may cause disruption, but it is not revolutionary, and a demo of a positive business model could cause a stampede.”


Related content

WEBINAR

Upcoming Webinar: Data standards and global identifiers update

Date: 21 June 2022
Time: 10:00am ET / 3:00pm London / 4:00pm CET
Duration: 50 minutes

Data standards and global identifiers are the international language of capital markets – but how widely have they been adopted, how useful are they in practice, and can they stand the test of sustaining stable markets? This webinar will...

BLOG

IPOhub Extends Provision of SME IPO Data Through Partnership with EOSE and TickSmith

Market data specialist EOSE has partnered with IPOhub and TickSmith to bring IPOhub data on growth company IPOs to market. EOSE will provide IPOhub with a team of data sales and business development professionals, while TickSmith will organise and distribute the data to end users. Suzanne Lock, EOSE founder and CEO,...

EVENT

Data Management Summit London

Now in its 12th year, the Data Management Summit (DMS) in London brings together the European capital markets enterprise data management community, to explore the evolution of data strategy and how to leverage data to drive compliance and business insight.

GUIDE

ESG Data Handbook 2022

The ESG landscape is changing faster than anyone could have imagined even five years ago. With tens of trillions of dollars expected to have been committed to sustainable assets by the end of the decade, it’s never been more important for financial institutions of all sizes to stay abreast of changes in the ESG data...