The leading knowledge platform for the financial technology industry


A-Team Data Management Summit Outlines Elements of Optimal Data Management

Organising for optimal data management, transformation challenges, operational models, data quality, emerging technologies, data governance and, of course, regulation were top agenda items at A-Team Group’s recent Data Management Summit in New York City.

The first keynote speaker, David Luria, vice president, front office data intelligence programs at T. Rowe Price, set the scene for the day with a presentation entitled Un-Ready, Un-Willing and Un-Able: Why Most Data Projects Fail and How to Win Instead. Luria said most projects fail to deliver on their promise because they address the wrong problem, use the wrong resources or take the wrong approach. On the upside, he suggested successful data projects are built on checklists, people capabilities and communication, with checklists including enterprise objectives, programme strategy and scope.

With these issues in mind, the conference turned its attention to a panel session on organising for optimal data management. Chris Vickery, global head of enterprise data management at Nomura, noted the post-crisis focus on regulation and internal risk, and the subsequent push to centralise and govern data, a move that raises questions about where data governance fits into an organisation and how decisions are made.

Answering these questions, panel members said data governance should be a business function that drives value out of data assets, while big decisions on projects such as BCBS 239 compliance should be made at the top of the house. Contrary to some opinion, the panel agreed that regulation can help firms develop best practice and that BCBS 239 in particular is helping banks move towards optimal data management.

Reiterating the need for senior management engagement in big decisions and projects, a panel covering transformational challenges and new operational models discussed the trade-off between strategic and tactical projects, and noted the continual need to convey the value delivered by data management to the business.

Considering the strategic versus tactical approach, Tom Wise, head of enterprise data management at GMO, described a four-test decision framework the firm uses to decide whether a project is strategic. The tests ask whether a proposed project is regulatory, will create a new application, will impact master data sources, or represents a convergence of numerous requests. A project that fits any one of these categories is treated as strategic.
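The four-test framework described above amounts to a simple any-of check. Purely as illustration, it could be sketched as follows; the field names, structure and example are assumptions for this sketch, not GMO's actual implementation.

```python
# Hypothetical sketch of the four-test strategic decision framework
# described by Tom Wise of GMO. Names are illustrative assumptions.
from dataclasses import dataclass


@dataclass
class ProjectProposal:
    is_regulatory: bool        # driven by a regulatory requirement
    creates_new_app: bool      # will create a new application
    impacts_master_data: bool  # will impact master data sources
    converges_requests: bool   # converges numerous similar requests


def is_strategic(p: ProjectProposal) -> bool:
    # A proposal passing any one of the four tests is treated as strategic.
    return any([p.is_regulatory, p.creates_new_app,
                p.impacts_master_data, p.converges_requests])


# Example: a project driven by a regulatory requirement is strategic.
regulatory_build = ProjectProposal(True, False, False, False)
print(is_strategic(regulatory_build))  # True
```

The point of the framework is that a single affirmative answer is enough; a proposal failing all four tests would be handled tactically instead.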

A panel session looking specifically at KYC and client onboarding noted that both are becoming more difficult as regulatory data requirements continue to escalate. With the industry past initial implementation, which poured resources into the problem, automation of KYC and onboarding processes has become essential, although, as Bill Hauserman, senior director of compliance solutions at Bureau van Dijk, said, the intersection of technology and data can only be successful when data quality is high. Touching on the utility response to KYC, Matt Stauffer, CEO at DTCC Clarient, said adoption of the utility model has accelerated over recent months.

With change driven in great part by regulation, a panel of experts identified regulations that are proving particularly difficult to implement. Among them were Solvency II, Markets in Financial Instruments Directive II (MiFID II), Market Abuse Regulation (MAR) and the Fundamental Review of the Trading Book (FRTB), all of which are very demanding in terms of data and add complexity to data management.

BCBS 239 was also in the frame, with John Fleming, head of enterprise data governance at BNY Mellon, describing how the bank has implemented a data governance framework and data distribution hub to ensure compliance.

Spanning change, organisation and regulation, data quality and emerging technologies were subjects of robust discussion during panel sessions. The importance of data quality was emphasised time and time again, data quality initiatives were described, quality metrics considered and potential solutions such as data utilities, machine learning and big data management techniques examined.

A panel dedicated to emerging technologies and fintech focused on blockchain and found it to have significant potential, but only if data management is under control and security issues are resolved. Marc Alvarez, chief data officer at Mizuho, concluded: “Blockchain is just a better way to do what we do better. It may cause disruption, but it is not revolutionary, and a demo of a positive business model could cause a stampede.”
