The knowledge platform for the financial technology industry

A-Team Insight Blogs

A-Team Data Management Summit Outlines Elements of Optimal Data Management


Organising for optimal data management, transformation challenges, operational models, data quality, emerging technologies, data governance and, of course, regulation were top agenda items at A-Team Group’s recent Data Management Summit in New York City.

The first keynote speaker, David Luria, vice president, front office data intelligence programs at T. Rowe Price, set the scene for the day with a presentation entitled Un-Ready, Un-Willing and Un-Able: Why Most Data Projects Fail and How to Win Instead. Luria said most projects fail to deliver on their promise because they address the wrong problem, use the wrong resources or take the wrong approach. On the upside, he suggested successful data projects are built on checklists, people capabilities and communication, with checklists including enterprise objectives, programme strategy and scope.

With these issues in mind, the conference turned its attention to a panel session on organising for optimal data management. Chris Vickery, global head of enterprise data management at Nomura, noted the post-crisis focus on regulation and internal risk, and the subsequent push to centralise and govern data, a move that raises questions about where data governance fits into an organisation and how decisions are made.

Answering these questions, panel members said data governance should be a business function that drives value out of data assets, while big decisions on projects such as BCBS 239 compliance should be made at the top of the house. Contrary to some opinion, the panel agreed that regulation can help firms develop best practice and that BCBS 239 in particular is helping banks move towards optimal data management.

Reiterating the need for senior management engagement in big decisions and projects, a panel covering transformational challenges and new operational models discussed the trade-off between strategic and tactical projects, and noted the continual need to convey the value delivered by data management to the business.

Considering the strategic versus tactical approach, Tom Wise, head of enterprise data management at GMO, described a strategic decision framework the firm uses to decide whether a project is strategic. The framework comprises four tests: is the proposed project regulatory, will it create a new application, will it impact master data sources, and is it a convergence of numerous requests? A project that meets any one of these criteria is classed as strategic.
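The four-test framework described above can be sketched as a simple classification rule. The test names, data structure and threshold (any one test passing) below are illustrative assumptions based on the article's description, not GMO's actual implementation:

```python
# Hypothetical sketch of the four-test strategic decision framework
# described by GMO's Tom Wise. Field names and structure are assumptions
# for illustration; the article does not specify an implementation.
from dataclasses import dataclass


@dataclass
class ProjectProposal:
    is_regulatory: bool        # driven by a regulatory requirement?
    creates_new_app: bool      # will it create a new application?
    impacts_master_data: bool  # will it impact master data sources?
    converges_requests: bool   # is it a convergence of numerous requests?


def is_strategic(p: ProjectProposal) -> bool:
    # A project meeting any one of the four tests is classed as
    # strategic; otherwise it is handled tactically.
    return any([p.is_regulatory, p.creates_new_app,
                p.impacts_master_data, p.converges_requests])
```

For example, a MiFID II reporting project would pass the regulatory test alone and be classed as strategic, while a one-off report request failing all four tests would remain tactical.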

A panel session looking specifically at KYC and client onboarding noted that both are becoming more difficult as regulatory data requirements continue to escalate. With the industry past initial implementations that poured resources into the problem, automation of KYC and onboarding processes has become essential, although as Bill Hauserman, senior director of compliance solutions at Bureau van Dijk, said, the intersection of technology and data can only be successful when data quality is high. Touching on the utility response to KYC, Matt Stauffer, CEO at DTCC Clarient, said adoption of the utility model has accelerated over recent months.

With change driven in great part by regulation, a panel of experts identified regulations that are proving particularly difficult to implement. Among them were Solvency II, Markets in Financial Instruments Directive II (MiFID II), Market Abuse Regulation (MAR) and the Fundamental Review of the Trading Book (FRTB), all of which are very demanding in terms of data and add complexity to data management.

BCBS 239 was also in the frame, with John Fleming, head of enterprise data governance at BNY Mellon, describing how the bank has implemented a data governance framework and data distribution hub to ensure compliance.

Spanning change, organisation and regulation, data quality and emerging technologies were subjects of robust discussion during panel sessions. The importance of data quality was emphasised time and time again, data quality initiatives were described, quality metrics considered and potential solutions such as data utilities, machine learning and big data management techniques examined.

A panel dedicated to emerging technologies and fintech focused on blockchain and found it to have significant potential, but only if data management is under control and security issues are resolved. Marc Alvarez, chief data officer at Mizuho, concluded: “Blockchain is just a better way to do what we do better. It may cause disruption, but it is not revolutionary, and a demo of a positive business model could cause a stampede.”


Related content

WEBINAR

Recorded Webinar: Navigating a Complex World: Best Data Practices in Sanctions Screening

As rising geopolitical uncertainty prompts an intensification in the complexity and volume of global economic and financial sanctions, banks and financial institutions are faced with a daunting set of new compliance challenges. The risk of inadvertently engaging with sanctioned securities has never been higher and the penalties for doing so are harsh. Traditional sanctions screening...

BLOG

Northern Trust Highlights Asset Owners’ Data Challenge in Private Markets

Much is spoken of the data challenges that institutional asset managers are facing as they redraw their business models to meet the demands of a new economic environment, but less is said of asset owners, who are undergoing their own operational transformations. For them, the data journey is just as challenging; as their operational models...

EVENT

Data Management Summit New York City

Now in its 15th year, the Data Management Summit NYC brings together the North American data management community to explore how data strategy is evolving to drive business outcomes and speed to market in changing times.

GUIDE

Enterprise Data Management

The current financial crisis has highlighted that financial institutions do not have a sufficient handle on their data and has prompted many of these institutions to re-evaluate their approaches to data management. Moreover, the increased regulatory scrutiny of the financial services community during the past year has meant that data management has become a key...