A-Team Data Management Summit Outlines Elements of Optimal Data Management

Organising for optimal data management, transformation challenges, operational models, data quality, emerging technologies, data governance and, of course, regulation were top agenda items at A-Team Group’s recent Data Management Summit in New York City.

The first keynote speaker, David Luria, vice president, front office data intelligence programs at T. Rowe Price, set the scene for the day with a presentation entitled 'Un-Ready, Un-Willing and Un-Able: Why Most Data Projects Fail and How to Win Instead'. Luria said most projects fail to deliver on their promise because they address the wrong problem, use the wrong resources or take the wrong approach. On the upside, he suggested successful data projects are built on checklists, people capabilities and communication, with checklists including enterprise objectives, programme strategy and scope.

With these issues in mind, the conference turned its attention to a panel session on organising for optimal data management. Chris Vickery, global head of enterprise data management at Nomura, noted the post-crisis focus on regulation and internal risk, and the subsequent push to centralise and govern data, a move that raises questions about where data governance fits into an organisation and how decisions are made.

Answering these questions, panel members said data governance should be a business function that drives value out of data assets, while big decisions on projects such as BCBS 239 compliance should be made at the top of the house. Contrary to some opinion, the panel agreed that regulation can help firms develop best practice and that BCBS 239 in particular is helping banks move towards optimal data management.

Reiterating the need for senior management engagement in big decisions and projects, a panel covering transformational challenges and new operational models discussed the trade-off between strategic and tactical projects, and noted the continual need to convey the value delivered by data management to the business.

Considering the strategic versus tactical approach, Tom Wise, head of enterprise data management at GMO, described a strategic decision framework used by the firm, which applies four tests to decide whether a project is strategic. The tests ask whether a proposed project is regulatory, will create a new application, will impact master data sources, or is a convergence of numerous requests. If the project meets any one of these criteria, it is treated as strategic.
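As a rough illustration of how such a four-test rule might be expressed in practice, the sketch below classifies a project request as strategic if any test passes. The field names and the ProjectRequest structure are hypothetical, not GMO's actual framework.

```python
# Hypothetical sketch of a four-test strategic classification rule;
# field names are illustrative, not GMO's actual framework.
from dataclasses import dataclass

@dataclass
class ProjectRequest:
    is_regulatory: bool          # driven by a regulatory requirement
    creates_new_app: bool        # will result in a new application
    impacts_master_data: bool    # touches master data sources
    converges_requests: bool     # consolidates numerous similar requests

def is_strategic(project: ProjectRequest) -> bool:
    """A project is treated as strategic if it passes any one of the four tests."""
    return any([
        project.is_regulatory,
        project.creates_new_app,
        project.impacts_master_data,
        project.converges_requests,
    ])

# Example: a regulatory-driven request is classified as strategic.
print(is_strategic(ProjectRequest(True, False, False, False)))  # True
```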

A panel session looking specifically at KYC and client onboarding noted that both are becoming more difficult as regulatory data requirements continue to escalate. With the industry past initial implementation, which poured resources into the problem, automation of KYC and onboarding processes has become essential. However, as Bill Hauserman, senior director of compliance solutions at Bureau van Dijk, said, the intersection of technology and data can only be successful when data quality is high. Touching on the utility response to KYC, Matt Stauffer, CEO at DTCC Clarient, said adoption of the utility model has accelerated over recent months.

With change driven in great part by regulation, a panel of experts identified regulations that are proving particularly difficult to implement. Among them were Solvency II, Markets in Financial Instruments Directive II (MiFID II), Market Abuse Regulation (MAR) and the Fundamental Review of the Trading Book (FRTB), all of which are very demanding in terms of data and add complexity to data management.

BCBS 239 was also in the frame, with John Fleming, head of enterprise data governance at BNY Mellon, describing how the bank has implemented a data governance framework and data distribution hub to ensure compliance.

Spanning change, organisation and regulation, data quality and emerging technologies were subjects of robust discussion during panel sessions. The importance of data quality was emphasised time and again, with data quality initiatives described, quality metrics considered, and potential solutions such as data utilities, machine learning and big data management techniques examined.

A panel dedicated to emerging technologies and fintech focused on blockchain and found it to have significant potential, but only if data management is under control and security issues are resolved. Marc Alvarez, chief data officer at Mizuho, concluded: “Blockchain is just a better way to do what we do better. It may cause disruption, but it is not revolutionary, and a demo of a positive business model could cause a stampede.”
