The knowledge platform for the financial technology industry

A-Team Insight Blogs

Getting Data Right is Crucial to Deriving Value From AI: DMI Webinar Review


Capital markets participants are struggling with data sourcing and cleansing as they deploy artificial intelligence to streamline operations, improve customer relations and add value to their services, according to the latest A-Team Group poll.

In a survey of attendees at last week’s Data Management Insight webinar on data quality for AI, it also emerged that more than half of respondents had yet to take their AI integration programmes beyond the planning stage, while a fifth had yet even to consider such a move.

The results bore out a key theme of the broader discussion of the webinar, which was entitled “In Data We Trust – How to Ensure High-Quality Data to Power AI”: that only with good quality data can organisations expect to obtain valuable outputs from their AI applications.

The panellists agreed that poor-quality data can lead to flawed AI outputs resulting in operational inefficiencies, regulatory breaches and reputational damage.

Marla Dans, former head of data management and governance at Chicago Trading Company, said she was surprised by the high proportion of respondents who had yet to get their AI transformations off the ground, telling the webinar she would have expected more organisations to have “well established” programmes in place.

Mike Pickart, director, technical sales at Informatica, said the results bore out his experience that some organisations are much further along the AI integration journey than others.

Taking a separate view, Alpesh Doshi, managing partner at Redcliffe Capital, argued that the financial industry has been slow to take the steps necessary to improve the quality of its data.

That’s partly because of the cost of managing data and also because many companies have had difficulty making the business case for such an extensive endeavour. The solution, Doshi said, is likely to be found in the large language models (LLMs) that provide the backbone for generative AI (GenAI), which can provide a “shortcut” to improving data.

Getting it Right Early

The importance of assuring data quality at the outset of any AI implementation was illustrated by Pickart, who related the experience of a prominent US firm that had to switch off a large GenAI initiative after five days: the flawed data being fed into it was preventing it from achieving its intended objectives.

Dans said that experiences like this showed that without trusted data, valuable business opportunities would be missed.

The panellists also observed a shift in perspective, with data governance now viewed as essential for unlocking the full potential of AI and creating a sustainable competitive advantage.

They noted, too, the growing board-level recognition of the strategic importance of data quality, driven by the promise of AI-powered innovation and the potential risks associated with inadequate data management. The increased focus on data is also elevating the role of the Chief Data Officer (CDO). Panellists noted that the CDO is increasingly becoming a strategic adviser to corporate boards, responsible for ensuring data integrity across organisations and aligning data strategy with business objectives.

The emergence of the Chief Data and Analytics Officer (CDAO) title further underscores the convergence of data management and advanced analytics. Dans cited the example of JP Morgan, where the CDAO now holds a board-level position on par with the CFO and COO.

Data Attributes

A number of factors are militating against a seamless AI rollout, the webinar heard. Among them are fragmented data setups across departments, cultural resistance to change within organisations and the challenges of managing unstructured data – a subject that will be dwelt upon in the next Data Management Insight webinar, to be held on March 27.

Offering solutions, panellists agreed that there is a long list of data attributes that should be addressed to ensure good quality data. Accuracy was Doshi’s priority; he argued that without proper training data, GenAI models cannot perform optimally or safely. The panel also stressed the importance of output explainability – one of the requirements of the new EU AI Act.

Pickart added that it was critical that organisations are able to monitor, understand and build controls around the data they are feeding into their AI models.
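The kind of controls Pickart describes can be as simple as a screening gate that reports how much of a dataset clears basic checks before it reaches a model. The sketch below is a minimal, hypothetical illustration in Python – the field name, thresholds and sample records are invented for the example, not drawn from the webinar:

```python
from dataclasses import dataclass

@dataclass
class QualityReport:
    total: int
    missing: int
    out_of_range: int

    @property
    def pass_rate(self) -> float:
        # Fraction of records that cleared every check.
        bad = self.missing + self.out_of_range
        return (self.total - bad) / self.total if self.total else 0.0

def screen_records(records, required_field, lo, hi):
    """Screen raw records before they reach a model's input pipeline."""
    missing = sum(1 for r in records if r.get(required_field) is None)
    out_of_range = sum(
        1 for r in records
        if r.get(required_field) is not None
        and not (lo <= r[required_field] <= hi)
    )
    return QualityReport(len(records), missing, out_of_range)

# Illustrative sample data only.
trades = [
    {"price": 101.5},
    {"price": None},   # missing value - flagged
    {"price": -3.0},   # outside plausible range - flagged
    {"price": 99.9},
]
report = screen_records(trades, "price", lo=0.0, hi=10_000.0)
print(report.pass_rate)  # → 0.5
```

A gate like this makes data quality measurable: a pipeline can refuse to feed the model when the pass rate drops below an agreed threshold, rather than discovering flawed inputs after the fact, as in the US firm’s experience above.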

The panellists noted that establishing consistent definitions, formats and relationships between data elements is fundamental. They highlighted the growing importance of moving beyond traditional data cataloguing approaches to using semantic models and knowledge graphs for capturing the deeper meaning and connections within data. Advances in LLMs are making it increasingly feasible to automate and accelerate the process of data standardisation, reducing reliance on manual effort and traditional tools.
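The difference between a flat catalogue and a knowledge graph is that the latter stores the relationships between data elements, not just their names. The following is a minimal sketch of the idea – a toy triple store in Python, with invented element names; real deployments would use an RDF store or graph database rather than this hand-rolled class:

```python
from collections import defaultdict

class KnowledgeGraph:
    """Toy triple store: facts are (subject, predicate, object) edges."""

    def __init__(self):
        self.by_subject = defaultdict(set)

    def add(self, subj, pred, obj):
        # Record one relationship about a data element.
        self.by_subject[subj].add((pred, obj))

    def related(self, subj):
        # Every fact known about a data element, in stable order.
        return sorted(self.by_subject[subj])

kg = KnowledgeGraph()
# Illustrative element names: capture meaning and lineage,
# which a flat catalogue entry ("trade.price: float") would miss.
kg.add("trade.price", "has_unit", "USD")
kg.add("trade.price", "derived_from", "market_feed.last")
kg.add("market_feed.last", "sourced_from", "exchange_gateway")

print(kg.related("trade.price"))
# → [('derived_from', 'market_feed.last'), ('has_unit', 'USD')]
```

Because relationships are explicit, questions such as “where does this field come from?” become graph traversals – which is also the kind of structure an LLM can help populate automatically from documentation and schemas.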

Casting an eye to the future, panellists said that organisations need to rethink their approach to data management and see it as a core part of their operations, something that could be more easily achieved through LLMs. This and other new technologies should be embraced to ensure a robust data management strategy that works not only for AI but also for the entire enterprise’s workflows, the experts said.

  • Join us at our European showcase event, Data Management Summit London 2025, on March 20. Register here.
  • To catch up on this webinar, register here.
