About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Data Migration Problems Haven’t Changed in 10 Years, Say Datanomic and Iergo


The same problems that were occurring in the market 10 years ago around data migration are still with us today, according to compliance and data management solution vendor Datanomic and data migration-focused vendor Iergo. The vendors discussed the importance of data quality tools in the migration process at an event in London yesterday.

Businesses are encountering the same obstacles and making the same mistakes with their data migration projects as they did 10 years ago, explained Johny Morris, CEO of Iergo, and Jonathan Pell, CEO of Datanomic. “Too many projects leave data migration to the end of the time line and underestimate how long it will take to resolve issues. The attitude of a ‘throw it at the database and see what sticks’ unfortunately still dominates most projects. There is also too little pre-emptive data analysis. Even with best practice models, too many projects rely on inventing their own approach, with not enough thought given to aligning the migration with the actual business needs,” said Morris.

The vendors pointed to research from Bloor, conducted in 2007, that indicates 84% of data migration activities fail to hit time or budget targets. These problems continue to occur in spite of the advances made in IT software, skills and project management. Data is delivered to the wrong place at the wrong time, and sometimes it is even the wrong data, said the speakers.

“In many instances of data migration failure, key data stakeholder analysis has not been performed at all or is simply not adequate. With most data migration projects, technology remains largely separate from the business until problems begin to manifest on the business side. Inadequate data preparation, poor business engagement and underestimating project complexity are three of the biggest issues we see time and again,” said Pell.

The problem is not just down to IT challenges; it is a business issue, according to Steve Tuck, chief strategy officer at Datanomic. Little attention is paid to data contents, some data is already broken and some gets lost or damaged in transit, he explained.

In order to tackle these issues, Tuck highlighted the potential uses for data quality tools to better understand, improve, control and protect the data. Data profiling tools can allow firms to discover potential issues with their data and allow users to collaborate with subject matter experts from the business to better understand the data inaccuracies. “Data profiling allows users to be able to validate business critical information with targeted rules, screen out dummy and default values, identify missing and mis-fielded data, spot minimum state issues and refine migration mappings,” he elaborated.
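The profiling checks Tuck describes can be illustrated with a short sketch. The field names, dummy-value list and postcode rule below are illustrative assumptions, not Datanomic's actual rules; the idea is simply to show targeted validation rules flagging dummy values and mis-fielded data during a migration.

```python
import re

# Hypothetical sample of records staged for migration; field names are illustrative.
records = [
    {"id": "C001", "name": "Acme Ltd", "postcode": "EC1A 1BB", "phone": "020 7946 0000"},
    {"id": "C002", "name": "TBC", "postcode": "", "phone": "N/A"},            # dummy/default values
    {"id": "C003", "name": "Beta Plc", "postcode": "Beta Plc", "phone": ""},  # mis-fielded data
]

DUMMY_VALUES = {"TBC", "N/A", "UNKNOWN", ""}                    # assumed dummy markers
POSTCODE_RE = re.compile(r"^[A-Z]{1,2}\d[A-Z\d]? ?\d[A-Z]{2}$")  # rough UK postcode rule

def profile(record):
    """Return a list of data-quality issues found in one record."""
    issues = []
    for field, value in record.items():
        if value.strip().upper() in DUMMY_VALUES:
            issues.append(f"{field}: dummy or missing value ({value!r})")
    if record["postcode"] and not POSTCODE_RE.match(record["postcode"]):
        issues.append(f"postcode: fails validation rule ({record['postcode']!r})")
    return issues

# Profile every record before it is mapped to the target system.
report = {r["id"]: profile(r) for r in records}
```

Running rules like these before the load, rather than after, is what turns profiling into the "pre-emptive data analysis" Morris says most projects skip.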

Firms should be able to improve their data sets with these tools by building rules from the data rather than documentation, said Tuck. They would also be enabled to clean and match logical entities to remove data duplication, a common problem across siloed systems. On an ongoing basis, data profiling tools are aimed at providing insight throughout the migration process. “Firms can set quality thresholds as well as ones based on the volume of data moved,” he added.
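The matching and threshold ideas above can be sketched as follows. The normalisation rule, sample systems and 95% threshold are assumptions for illustration only: entities from two siloed systems are matched on a normalised key to surface duplicates, and a batch is halted if the share of valid records falls below a quality threshold.

```python
def normalise(name: str) -> str:
    """Crude matching key: lowercase and strip non-alphanumerics (assumed rule)."""
    return "".join(ch for ch in name.lower() if ch.isalnum())

# Hypothetical entity names held separately in two siloed systems.
system_a = ["Acme Ltd", "Beta PLC", "Gamma Corp"]
system_b = ["ACME LTD.", "Delta Inc", "beta plc"]

# Entities present in both systems are duplication candidates.
matched = {normalise(a) for a in system_a} & {normalise(b) for b in system_b}

def check_quality(valid: int, total: int, threshold: float = 0.95) -> bool:
    """Gate the migration batch on an assumed 95% quality threshold."""
    return (valid / total) >= threshold

batch_ok = check_quality(valid=97, total=100)
```

In practice, commercial tools apply fuzzier matching than an exact normalised key, but the gating pattern is the same: measure, compare against the threshold, and stop the batch rather than loading broken data.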

Obviously, the event was based around the promotion of Datanomic’s own data profiling-based solution, which competes with offerings from other vendors in the space such as DataFlux and Datactics. Earlier this year, the vendor launched a new version of its dn:Director solution, which is aimed at tracking data quality across an institution. Version 7.1 of the solution includes an upgraded front end for ease of use and additional functionality for third-party products, according to the vendor.
