About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Data Migration Problems Haven’t Changed in 10 Years, Say Datanomic and Iergo


The data migration problems the market faced 10 years ago are still with us today, according to compliance and data management solution vendor Datanomic and data migration-focused vendor Iergo. The two vendors discussed the importance of data quality tools in the migration process at an event in London yesterday.

Businesses are encountering the same obstacles and making the same mistakes with their data migration projects as they did 10 years ago, explained Johny Morris, CEO of Iergo, and Jonathan Pell, CEO of Datanomic. “Too many projects leave data migration to the end of the time line and underestimate how long it will take to resolve issues. The attitude of a ‘throw it at the database and see what sticks’ unfortunately still dominates most projects. There is also too little pre-emptive data analysis. Even with best practice models, too many projects rely on inventing their own approach, with not enough thought given to aligning the migration with the actual business needs,” said Morris.

The vendors pointed to research from Bloor, conducted in 2007, indicating that 84% of data migration activities fail to hit time or budget targets. These problems persist in spite of the advances made in IT software, skills and project management. Data is delivered to the wrong place at the wrong time, and sometimes it is even the wrong data, said the speakers.

“In many instances of data migration failure, key data stakeholder analysis has not been performed at all or is simply not adequate. With most data migration projects, technology remains largely separate from the business until problems begin to manifest on the business side. Inadequate data preparation, poor business engagement and underestimating project complexity are three of the biggest issues we see time and again,” said Pell.

The problem is not just down to IT challenges; it is a business issue, according to Steve Tuck, chief strategy officer at Datanomic. Little attention is paid to data contents, some data is already broken and some gets lost or damaged in transit, he explained.

In order to tackle these issues, Tuck highlighted the potential uses for data quality tools to better understand, improve, control and protect the data. Data profiling tools can allow firms to discover potential issues with their data and allow users to collaborate with subject matter experts from the business to better understand the data inaccuracies. “Data profiling allows users to be able to validate business critical information with targeted rules, screen out dummy and default values, identify missing and mis-fielded data, spot minimum state issues and refine migration mappings,” he elaborated.
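The checks Tuck lists (targeted validation rules, screening out dummy and default values, and flagging missing data) can be illustrated with a minimal sketch. The record structure, field names and dummy-value list below are hypothetical assumptions, not anything Datanomic's product actually uses:

```python
# Minimal profiling sketch: targeted validation rules, dummy/default-value
# screening and missing-field detection over simple record dicts.
# Field names and sentinel values are illustrative assumptions.

DUMMY_VALUES = {"N/A", "TBD", "XXX", "9999", ""}

records = [
    {"account_id": "A-1001", "currency": "GBP", "balance": "2500.00"},
    {"account_id": "A-1002", "currency": "XXX", "balance": "TBD"},
    {"account_id": "", "currency": "USD", "balance": "-100.00"},
]

def profile(records, required_fields):
    """Return (row, field, problem) tuples for every failed check."""
    issues = []
    for i, rec in enumerate(records):
        for field in required_fields:
            value = rec.get(field, "").strip()
            if not value:
                issues.append((i, field, "missing"))
            elif value in DUMMY_VALUES:
                issues.append((i, field, "dummy/default value"))
    return issues

issues = profile(records, ["account_id", "currency", "balance"])
for row, field, problem in issues:
    print(f"record {row}: {field} -> {problem}")
```

In practice such findings would be reviewed with the business's subject matter experts, as the speakers describe, before refining the migration mappings.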

Firms should be able to improve their data sets with these tools by building rules from the data rather than from documentation, said Tuck. The tools should also enable them to clean and match logical entities to remove duplication, a common problem across siloed systems. On an ongoing basis, data profiling tools are aimed at providing insight throughout the migration process. “Firms can set quality thresholds as well as ones based on the volume of data moved,” he added.
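The duplicate-matching and quality-threshold ideas above can be sketched as follows. The normalisation rule, the example counterparty names and the 10% threshold are all illustrative assumptions:

```python
# Sketch of matching a logical entity (a counterparty name) across siloed
# records, plus a simple quality-threshold gate on a migration batch.
# The matching key and threshold value are assumed, not vendor-specified.

def normalise(name):
    # Crude matching key: lowercase, drop punctuation, strip legal suffixes.
    key = "".join(ch for ch in name.lower() if ch.isalnum() or ch == " ")
    for suffix in (" ltd", " limited", " plc", " inc"):
        if key.endswith(suffix):
            key = key[: -len(suffix)]
    return key.strip()

names = ["Acme Ltd", "ACME Limited", "Acme, Ltd.", "Beta PLC", "Beta plc"]

# Group records that resolve to the same logical entity.
groups = {}
for name in names:
    groups.setdefault(normalise(name), []).append(name)

duplicates = {k: v for k, v in groups.items() if len(v) > 1}

# Quality gate: flag the batch if the duplicate rate exceeds a threshold.
failed, total = len(names) - len(groups), len(names)
THRESHOLD = 0.10  # at most 10% duplicate records per batch (assumed figure)
batch_ok = failed / total <= THRESHOLD
```

Real data quality tooling uses far more sophisticated fuzzy matching, but the shape of the process, building a matching key from the data itself and gating migration on measured quality, is the same.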

The event was, of course, built around the promotion of Datanomic’s own data profiling-based solution, which competes with offerings from other vendors in the space such as DataFlux and Datactics. Earlier this year, the vendor launched a new version of its dn:Director solution, which is aimed at tracking data quality across an institution. Version 7.1 includes an upgraded front end for ease of use and additional functionality for third-party products, according to the vendor.
