The knowledge platform for the financial technology industry

A-Team Insight Blogs

Data Migration Problems Haven’t Changed in 10 Years, Say Datanomic and Iergo

The same problems that were occurring in the market 10 years ago around data migration are still with us today, according to compliance and data management solution vendor Datanomic and data migration focused vendor Iergo. The vendors discussed the importance of data quality tools in the migration process at an event in London yesterday.

Businesses are encountering the same obstacles and making the same mistakes with their data migration projects as they did 10 years ago, explained Johny Morris, CEO of Iergo, and Jonathan Pell, CEO of Datanomic. “Too many projects leave data migration to the end of the time line and underestimate how long it will take to resolve issues. The attitude of a ‘throw it at the database and see what sticks’ unfortunately still dominates most projects. There is also too little pre-emptive data analysis. Even with best practice models, too many projects rely on inventing their own approach, with not enough thought given to aligning the migration with the actual business needs,” said Morris.

The vendors pointed to research from Bloor, conducted in 2007, indicating that 84% of data migration activities fail to hit time or budget targets. These problems continue to occur in spite of the advances made in IT software, skills and project management. Data is delivered to the wrong place at the wrong time, and sometimes it is even the wrong data, said the speakers.

“In many instances of data migration failure, key data stakeholder analysis has not been performed at all or is simply not adequate. With most data migration projects, technology remains largely separate from the business until problems begin to manifest on the business side. Inadequate data preparation, poor business engagement and underestimating project complexity are three of the biggest issues we see time and again,” said Pell.

The problem is not just down to IT challenges; it is a business issue, according to Steve Tuck, chief strategy officer at Datanomic. Little attention is paid to data contents, some data is already broken and some gets lost or damaged in transit, he explained.

In order to tackle these issues, Tuck highlighted the potential uses for data quality tools to better understand, improve, control and protect the data. Data profiling tools can allow firms to discover potential issues with their data and allow users to collaborate with subject matter experts from the business to better understand the data inaccuracies. “Data profiling allows users to be able to validate business critical information with targeted rules, screen out dummy and default values, identify missing and mis-fielded data, spot minimum state issues and refine migration mappings,” he elaborated.
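The profiling checks Tuck describes can be sketched in code. The following is a minimal illustration only, with hypothetical field names and rules (ISIN validation, a dummy-value list, mandatory fields) that are assumptions for the example, not the logic of any vendor's product:

```python
import re

# Assumed dummy/default values often found in legacy systems (illustrative)
DUMMY_VALUES = {"N/A", "TBD", "XXX", "9999", ""}

def profile_record(record: dict) -> list[str]:
    """Return a list of data quality issues found in a single record."""
    issues = []
    # Targeted rule: an ISIN is two letters, nine alphanumerics, one check digit
    isin = record.get("isin", "")
    if not re.fullmatch(r"[A-Z]{2}[A-Z0-9]{9}[0-9]", isin):
        issues.append(f"invalid ISIN: {isin!r}")
    # Screen out dummy and default values
    for field, value in record.items():
        if str(value).strip().upper() in DUMMY_VALUES:
            issues.append(f"dummy/default value in {field}")
    # Identify missing mandatory data
    for field in ("isin", "counterparty", "trade_date"):
        if not record.get(field):
            issues.append(f"missing mandatory field: {field}")
    return issues
```

Run against a sample of the source system before migration begins, checks like these surface the "minimum state" and mis-fielded data problems early, rather than at load time.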

Firms should be able to improve their data sets with these tools by building rules from the data rather than from documentation, said Tuck. The tools also enable them to clean and match logical entities to remove data duplication, a common problem across siloed systems. On an ongoing basis, data profiling tools are aimed at providing insight throughout the migration process. “Firms can set quality thresholds as well as ones based on the volume of data moved,” he added.
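The entity-matching idea above can be illustrated with a simple normalisation key that makes trivially different records from siloed systems collide. The normalisation rules here (lower-casing, stripping corporate suffixes) are assumptions for the sketch, not vendor logic; real matching engines use far richer fuzzy techniques:

```python
from collections import defaultdict

def match_key(name: str) -> str:
    """Normalise an entity name so near-identical records share a key."""
    key = name.lower().strip()
    # Illustrative suffix list; a production tool would use fuzzy matching
    for suffix in (" ltd", " limited", " plc", " inc"):
        key = key.removesuffix(suffix)
    return " ".join(key.split())

def find_duplicates(records: list[str]) -> dict[str, list[str]]:
    """Group records by normalised key and keep only groups with duplicates."""
    groups = defaultdict(list)
    for name in records:
        groups[match_key(name)].append(name)
    return {k: v for k, v in groups.items() if len(v) > 1}
```

For example, "Acme Ltd" and "ACME Limited" reduce to the same key and are flagged as a candidate duplicate pair, which is exactly the class of cross-silo redundancy the speakers describe.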

Obviously, the event was based around the promotion of Datanomic’s own data profiling-based solution, which competes with the offerings of other vendors in the space such as DataFlux and Datactics. Earlier this year, the vendor launched a new version of its dn:Director solution, which is aimed at tracking data quality across an institution. Version 7.1 of the solution includes an upgraded front end for ease of use and additional functionality for third party products, according to the vendor.
