About A-Team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Data Migration Problems Haven’t Changed in 10 Years, Say Datanomic and Iergo


The same data migration problems that plagued the market 10 years ago are still with us today, according to compliance and data management solution vendor Datanomic and data migration specialist Iergo. The two vendors discussed the importance of data quality tools in the migration process at an event in London yesterday.

Businesses are encountering the same obstacles and making the same mistakes with their data migration projects as they did 10 years ago, explained Johny Morris, CEO of Iergo, and Jonathan Pell, CEO of Datanomic. “Too many projects leave data migration to the end of the time line and underestimate how long it will take to resolve issues. The attitude of a ‘throw it at the database and see what sticks’ unfortunately still dominates most projects. There is also too little pre-emptive data analysis. Even with best practice models, too many projects rely on inventing their own approach, with not enough thought given to aligning the migration with the actual business needs,” said Morris.

The vendors pointed to research from Bloor, conducted in 2007, indicating that 84% of data migration projects fail to hit time or budget targets. These problems persist despite advances in IT software, skills and project management. Data is delivered to the wrong place at the wrong time, and sometimes it is even the wrong data, said the speakers.

“In many instances of data migration failure, key data stakeholder analysis has not been performed at all or is simply not adequate. With most data migration projects, technology remains largely separate from the business until problems begin to manifest on the business side. Inadequate data preparation, poor business engagement and underestimating project complexity are three of the biggest issues we see time and again,” said Pell.

The problem is not just down to IT challenges; it is a business issue, according to Steve Tuck, chief strategy officer at Datanomic. Little attention is paid to data contents, some data is already broken and some gets lost or damaged in transit, he explained.

In order to tackle these issues, Tuck highlighted the potential uses for data quality tools to better understand, improve, control and protect the data. Data profiling tools can allow firms to discover potential issues with their data and allow users to collaborate with subject matter experts from the business to better understand the data inaccuracies. “Data profiling allows users to be able to validate business critical information with targeted rules, screen out dummy and default values, identify missing and mis-fielded data, spot minimum state issues and refine migration mappings,” he elaborated.
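The kinds of profiling checks Tuck describes can be sketched in a few lines of Python. The field names, validation patterns and sample records below are hypothetical illustrations, not Datanomic's actual rules or tooling:

```python
import re

# Hypothetical sample records from a source system awaiting migration.
records = [
    {"account_id": "ACC-1001", "bic": "DEUTDEFF", "balance": "2500.00"},
    {"account_id": "TEST",     "bic": "DEUTDEFF", "balance": "0.00"},      # dummy value
    {"account_id": "ACC-1003", "bic": "",         "balance": "1200.50"},   # missing field
    {"account_id": "ACC-1004", "bic": "1200.75",  "balance": "DEUTDEFF"},  # mis-fielded
]

DUMMY_VALUES = {"TEST", "N/A", "XXX"}                       # screen out dummy/default values
BIC_PATTERN = re.compile(r"^[A-Z]{6}[A-Z0-9]{2}([A-Z0-9]{3})?$")
AMOUNT_PATTERN = re.compile(r"^\d+\.\d{2}$")

def profile(record):
    """Return a list of data quality issues found in one record."""
    issues = []
    if record["account_id"] in DUMMY_VALUES:
        issues.append("dummy account_id")
    if not record["bic"]:
        issues.append("missing bic")
    elif not BIC_PATTERN.match(record["bic"]):
        issues.append("invalid bic")
    if not AMOUNT_PATTERN.match(record["balance"]):
        issues.append("invalid balance")
    # Mis-fielded data: a BIC-shaped value sitting in the balance column.
    if BIC_PATTERN.match(record["balance"]):
        issues.append("mis-fielded: BIC found in balance")
    return issues

report = {r["account_id"]: profile(r) for r in records}
```

Running the rules across a source extract like this, before the migration rather than at the end, is what surfaces the issues the speakers say are usually discovered too late.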

Firms should be able to improve their data sets with these tools by building rules from the data itself rather than from documentation, said Tuck. The tools should also enable them to cleanse and match logical entities to remove duplication, a common problem across siloed systems. On an ongoing basis, data profiling tools aim to provide insight throughout the migration process. “Firms can set quality thresholds as well as ones based on the volume of data moved,” he added.
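The entity matching and quality thresholds described above can be sketched the same way. The counterparty records, suffix list and 5% threshold here are illustrative assumptions only, not Datanomic's implementation:

```python
from collections import defaultdict

# Hypothetical counterparty records spread across siloed systems.
entities = [
    {"system": "CRM",        "name": "Acme Holdings Ltd.",    "country": "GB"},
    {"system": "Settlement", "name": "ACME HOLDINGS LIMITED", "country": "GB"},
    {"system": "Risk",       "name": "Beta Capital LLC",      "country": "US"},
]

# Legal-form suffixes to ignore when comparing names.
SUFFIXES = {"ltd", "ltd.", "limited", "llc", "inc", "inc."}

def match_key(record):
    """Normalise a name so the same logical entity collides
    across systems despite formatting differences."""
    tokens = [t for t in record["name"].lower().split() if t not in SUFFIXES]
    return (" ".join(tokens), record["country"])

groups = defaultdict(list)
for rec in entities:
    groups[match_key(rec)].append(rec)

# Groups with more than one record are candidate duplicates.
duplicates = {k: v for k, v in groups.items() if len(v) > 1}

# A quality threshold of the kind Tuck mentions: flag the batch if more
# than 5% of records collapse into duplicates (the 5% figure is invented).
dup_rate = sum(len(v) - 1 for v in duplicates.values()) / len(entities)
batch_ok = dup_rate <= 0.05
```

Real matching engines use far richer techniques (fuzzy scoring, phonetic keys, reference data), but the principle is the same: derive the rule from the data, then measure the result against a threshold.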

The event was, of course, built around the promotion of Datanomic’s own data profiling-based solution, which competes with offerings from other vendors in the space such as DataFlux and Datactics. Earlier this year, the vendor launched a new version of its dn:Director solution, which is aimed at tracking data quality across an institution. Version 7.1 includes an upgraded front end for ease of use and additional functionality for third party products, according to the vendor.
