About A-Team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Data Migration Problems Haven’t Changed in 10 Years, Say Datanomic and Iergo


The data migration problems that plagued the market 10 years ago are still with us today, according to compliance and data management solution vendor Datanomic and data migration specialist Iergo. The vendors discussed the importance of data quality tools in the migration process at an event in London yesterday.

Businesses are encountering the same obstacles and making the same mistakes with their data migration projects as they did 10 years ago, explained Johny Morris, CEO of Iergo, and Jonathan Pell, CEO of Datanomic. “Too many projects leave data migration to the end of the time line and underestimate how long it will take to resolve issues. The attitude of a ‘throw it at the database and see what sticks’ unfortunately still dominates most projects. There is also too little pre-emptive data analysis. Even with best practice models, too many projects rely on inventing their own approach, with not enough thought given to aligning the migration with the actual business needs,” said Morris.

The vendors pointed to research from Bloor, conducted in 2007, indicating that 84% of data migration projects fail to hit time or budget targets. These problems persist despite advances in IT software, skills and project management: data is delivered to the wrong place at the wrong time, and sometimes it is even the wrong data, said the speakers.

“In many instances of data migration failure, key data stakeholder analysis has not been performed at all or is simply not adequate. With most data migration projects, technology remains largely separate from the business until problems begin to manifest on the business side. Inadequate data preparation, poor business engagement and underestimating project complexity are three of the biggest issues we see time and again,” said Pell.

The problem is not just down to IT challenges; it is a business issue, according to Steve Tuck, chief strategy officer at Datanomic. Little attention is paid to data contents, some data is already broken and some gets lost or damaged in transit, he explained.

In order to tackle these issues, Tuck highlighted the potential uses for data quality tools to better understand, improve, control and protect the data. Data profiling tools can allow firms to discover potential issues with their data and allow users to collaborate with subject matter experts from the business to better understand the data inaccuracies. “Data profiling allows users to be able to validate business critical information with targeted rules, screen out dummy and default values, identify missing and mis-fielded data, spot minimum state issues and refine migration mappings,” he elaborated.
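The kinds of targeted profiling rules Tuck describes can be illustrated with a minimal sketch. The field names, dummy-value list and rules below are hypothetical examples for illustration only; a commercial profiling tool such as the ones discussed would apply far richer, configurable rule sets.

```python
# Hypothetical placeholder codes often found in legacy source systems.
DUMMY_VALUES = {"N/A", "TBC", "XXX", "999999"}

def profile_record(record):
    """Return a list of data quality issues found in one source record."""
    issues = []
    # Screen out dummy and default values.
    for field, value in record.items():
        if str(value).strip().upper() in DUMMY_VALUES:
            issues.append(f"{field}: dummy/default value '{value}'")
    # Identify missing data in business-critical fields (assumed names).
    for field in ("account_id", "currency"):
        if not record.get(field):
            issues.append(f"{field}: missing")
    # Validate business-critical information with a targeted rule:
    # ISO 4217 currency codes are three uppercase letters.
    currency = record.get("currency", "")
    if currency and not (len(currency) == 3 and currency.isalpha()
                         and currency.isupper()):
        issues.append(f"currency: '{currency}' fails ISO 4217 format rule")
    return issues

# A record exhibiting all three problem types at once.
record = {"account_id": "", "currency": "usd", "limit": "999999"}
print(profile_record(record))
```

Running rules like these before the load, rather than "throwing it at the database", is precisely the pre-emptive analysis the speakers argue is missing from most projects.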

Firms should be able to improve their data sets with these tools by building rules from the data rather than from documentation, said Tuck. The tools should also enable them to cleanse and match logical entities to remove duplication, a common problem across siloed systems. On an ongoing basis, data profiling tools aim to provide insight throughout the migration process. “Firms can set quality thresholds as well as ones based on the volume of data moved,” he added.
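The threshold idea Tuck mentions can be sketched as a simple gate on each migration batch. The function name, thresholds and rule callback below are illustrative assumptions, not a description of any vendor's product; in practice such limits would be configured inside the profiling tool.

```python
def migration_gate(records, issue_fn, max_error_rate=0.05, min_volume=100):
    """Decide whether a migration batch may be loaded.

    Blocks the load if fewer records arrived than expected (a volume
    threshold) or if too many records fail profiling rules (a quality
    threshold). `issue_fn` returns a truthy value for a bad record.
    """
    if len(records) < min_volume:
        return False, "volume below expected threshold"
    failures = sum(1 for r in records if issue_fn(r))
    rate = failures / len(records)
    if rate > max_error_rate:
        return False, f"error rate {rate:.1%} exceeds threshold"
    return True, "ok"

# Example: a batch of 200 records where 20 fail a (hypothetical) rule.
batch = [{"valid": True}] * 180 + [{"valid": False}] * 20
print(migration_gate(batch, lambda r: not r["valid"]))
```

A gate like this turns data quality from an end-of-project surprise into a checkpoint applied every time data is moved.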

Obviously, the event was organised to promote Datanomic’s own data profiling-based solution, which competes with offerings from other vendors in the space such as DataFlux and Datactics. Earlier this year, the vendor launched a new version of its dn:Director solution, which is aimed at tracking data quality across an institution. Version 7.1 includes an upgraded front end for ease of use and additional functionality for third party products, according to the vendor.

