The knowledge platform for the financial technology industry

A-Team Insight Blogs

Datactics Makes Data Assets More Valuable with Renewed Augmented Data Quality Solution


Datactics has released the first phase of its renewed Augmented Data Quality (ADQ) solution, including an enhanced user interface and a host of new and improved machine learning (ML) microservices that automate data quality activities. The solution also reduces the time and technical knowledge required to address data quality issues, democratising data quality management through greater automation and no-code tooling.

Customers of Datactics’ self-service rules-based data quality solution will be upgraded to ADQ, and new customers will be brought in at this level, as the solution is rolled out with additional functionality ahead of a major release in Q3 or Q4 this year. Meanwhile, Datactics is working with a handful of customers in the UK and US to deploy elements of the product and demonstrate results.

“ADQ couples the power of AI augmentation and automation to create business value by reducing the manual effort of achieving data quality, increasing data accuracy, and providing enhanced insight into data,” says Datactics CEO Stuart Harvey. “Essentially, it provides a pragmatic and practical real-world understanding of data quality.”

Datactics describes augmented data quality as an approach that implements advanced algorithms, ML and AI to automate data quality management. Its goal is to correct data, learn from this and automatically adapt and improve data quality over time, making data assets more valuable to the business. It does not, however, eliminate the need for a human in the loop providing oversight, decision making and any necessary intervention.
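
The approach described above can be pictured as a simple triage loop. The sketch below is purely illustrative and is not Datactics’ implementation: automated corrections are applied only when a model’s confidence is high, while lower-confidence suggestions are queued for a human data steward; the threshold, field names and data are all hypothetical.

```python
# Illustrative human-in-the-loop triage: auto-apply only high-confidence
# fixes; route the rest to a steward for oversight and decision making.

AUTO_APPLY_THRESHOLD = 0.95  # hypothetical confidence cut-off

def triage_fixes(suggested_fixes):
    """Split suggested corrections into auto-applied and steward-reviewed."""
    applied, review_queue = [], []
    for fix in suggested_fixes:
        if fix["confidence"] >= AUTO_APPLY_THRESHOLD:
            applied.append(fix)       # safe to automate
        else:
            review_queue.append(fix)  # needs human intervention
    return applied, review_queue

fixes = [
    {"field": "country", "old": "UK ", "new": "UK", "confidence": 0.99},
    {"field": "lei", "old": "", "new": "5493001KJTIIGC8Y1R12", "confidence": 0.60},
]
applied, queued = triage_fixes(fixes)
```

Decisions made in the review queue can then be fed back as training signal, which is how such a system "learns and adapts" without removing the human from the loop.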

ADQ’s enhanced user interface ties data processes together to cover the journey from onboarding to data quality and data remediation. It includes no-code tooling to allow business users to access and work with data.

Augmentation focuses on minor, but important, processes, says Fiona Browne, head of software development and machine learning at Datactics. An example here is the inclusion of time series data in ADQ that can be used to identify problematic data sources, breaks and root causes, and provide an understanding of what might cause problems in future. “Mining this information can be valuable for CDOs,” says Browne.
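
As a rough illustration of how time series data can surface feed breaks (the article does not describe Datactics’ actual method, and the window, threshold and counts below are assumptions), a daily record count can be compared against a rolling mean and standard deviation of the preceding days:

```python
# Flag days whose record count deviates sharply from the recent trend --
# candidate points for root-cause investigation of a problematic source.
from statistics import mean, stdev

def find_breaks(daily_counts, window=5, threshold=3.0):
    """Return indices where the count is a rolling-z-score outlier."""
    breaks = []
    for i in range(window, len(daily_counts)):
        history = daily_counts[i - window:i]
        mu, sigma = mean(history), stdev(history)
        if sigma and abs(daily_counts[i] - mu) / sigma > threshold:
            breaks.append(i)
    return breaks

counts = [1000, 1010, 990, 1005, 995, 120, 1002]  # day 5 looks like a feed break
print(find_breaks(counts))  # → [5]
```

Trending these flags per source over time gives the kind of aggregate view of problematic feeds and recurring breaks that a CDO could mine.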

Turning to ADQ’s ML microservices: where the company’s previous rules-based data quality solutions depended on manual selection of rules from the Datactics library, a new microservice reviews a data source and automatically suggests data quality rules that could be applied to it. This can significantly reduce time to data quality, says Harvey. Additional microservices include automated outlier detection and analytics capabilities that further streamline data quality workflows.
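
Rule suggestion of this kind is typically built on data profiling. The sketch below is a minimal, hedged example of that general technique; the rule names, pattern and sample values are invented for illustration and are not drawn from the Datactics library:

```python
# Profile a column's observed values and propose rules the data already
# satisfies (not-null, uniqueness, a consistent format pattern).
import re

def suggest_rules(column_values):
    """Propose candidate data quality rules from simple column statistics."""
    non_null = [v for v in column_values if v not in (None, "")]
    rules = []
    if len(non_null) == len(column_values):
        rules.append("NOT_NULL")            # no blanks observed
    if len(set(non_null)) == len(non_null):
        rules.append("UNIQUE")              # no duplicates observed
    if non_null and all(re.fullmatch(r"[A-Z]{2}\d{6}", v) for v in non_null):
        rules.append(r"PATTERN:[A-Z]{2}\d{6}")  # consistent format observed
    return rules

print(suggest_rules(["GB123456", "US654321", "FR111222"]))
```

A suggested rule set like this would still be reviewed and accepted by a user before being enforced, consistent with the human-in-the-loop oversight described earlier.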

Beyond the initial release of ADQ, Browne says later phases will build in more automation for other data quality processes. Meanwhile, Datactics is also exploring generative AI for data quality – and concludes that it could be very useful.

