About a-team Marketing Services
The knowledge platform for the financial technology industry

A-Team Insight Blogs

Facteus Partners with Snowflake to Migrate Sensitive Data to the Snowflake Data Cloud


Facteus, a provider of actionable insights from sensitive data, has joined Snowflake’s partner network with a focus on enabling financial services organisations to use its Mimic synthetic data engine and data enrichment services to transform, enhance, and migrate sensitive data to the Snowflake Data Cloud.

Mimic is installed behind an organisation’s firewall and uses proprietary machine learning technology to create a synthetic copy of sensitive data, such as raw transaction data. This data transformation process removes sensitive personally identifiable information (PII), while maintaining statistical relevance back to the original data source. The synthetic data cannot be reverse engineered back to the original transaction or organisation.
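Facteus does not disclose how Mimic's machine learning engine works internally, but the general idea of synthetic data generation can be sketched: fit a statistical model to the sensitive records, drop the personally identifiable fields, and sample entirely new records from the fitted model. A minimal illustration, using hypothetical transaction records and a simple Gaussian fit (a real engine would model far richer joint distributions):

```python
import random
import statistics

# Hypothetical raw transactions: PII fields (name, account) plus an amount.
raw = [
    {"name": "Alice", "account": "111-22", "amount": 12.50},
    {"name": "Bob",   "account": "333-44", "amount": 40.00},
    {"name": "Cara",  "account": "555-66", "amount": 27.75},
    {"name": "Dan",   "account": "777-88", "amount": 19.20},
]

def synthesize(rows, n, seed=0):
    """Drop PII fields and sample fresh amounts from a distribution
    fitted to the originals: aggregate statistics are preserved, but
    no synthetic row corresponds to any real transaction."""
    rng = random.Random(seed)
    amounts = [r["amount"] for r in rows]
    mu = statistics.mean(amounts)
    sigma = statistics.stdev(amounts)
    return [{"amount": round(rng.gauss(mu, sigma), 2)} for _ in range(n)]

synthetic = synthesize(raw, 1000)
```

Because each synthetic amount is drawn from the fitted distribution rather than copied, the output retains statistical relevance to the source data while carrying no link back to any individual or account, which is the property that makes such data safe to move into a cloud environment.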

Facteus’ data enrichment services cleanse, enhance and prepare the synthetic data for use in analytics, machine learning and AI, marketing and segmentation activities, and data monetisation to create new revenue streams.

“Financial services organisations have wrestled with how to safely and securely migrate crucial systems and data to cloud-based environments. Many organisations lack the resources to make the conversion or remain hesitant to store sensitive data in the cloud,” says Chris Marsh, CEO at Facteus. “Through this partnership, organisations can transform their data into privacy-compliant synthetic data that can be safely migrated to Snowflake’s Data Cloud and take advantage of the available computing power, analytics tools, and data marketplace.”

Matt Glickman, vice president of customer product strategy, financial services, at Snowflake, adds: “The combination of Facteus’ platform with Snowflake’s Data Cloud enables financial services firms to safely and securely transform their organisation through data.”
