Data Storage Group Receives Patent For Innovative Data Deduplication Technology

Data Storage Group, an industry leader in data backup and disaster recovery software, announced today that the United States Patent and Trademark Office has awarded the company US Patent 7,860,843 for the firm’s core data deduplication technology. DataStor’s software-based approach, known as Adaptive Content Factoring™, offers significant advancements and operational efficiencies in data backup and archival storage for organisations ranging from small and medium-sized businesses (SMBs) to large enterprises.

Today’s organisations face significant challenges in meeting long-term data retention requirements while complying with numerous state and federal regulations and guidelines that require firms to keep necessary information available in a usable form. Adding to this challenge is the rapid growth of digital information: documents are richer in content and often reference related works, resulting in a tremendous amount of information to manage. The increasing volume, complexity and cost of data backup and disaster recovery are prompting many firms to rethink traditional data protection strategies, driving the need for innovative and affordable data management approaches that simplify and optimise data storage operations. By eliminating redundant data, deduplication is an essential part of streamlining data backup and archival storage, increasing efficiency and compliance while reducing costs.

Brian Dodd, CEO of Data Storage Group, commented on the issuing of the patent, “For DataStor this patent recognises our technological contribution to the industry and represents the culmination of years of hard work by a team of dedicated and very talented individuals. We are extremely pleased to receive this patent and to have the associated exclusive rights to offer this core foundation of groundbreaking technology to the industry.”

Mike Moore, company co-founder and CTO, explains, “Unlike other, more typical deduplication technologies that chunk data into tiny blocks and require massive indexes to identify and manage common content, our elegant solution to the problem decreases backup storage requirements by efficiently identifying and eliminating sub-file redundancies at the source, thereby optimising the data before it’s transmitted across networks. This technology has demonstrated substantial increases in bandwidth utilisation, providing much quicker and more efficient backups – as much as 20 times faster than traditional backups.”
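
To illustrate the general idea of source-side, sub-file deduplication in simplified form, the sketch below hashes fixed-size chunks on the client and transmits only those the backup target has not already stored. It is a generic illustration, not DataStor’s patented Adaptive Content Factoring method; the `known_hashes` set and `send_chunk` callback are hypothetical stand-ins for the target’s chunk index and the network transfer.

```python
import hashlib

CHUNK_SIZE = 64 * 1024  # fixed-size chunks for simplicity; real systems vary

def backup_file(path, known_hashes, send_chunk):
    """Transmit only the sub-file chunks the backup target has not yet stored.

    `known_hashes` stands in for the target's chunk index and `send_chunk`
    for the network transfer; both are hypothetical placeholders.
    """
    manifest = []  # ordered chunk hashes describing this version of the file
    with open(path, "rb") as f:
        while chunk := f.read(CHUNK_SIZE):
            digest = hashlib.sha256(chunk).hexdigest()
            if digest not in known_hashes:   # unseen content: send it once
                send_chunk(digest, chunk)
                known_hashes.add(digest)
            manifest.append(digest)          # duplicates cost only a reference
    return manifest
```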

By distributing the source-side deduplication process across a network of computers, the power of distributed systems is harnessed for even greater performance and scalability. The approach requires far fewer compute-intensive resources and scales across network configurations ranging from laptop computers to large networks of enterprise servers. The technology also delivers a fully integrated virtual file system that allows users to restore, and even directly access through standard interfaces, data for all managed points in time, enabling SMBs and enterprise users to meet their most stringent data storage and retention requirements at an affordable cost.
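
The point-in-time access described above can be pictured, again in highly simplified form, as a catalogue that maps each snapshot to per-file chunk manifests, with a content-addressed store holding each unique chunk exactly once. The sketch below assumes a hypothetical in-memory `chunk_store` and `snapshots` dictionary and is not a description of DataStor’s virtual file system.

```python
# Hypothetical in-memory structures standing in for a deduplicated repository:
# a content-addressed chunk store and a catalogue of point-in-time snapshots.
chunk_store = {}   # chunk hash -> chunk bytes, each unique chunk stored once
snapshots = {}     # snapshot label -> {file path: ordered list of chunk hashes}

def restore_file(snapshot, path, out_path):
    """Reassemble one file exactly as it existed at a given point in time."""
    manifest = snapshots[snapshot][path]
    with open(out_path, "wb") as out:
        for digest in manifest:
            out.write(chunk_store[digest])
```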
