An Optimized Data Deduplication Strategy for Cloud Computing: Dedup with ABE and Bloom Filters

  • Nipun Chhabra et al.

Abstract

Cloud computing has become very popular in the present era, mainly because of its ease of access, ample data storage, and pay-on-demand pricing. The cloud's elastic provisioning gives access to additional storage space whenever it is required, so many organizations prefer to store their data in the cloud, where it is readily available from anywhere. It has been observed that users frequently upload multiple copies of the same files or data, wasting expensive storage and bandwidth. To address this issue, many data deduplication techniques have been proposed to remove redundant information from a dataset. When performed over large data volumes, deduplication frees considerable storage space, reduces the amount of data transmitted across the network, and can yield significant savings in storage cost, bandwidth cost, and backup time. Data deduplication ensures that only one unique instance of the data is retained in storage; any other user who tries to upload the same data is given a reference or pointer to the originally stored copy. In this paper, a novel technique is proposed for secure deduplication using Bloom filters, a memory- and time-efficient probabilistic data structure. To the best of our knowledge, no prior work combines attribute-based encryption with Bloom filters for deduplication. Results indicate that the proposed algorithm outperforms the traditional approach of encryption with hashing for deduplication, as demonstrated empirically by the experimental/simulated results reported in this paper.
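
As an illustration of the general idea described above (not of the exact scheme proposed in the paper), the following Python sketch shows how a Bloom filter can serve as a fast, memory-efficient pre-check for duplicate uploads, with an ordinary map of content digests standing in for the cloud's index of stored copies. All class names, parameters, and sizes here are hypothetical choices for the example.

    import hashlib

    class BloomFilter:
        """Minimal Bloom filter: k positions derived per item over an m-bit array."""
        def __init__(self, m_bits=8192, k_hashes=4):
            self.m = m_bits
            self.k = k_hashes
            self.bits = bytearray(m_bits // 8)

        def _positions(self, item: bytes):
            # Double hashing over the item's SHA-256 digest yields k bit positions.
            d = hashlib.sha256(item).digest()
            h1 = int.from_bytes(d[:8], "big")
            h2 = int.from_bytes(d[8:16], "big")
            return [(h1 + i * h2) % self.m for i in range(self.k)]

        def add(self, item: bytes):
            for p in self._positions(item):
                self.bits[p // 8] |= 1 << (p % 8)

        def might_contain(self, item: bytes) -> bool:
            return all(self.bits[p // 8] & (1 << (p % 8)) for p in self._positions(item))

    class DedupStore:
        """Stores each unique blob once; later uploads receive a pointer to it."""
        def __init__(self):
            self.filter = BloomFilter()
            self.index = {}  # content digest -> stored blob (stand-in for cloud storage)

        def upload(self, data: bytes) -> str:
            digest = hashlib.sha256(data).hexdigest()
            # Bloom filter says "definitely new": store without consulting the full index.
            if not self.filter.might_contain(data):
                self.filter.add(data)
                self.index[digest] = data
                return f"stored:{digest}"
            # Possible duplicate (the filter can give false positives): confirm in the index.
            if digest in self.index:
                return f"pointer:{digest}"
            self.filter.add(data)
            self.index[digest] = data
            return f"stored:{digest}"

    store = DedupStore()
    print(store.upload(b"quarterly-report.pdf contents"))  # stored:<digest>
    print(store.upload(b"quarterly-report.pdf contents"))  # pointer:<digest> (deduplicated)

Because a Bloom filter never yields false negatives, a "not present" answer lets the store skip the more expensive index lookup entirely, which is what makes it attractive as a first-stage duplicate check.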
Keywords: Deduplication, Access Control on encryp

Published: 2020-02-16
Section: Articles