Abstract: A hybrid cloud is a composition of two or more clouds (e.g., private and public) that remain distinct entities but are bound together, offering the advantages of multiple deployment models. Data deduplication is a specialized data compression technique that eliminates duplicate copies of repeating data to save storage. In existing systems, each user is issued private keys corresponding to their privileges, and these private keys are used to generate file tokens for duplicate checking. However, during file upload, the user needs to share file tokens with other privileged users, and computing these tokens requires knowledge of the private keys. This restriction leaves the authorized deduplication system limited and unable to be widely used. This limitation can be overcome by implementing block-level deduplication, which eliminates duplicate blocks of data that occur in non-identical files. New deduplication schemes supporting authorized duplicate check in the hybrid cloud, based on a token number and a privilege key, employ the SHA-1 and AES algorithms. SHA-1 produces a 160-bit hash value known as a message digest. AES encrypts plaintext in 128-bit blocks, using a key of 128, 192, or 256 bits, to produce ciphertext. Thus we implement a deduplication construction supporting authorized duplicate check in a hybrid cloud architecture.
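The block-level duplicate check described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the 4 KB block size and the names `block_tokens` and `DedupStore` are assumptions, SHA-1 digests stand in for the privilege-bound file tokens, and a real system would additionally encrypt each block with AES and verify the user's privilege key before accepting an upload.

```python
import hashlib

BLOCK_SIZE = 4096  # hypothetical fixed block size for block-level deduplication


def block_tokens(data: bytes, block_size: int = BLOCK_SIZE):
    """Split data into fixed-size blocks and derive a SHA-1 token for each block."""
    return [
        (hashlib.sha1(data[i:i + block_size]).hexdigest(), data[i:i + block_size])
        for i in range(0, len(data), block_size)
    ]


class DedupStore:
    """Toy block store: keeps one copy of each unique block, keyed by its token."""

    def __init__(self):
        self.blocks = {}  # token (SHA-1 hex digest) -> block bytes

    def upload(self, data: bytes) -> list:
        """Perform the duplicate check per block; store only new blocks.

        Returns a manifest of tokens from which the file can be restored.
        """
        manifest = []
        for token, block in block_tokens(data):
            if token not in self.blocks:  # duplicate check: block seen before?
                self.blocks[token] = block  # store each unique block exactly once
            manifest.append(token)
        return manifest

    def restore(self, manifest: list) -> bytes:
        """Reassemble a file from its token manifest."""
        return b"".join(self.blocks[t] for t in manifest)
```

Because tokens are computed per block rather than per file, two non-identical files that share blocks (for example, two versions of the same document) still deduplicate their common blocks, which file-level schemes cannot do.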
Keywords: Data Owner Module, Encryption and Decryption Module, Remote User Module, Cloud Server Module.