
www.gjaet.com Page | 118

Global Journal of Advanced Engineering Technologies Volume 5, Issue 2- 2016 ISSN (Online): 2277-6370 & ISSN (Print):2394-0921

DIFFERENTIAL AUTHORIZATION DUPLICATE CHECK WITH HYBRID CLOUD APPROACH

1Sushma Balaji Adsul, 2Mr. Sujeet More
1M.E. Student, Department of CSE, Jawaharlal Nehru College of Engineering, Aurangabad, Maharashtra, India.

2Assistant Professor, Department of CSE, Jawaharlal Nehru College of Engineering, Aurangabad, Maharashtra, India.

Abstract: Data deduplication is an important data compression technique for eliminating duplicate copies of repeating data. Its main benefits when used in cloud storage are reduced storage space and saved bandwidth. To protect the confidentiality of sensitive data while supporting deduplication, data is encrypted with the convergent encryption technique before outsourcing. This paper makes the first formal attempt to address the problem of authorized data deduplication for better protection of data security. Unlike traditional deduplication systems, the differential privileges of users are considered in the duplicate check in addition to the data itself. Several new deduplication constructions supporting authorized duplicate check are presented in a hybrid cloud architecture. Our scheme is secure under the definitions specified in the proposed security model. As a proof of concept, we implement the scheme and conduct test-bed experiments.

Keywords: Deduplication, authorized duplicate check, confidentiality, hybrid cloud.

I. INTRODUCTION
In cloud computing, data deduplication is an essential data compression mechanism for reducing identical copies of the same data. It is employed to improve the effective use of storage space and can also be applied to minimize data transmission over the network. In the deduplication process, identical data is detected and stored only once; as the process continues, other data is matched against the stored copy, and whenever a match is found the identical data is replaced with a small reference that points to the stored copy. A hybrid cloud is a combination of a private cloud and a public cloud: the most critical data resides on the private cloud, while easily accessible data resides on the public cloud. The hybrid cloud thus combines the reliability, extensibility, rapid deployment and cost savings of the public cloud with the stronger security of the private

cloud. A major challenge of cloud storage and cloud computing is the management of huge volumes of data. Deduplication is a process of eliminating duplicate data: redundant data is removed, leaving a single instance of the data to be stored. In earlier systems, the data is encrypted before outsourcing it to the cloud or network. This encryption demands considerable time as well as storage space; when there is a large amount of data, the encryption process becomes complex and demanding. By using the deduplication technique in a hybrid cloud, the encryption becomes simpler. As we all know, the network carries a large amount of data shared by many users, and many large networks use cloud storage to store that data and share it on the network. As cloud computing becomes prevalent, an increasing amount of data is stored in the cloud and shared by users with specified privileges, which define the access rights of the stored data. To make data management scalable in cloud computing, deduplication has become a well-known technique and has attracted more and more attention recently. Data deduplication is a specialized data compression technique for removing duplicate copies of repeating data in storage. It improves storage utilization and can also be applied to network data transfers to reduce the number of bytes that must be sent. Instead of keeping multiple copies of data with the same content, deduplication eliminates redundant data by keeping only one physical copy and referring other redundant data to that copy. Deduplication can happen at either the file level or the block level.
File-level deduplication eliminates duplicate copies of the same file. Deduplication may also happen at the block level, which removes duplicate blocks of data that occur in non-identical files.
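As a concrete illustration of block-level deduplication, the following sketch (our own toy example; the class name, block size and hash choice are assumptions, not from the paper) stores each unique block once and represents files as lists of block fingerprints:

```python
import hashlib

class BlockStore:
    """Toy block-level deduplication store: each unique block is kept once,
    and files are stored as lists of block fingerprints (illustrative only)."""

    def __init__(self, block_size=4096):
        self.block_size = block_size
        self.blocks = {}   # fingerprint -> block bytes (single physical copy)
        self.files = {}    # filename -> ordered list of fingerprints

    def put(self, name, data):
        refs = []
        for i in range(0, len(data), self.block_size):
            block = data[i:i + self.block_size]
            fp = hashlib.sha256(block).hexdigest()
            # A duplicate block is detected by its fingerprint and stored once.
            self.blocks.setdefault(fp, block)
            refs.append(fp)
        self.files[name] = refs

    def get(self, name):
        # Reassemble the file by following the stored references.
        return b"".join(self.blocks[fp] for fp in self.files[name])

store = BlockStore(block_size=4)
store.put("a.txt", b"AAAABBBBAAAA")   # the block "AAAA" repeats within the file
store.put("b.txt", b"AAAACCCC")       # shares the block "AAAA" with a.txt
print(len(store.blocks))              # 3 unique blocks stored, not 5
```

Note that identical blocks are deduplicated both within a file and across non-identical files, which is exactly the advantage of block-level over file-level deduplication.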


II. RELATED WORK
Earlier deduplication systems cannot support differential authorization duplicate check, which is important in many applications. In such an authorized deduplication system, each user is issued a set of privileges during system initialization. Each file uploaded to the cloud is also bounded by a set of privileges that specify which kinds of users are allowed to perform the duplicate check and access the file. Before submitting his duplicate check request for some file, the user takes this file and his own privileges as inputs. The user can find a duplicate for this file if and only if there is a copy of this file and a matching privilege stored in the cloud. For example, in a company, many different privileges will be assigned to employees. In order to save cost and manage data efficiently, the data will be moved to the storage cloud service provider (S-CSP) in the public cloud with specified privileges, and the deduplication technique will be applied to store only one copy of the same file. Because of security considerations, some files will be encrypted, with the duplicate check allowed only for employees with specified privileges, thereby realizing access control. Traditional deduplication systems based on convergent encryption, although providing confidentiality to some extent, do not support duplicate check with differential privileges. In other words, no differential privileges have been considered in deduplication based on convergent encryption. It appears contradictory to realize both deduplication and differential authorization duplicate check at the same time.
A) Symmetric Encryption: Symmetric encryption uses a common secret key K to encrypt and decrypt data. A symmetric encryption scheme consists of three major functions:

· KeyGenSE(1λ) = K is the key generation algorithm that generates the key K using the security parameter 1λ.

· EncSE(K, M) = C is the symmetric encryption algorithm that takes the secret key K and the message M and outputs the ciphertext C.

· DecSE(K, C) = M is the symmetric decryption algorithm that takes the secret key K and the ciphertext C and outputs the original message M.
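A minimal sketch of this (KeyGen, Enc, Dec) interface is shown below. The SHA-256-based XOR keystream is purely illustrative and NOT a secure cipher; a real deployment would use an authenticated scheme such as AES-GCM. All function names are our own:

```python
import hashlib
import secrets

def keygen_se(lam=32):
    """KeyGenSE(1^lambda) -> K: sample a lam-byte random secret key."""
    return secrets.token_bytes(lam)

def _keystream(key, n):
    # Derive n pseudo-random bytes from the key (toy construction, for
    # illustration only -- not a vetted stream cipher).
    out = b""
    counter = 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def enc_se(key, message):
    """EncSE(K, M) -> C: XOR the message with a key-derived stream."""
    return bytes(m ^ s for m, s in zip(message, _keystream(key, len(message))))

def dec_se(key, ciphertext):
    """DecSE(K, C) -> M: XOR is its own inverse, so decryption reuses EncSE."""
    return enc_se(key, ciphertext)

k = keygen_se()
c = enc_se(k, b"secret file contents")
print(dec_se(k, c) == b"secret file contents")   # True: round-trip succeeds
```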

B) Convergent Encryption: Convergent encryption provides data confidentiality in deduplication. A user (or data owner) derives a convergent key from each original data copy and encrypts the data copy with the convergent key. In addition, the user derives a tag for the data copy, such that the tag can be used to detect duplicates. Here, we assume that the tag correctness property holds, i.e., if two data copies are identical, then their tags are the same. To detect duplicates, the user first sends the tag to the server side to check whether an identical copy has already been stored. Note that the convergent key and the tag are derived independently, and the tag cannot be used to deduce the convergent key or compromise data confidentiality. Both the encrypted data copy and its corresponding tag will be stored on the server side.
C) Proof of Ownership: The proof of ownership (PoW) concept enables users to prove their ownership of data copies to the storage server. Specifically, PoW is implemented as an interactive algorithm (denoted PoW) run by a prover (i.e., the user) and a verifier (i.e., the storage server). The verifier derives a short value Φ(M) from a data copy M. To prove ownership of the data copy M, the prover needs to send a value Φ′ to the verifier such that Φ′ = Φ(M). The formal security definition for PoW roughly follows the threat model in a content distribution network, where an attacker does not know the entire file but has accomplices who have the file. The accomplices follow the "bounded retrieval model": they can help the attacker obtain the file, subject to the constraint that they must send fewer bits than the initial min-entropy of the file to the attacker.
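The key property of convergent encryption — that two independent owners of the same data derive the same key, the same ciphertext, and therefore the same tag — can be sketched as follows. The XOR "cipher" stands in for a real deterministic block cipher, and the function names are our own, assuming the common construction K = H(M):

```python
import hashlib

def convergent_key(data: bytes) -> bytes:
    # K = H(M): the key is derived from the data copy itself.
    return hashlib.sha256(data).digest()

def encrypt(key: bytes, data: bytes) -> bytes:
    # Toy deterministic cipher: XOR with a key-derived stream.
    # (Illustrative only; a real system uses a deterministic block cipher.)
    stream = b""
    ctr = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + ctr.to_bytes(8, "big")).digest()
        ctr += 1
    return bytes(d ^ s for d, s in zip(data, stream))

def tag_of(ciphertext: bytes) -> str:
    # The tag is derived from the ciphertext, so it reveals nothing about K.
    return hashlib.sha256(ciphertext).hexdigest()

m = b"quarterly report"
k1, k2 = convergent_key(m), convergent_key(m)   # two independent owners
c1, c2 = encrypt(k1, m), encrypt(k2, m)
print(tag_of(c1) == tag_of(c2))                 # True: the server sees a duplicate
```

Because encryption is deterministic in the data, the server can match tags of encrypted copies without ever learning the plaintext or the convergent key.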

III. FRAMEWORK
· Cloud User: A cloud user is one who needs to outsource data to public storage, which acts as a public cloud in cloud computing. The system authenticates the user on entry and lets him upload data with a specific set of privileges that govern further access to the uploaded data for download.

· Public Storage: Public storage is a storage service that permits authorized users to store their data on it and refuses to store duplicate data, thereby saving storage space and transmission bandwidth. The uploaded data is in encrypted form; only a user with the corresponding key can decrypt it.

· Private Cloud: A private cloud acts as a proxy to allow each data owner and user to securely perform the duplicate check with differential privileges.


· Auditor: The auditor is a third-party auditor (TPA) with the expertise and capabilities that cloud users lack; it assesses the reliability of the cloud storage service on behalf of the user upon request.

The set of privileges and the symmetric key for each privilege are assigned and stored in the private cloud. The user registers into the system, and privileges are assigned to the user according to the identity given at registration time, i.e., on the basis of the post held by the user. When a data owner with an issued privilege set wants to upload and share a file with users, the data owner first performs identification and sends the file tag to the private server. The private cloud server verifies the data owner, computes the file token, and returns the token to the data owner. The data owner then sends this file token and a request to upload the file to the storage provider. If a duplicate file is detected, the user must run the PoW protocol with the storage provider to prove his ownership of the file; if the proof of ownership passes, the user is provided a pointer to that file. Otherwise, the user proceeds to upload the file: if the signature verification passes, the private cloud computes the file token for each privilege in the privilege set given by the user and returns the tokens to the user.

Figure 1: System Model for Authorized Deduplication

Finally, the user performs the encryption: the user encrypts the file with a key, and that key is in turn encrypted into ciphertext with each key in the file token given by the private cloud server. Next the user uploads the

encrypted file, the file tag, and the encrypted key. Suppose a user wants to download a file F. The user first uses his key to decrypt the encrypted key and obtain the key K, and then uses K to recover the original file F.
Security of Duplicate Check Token: We consider several types of security that must be ensured, namely: i) unforgeability of the duplicate check token. There are two kinds of adversaries, external and internal. As shown below, the external adversary can be viewed as an internal adversary with no privileges. If a user has privilege p, it is required that the adversary cannot forge and output a valid duplicate token with any other privilege p′ on any file F, where p′ does not match p. Furthermore, it is also required that if the adversary does not request a token with its own privilege from the private cloud server, it cannot forge and output a valid duplicate token with p on any F that has been queried.

A. Secure Deduplication Systems
To support authorized deduplication, the tag of a file is determined by both the privilege p and the file F. To distinguish it from the traditional notion of a tag, we call it a file token instead. To support authorized access, a secret key kp is bound to a privilege p to generate a file token. Let φ′F,p = TagGen(F, kp) denote the token of F that may only be accessed by a user with privilege p; in other words, the token φ′F,p can only be computed by users holding privilege p. As a result, if a user has uploaded a file with a duplicate token φ′F,p, then a duplicate check sent by another user succeeds if and only if he also holds the file F and the privilege p. The token generation function can be simply realized as H(F, kp). The goal of data deduplication is to reduce the required storage space by sharing only one copy of the same plaintext or information.
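A minimal sketch of this file-token construction, realizing TagGen(F, kp) with HMAC-SHA256 so that tokens are unforgeable without the per-privilege key kp (the privilege names and the key table held by the private cloud are illustrative assumptions):

```python
import hashlib
import hmac
import secrets

# Per-privilege secret keys k_p, held by the private cloud server.
# The privilege names "director" and "team_leader" are hypothetical.
priv_keys = {
    "director": secrets.token_bytes(32),
    "team_leader": secrets.token_bytes(32),
}

def tag_gen(file_bytes: bytes, privilege: str) -> str:
    """Compute the file token for (F, p): an HMAC realizing H(F, k_p)."""
    k_p = priv_keys[privilege]
    return hmac.new(k_p, file_bytes, hashlib.sha256).hexdigest()

f = b"contents of design document"
t1 = tag_gen(f, "director")
t2 = tag_gen(f, "director")
t3 = tag_gen(f, "team_leader")
print(t1 == t2)   # same file, same privilege: the duplicate check matches
print(t1 == t3)   # same file, different privilege: no match
```

Because HMAC cannot be computed without kp, a user who was never issued privilege p cannot forge a valid token φ′F,p, which is the unforgeability property discussed above.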

B. DES Algorithm
The DES algorithm is a basic building block for providing data security. It is a symmetric encryption system that uses 64-bit blocks; 8 bits (one octet) of the key are used for parity checks (to verify the key's integrity). Each of the key's parity bits (one in every eight bits) is used to check one of the key's octets by odd parity, that is, each parity bit is adjusted so that there is an odd number of '1's in the octet it belongs to. The key therefore has a "useful" length of 56 bits, which means that only 56 bits are actually used in the algorithm.


The algorithm carries out combinations, substitutions and permutations between the text to be encrypted and the key, while ensuring the operations can be performed in both directions (for decryption). The combination of substitutions and permutations is called a product cipher.

Generation of keys
Given that the DES algorithm described above is public, security rests entirely on the secrecy of the encryption keys. The key schedule derives, from a 64-bit key, 16 different 48-bit round keys, each used in one round of the DES algorithm.

First, the key's parity bits are eliminated so as to obtain a key with a useful length of 56 bits.
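The parity rule described above can be sketched as follows (our own illustration, not part of any DES library API): the low bit of every octet is set so that the octet has an odd number of '1's, leaving 7 useful bits per octet, i.e., 56 effective key bits.

```python
def set_odd_parity(key: bytes) -> bytes:
    """Adjust the parity bit of each octet so the octet has odd parity."""
    out = bytearray()
    for octet in key:
        data = octet & 0xFE                        # keep the 7 key bits
        ones = bin(data).count("1")
        out.append(data | (0 if ones % 2 else 1))  # make the '1' count odd
    return bytes(out)

def has_odd_parity(key: bytes) -> bool:
    """Check the DES key-integrity condition: every octet has odd parity."""
    return all(bin(octet).count("1") % 2 == 1 for octet in key)

k = set_odd_parity(bytes(range(8)))   # any 8-byte value becomes a valid key
print(has_odd_parity(k))              # True
print(len(k) * 7)                     # 56 effective key bits
```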

C. Advantages of Secure Authorized Deduplication
1. Each authorized user can obtain an individual token for his file to detect duplication based on his privileges.
2. Authorized users can use their individual private keys to generate a query for a particular file together with the privileges they own, with the help of the private cloud, while the public cloud performs the duplicate check directly and informs the user if there is any duplicate.
3. In our proposed system, compared with the existing definition of data confidentiality based on convergent encryption, a higher level of confidentiality is defined and achieved.

IV. EXPERIMENTAL RESULTS
In our experiments, the administrator first adds some employees into the system. An individual user then logs into the system. Here, the Director logs in as a user; he can upload files, download files, and grant privileges to other users.

Figure 2: The above screen shows the duplicate file message.

Figure 3: The above screen shows that if a duplicate file occurs at the server, it uses the already generated token and tags of the existing file.

After the Director logs out, the Team Leader can log in and access the files according to his received privileges.


If he tries to upload a file that already exists in the cloud, an alert such as "duplicate file exists" is displayed.

V. CONCLUSION
In this paper we proposed a novel secure authorized data deduplication system that incorporates differential privileges. With the proposed system, the duplicate check can be performed efficiently. For efficient duplicate detection we use a hybrid cloud approach, in which the private cloud generates the duplicate check tokens of the files.
