Fine-Grained Access Control (FGAC) in the Cloud Robert Barton
Access Control Quick Review • Fine-grained • Why should I care? • Why is access control necessary?
Clouds • Shift to corporate data storage by third parties • More cost effective • Poses problems with data security
Issues with Cloud Storage • Data Security • User Revocation • Scalability
Data Security • It is necessary to keep the data private from the third party • There is no clear solution to scalable FGAC, but there are many good systems to start from
Data Security: Key-Policy Attribute-Based Encryption (KP-ABE) • Users are given secret keys based on sets of attributes • Includes one dummy attribute that every file is encrypted with and every user holds, but that the cloud does not know about • Files are encrypted under a set of attributes so that only a user who holds keys for all of those attributes can decrypt the file • Easy to deal with user revocation • Easy for the cloud server to learn about users
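The access rule above can be sketched as a toy model. This is a hypothetical illustration only: real KP-ABE builds keys from bilinear pairings, while here "holding a key for every attribute" is modeled as plain set inclusion, and the names (`issue_key`, `DUMMY`, etc.) are invented for the sketch.

```python
# Toy KP-ABE-style access model (hypothetical names; real KP-ABE uses
# bilinear pairings, not set checks).

DUMMY = "__dummy__"  # system-wide attribute every user and file gets,
                     # kept hidden from the cloud

def issue_key(attrs):
    """Give a user a secret key for their attribute set (plus the dummy)."""
    return frozenset(attrs) | {DUMMY}

def encrypt_attrs(attrs):
    """A file is encrypted under its attribute set (plus the dummy)."""
    return frozenset(attrs) | {DUMMY}

def can_decrypt(user_key, file_attrs):
    """A user decrypts only if their key covers every attribute of the file."""
    return file_attrs <= user_key

alice = issue_key({"finance", "manager"})
report = encrypt_attrs({"finance"})            # any finance user can read
payroll = encrypt_attrs({"finance", "manager"})  # needs both attributes
```

Because the dummy attribute is unknown to the cloud, the cloud can never assemble a satisfying key on its own even if it learns every ordinary attribute.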
Data Security: Hierarchical Identity-Based Encryption (HIBE) • Each user has a public key and a secret key • The secret key decrypts any file encrypted under its paired public key together with the public keys of all the user's ancestors • Easy for third parties to learn about file security levels
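The hierarchy rule can be sketched as follows. This is a hypothetical model, not a real HIBE construction: identities are represented as paths, and the HIBE property that a key holder can derive keys for every descendant is reduced to a prefix check.

```python
# Toy HIBE-style model: a key for an identity path can derive keys for all
# descendants, so it decrypts anything encrypted to its subtree.

def derive_key(path):
    """A secret key is identified by the user's path in the hierarchy."""
    return tuple(path)

def can_decrypt(key_path, ciphertext_path):
    """The key works iff the ciphertext's identity lies under the key's path."""
    key_path = tuple(key_path)
    return tuple(ciphertext_path)[:len(key_path)] == key_path
```

So a key for `("org", "eng")` opens files addressed to anyone under engineering, while the HR subtree stays closed to it, which is also why the path structure itself leaks security levels to the third party.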
Cloud Knowledge • It's safe to assume that the cloud will try to learn as much as it can about the data it's storing • One proposed solution: data chunks
Data Chunks • Each data owner has their own chunk containing all of their files on the cloud • The cloud doesn't know individual file access policies • If a user satisfies one of the chunk's access policies, they download the whole chunk
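A minimal sketch of the chunk idea, with a hypothetical API (`download_chunk` and the policy representation are invented here): the cloud sees only a bag of chunk-level policies, never which policy guards which file, so the most it learns is that the user qualified for *something* in the chunk.

```python
# Cloud-side chunk check (toy model): satisfy any one chunk policy and the
# entire chunk is returned; per-file decryption happens on the client.

def download_chunk(chunk, chunk_policies, user_attrs):
    """Return the whole chunk if the user satisfies any attached policy."""
    if any(policy <= user_attrs for policy in chunk_policies):
        return list(chunk)  # the whole chunk, never an individual file
    return None

chunk = ["ct_report", "ct_payroll"]          # ciphertexts in one owner's chunk
policies = [{"finance"}, {"finance", "manager"}]  # unlinked to specific files
```

The trade-off is bandwidth: a user entitled to one file still transfers every ciphertext in the chunk.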
User Revocation • Each file the revoked user had access to needs to be re-encrypted • This puts severe computational overhead on the data owner • Two good solutions: • Two-Layered Encryption • Proxy Re-Encryption • Both systems let the cloud server's larger resources do all the work • The only work done by the data owner is delegating the updated keys
User Revocation: Two-Layered Encryption • The data owner encrypts the data, then has the cloud encrypt it a second time • When a user is revoked, the data owner has the cloud server strip the second layer and re-encrypt with a different key
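The two layers can be illustrated with a toy one-time-pad sketch. The XOR "cipher" here is an illustration only, not a real scheme; the point is that the cloud only ever touches its own outer layer, so the plaintext stays protected by the owner's inner layer throughout revocation.

```python
import os

def xor(data, key):
    """Toy one-time-pad 'cipher' used purely for illustration."""
    return bytes(a ^ b for a, b in zip(data, key))

msg = b"secret record 01"          # 16 bytes, matching the toy keys
inner_key = os.urandom(16)         # held by the data owner and users
ct_inner = xor(msg, inner_key)     # owner's first layer

outer_key = os.urandom(16)         # held by the cloud
stored = xor(ct_inner, outer_key)  # cloud's second layer

# Revocation: the cloud strips its own layer and re-wraps with a fresh key.
# It only ever handles ct_inner, never the plaintext.
new_outer = os.urandom(16)
stored = xor(xor(stored, outer_key), new_outer)

# An authorized user holding both current keys still recovers the plaintext.
assert xor(xor(stored, new_outer), inner_key) == msg
```

The owner's only job after revocation is handing the remaining users whatever they need for the refreshed outer layer; the bulk re-encryption runs entirely on the cloud.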
User Revocation: Proxy Re-Encryption • The third party re-encrypts the already encrypted data to produce a new ciphertext • The third party never sees the data decrypted, so it learns nothing
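The blind-transformation property can be shown with the same toy XOR pad. Real proxy re-encryption schemes are built on public-key cryptography (e.g., ElGamal variants), so treat this purely as a sketch of the idea: the owner hands the proxy a re-encryption key that converts old-key ciphertexts into new-key ciphertexts without ever exposing the plaintext.

```python
import os

def xor(a, b):
    """Toy one-time-pad 'cipher' used purely for illustration."""
    return bytes(x ^ y for x, y in zip(a, b))

msg = b"payroll entry 42"
k_old, k_new = os.urandom(16), os.urandom(16)

ct = xor(msg, k_old)     # ciphertext under the old key
rk = xor(k_old, k_new)   # re-encryption key the owner sends the proxy

# The proxy transforms the ciphertext without ever seeing the plaintext:
#   (m ^ k_old) ^ (k_old ^ k_new) == m ^ k_new
ct_new = xor(ct, rk)

assert ct_new == xor(msg, k_new)   # now decryptable only with the new key
assert xor(ct_new, k_new) == msg
```

Note that `rk` alone reveals nothing about `msg`, which is exactly the property that makes it safe to hand to the untrusted third party in this toy model.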
Lazy Re-Encryption • Files are not re-encrypted until a user requests access • Spreads the re-encryption cost out over time instead of paying it all at revocation
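The laziness can be sketched with a key-version counter (a hypothetical structure; the version bump standing in for an actual re-encryption): revocation is O(1), and each file pays its re-encryption cost only on its first access afterwards.

```python
# Lazy re-encryption sketch: revocation only bumps a global key version;
# stale files are re-encrypted on demand, not eagerly.

current_version = 1
files = {"a.txt": 1, "b.txt": 1}  # file -> key version it is encrypted under
re_encryptions = 0                # re-encryption work actually performed

def revoke_user():
    """O(1): no file is touched at revocation time."""
    global current_version
    current_version += 1

def access(name):
    """Re-encrypt a stale file on first access, then serve it."""
    global re_encryptions
    if files[name] < current_version:
        files[name] = current_version  # stand-in for a real re-encryption
        re_encryptions += 1
    return files[name]

revoke_user()
access("a.txt")
```

After one revocation and one access, only the touched file has been re-encrypted; the rest of the store is untouched until someone asks for it.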
Conclusion • There is no perfect or complete solution to these problems • This remains an active area of academic and industry research