*All opinions and information are mine and do not represent the view(s) of my employer. Cloud Computing: Hadoop Security Design (2009). Kaveh Noorbakhsh, Kent State: CS. Owen O'Malley | Kan Zhang | Sanjay Radia | Ram Marti | Christopher Harrell | Yahoo!
Map/Reduce: An Introduction Map/Reduce allows computation to scale out over many "cheap" systems rather than one expensive supercomputer
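A minimal sketch of that idea in plain Python (not Hadoop code, just an illustration): the input is partitioned, each "cheap" worker counts its own chunk, and the partial results are combined, mirroring the divide-and-conquer picture on the next slide.

```python
# Illustrative map/reduce word count in plain Python (not Hadoop code).
# Work is partitioned across cheap workers; partial results are combined.
from collections import Counter
from multiprocessing import Pool

def map_count(chunk):
    """Map step: each worker counts words in its own partition."""
    return Counter(chunk.split())

def reduce_counts(partials):
    """Reduce step: combine the partial results into one answer."""
    total = Counter()
    for p in partials:
        total.update(p)
    return total

if __name__ == "__main__":
    text = ["the cloud scales out", "the cloud is cheap", "scale out not up"]
    with Pool(processes=3) as pool:            # three "cheap" workers
        partials = pool.map(map_count, text)   # w1, w2, w3 -> r1, r2, r3
    print(reduce_counts(partials))             # combine into the result
```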
Divide and Conquer [Diagram: "Work" is partitioned into w1, w2, w3; each is processed by a worker, producing results r1, r2, r3, which are combined into the final "Result".]
Two Layers • MapReduce: code runs here • HDFS: data lives here
Advantages of the Cloud Database as a Service = DBaaS | Infrastructure as a Service = IaaS | Software as a Service = SaaS | Platform as a Service = PaaS • Share hardware and energy costs • Share employee costs • Fast spin-up and tear-down • Expand quickly to meet demand • Costs ideally proportional to usage • Scalability
Cloud Services Spending [Chart: cloud services spending, in billions of dollars]
Cloud vs Total IT Spending [Chart: cloud vs. total IT spending, in billions of dollars]
Security Challenges of the Cloud • Where is my data living? You may not know exactly where your data is, since it can be distributed among many physical disks • Where is my data going? In the cloud, and especially in map/reduce, data is constantly moving from node to node, and those nodes may be spread across multiple mini-clouds • Who has access to my data? There may be other clients using the cloud, as well as administrators and others who maintain it, who could access the data if it is not properly protected.
Hadoop Security Concerns • Hadoop services do not authenticate users or other services. • (a) A user can access an HDFS or MapReduce cluster as any other user. This makes it impossible to enforce access control in an uncooperative environment; for example, file permission checking on HDFS can be easily circumvented. • (b) An attacker can masquerade as a Hadoop service. For example, user code running on a MapReduce cluster can register itself as a new TaskTracker. • DataNodes do not enforce any access control on access to their data blocks. This makes it possible for an unauthorized client to read a data block as long as she can supply its block ID. It is also possible for anyone to write arbitrary data blocks to DataNodes.
Security Requirements for Hadoop • Users are only allowed to access HDFS files that they have permission to access. • Users are only allowed to access or modify their own MapReduce jobs. • User-to-service mutual authentication to prevent unauthorized NameNodes, DataNodes, JobTrackers, or TaskTrackers. • Service-to-service mutual authentication to prevent unauthorized services from joining a cluster's HDFS or MapReduce service. • The degradation of performance should be no more than 3%.
Proposed Solution – Use Case 1: Accessing Data 1) User/app requests access to a data block. 2) NameNode authenticates the user and issues a block token. 3) User/app presents the block token to the DataNode to access the block for READ, WRITE, COPY, or REPLACE.
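As a rough illustration of this flow, the sketch below (plain Python; the function names and token fields are assumptions, not the real Hadoop API) has the NameNode sign the user, block ID, allowed modes, and expiry with a key shared with the DataNodes, which then verify the token before serving the block.

```python
# Hedged sketch of the block-token idea: the NameNode signs
# (user, block ID, modes, expiry) with a key shared with DataNodes;
# a DataNode verifies the token before serving the block.
import hmac, hashlib, json, time

SHARED_KEY = b"namenode-datanode-shared-secret"   # assumed shared secret

def issue_block_token(user, block_id, modes, lifetime_s=600):
    payload = {"user": user, "block": block_id, "modes": modes,
               "expires": int(time.time()) + lifetime_s}
    raw = json.dumps(payload, sort_keys=True).encode()
    sig = hmac.new(SHARED_KEY, raw, hashlib.sha1).hexdigest()
    return payload, sig

def verify_block_token(payload, sig, requested_mode):
    raw = json.dumps(payload, sort_keys=True).encode()
    expected = hmac.new(SHARED_KEY, raw, hashlib.sha1).hexdigest()
    return (hmac.compare_digest(sig, expected)
            and requested_mode in payload["modes"]
            and payload["expires"] > time.time())

token, sig = issue_block_token("alice", block_id=42, modes=["READ"])
print(verify_block_token(token, sig, "READ"))    # True
print(verify_block_token(token, sig, "WRITE"))   # False: mode not granted
```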
Proposed Solution – Use Case 2: Submitting Jobs 1) A user obtains a delegation token through Kerberos. 2) The token is given to the user's jobs for subsequent authentication to the NameNode as that user. 3) Jobs can use the delegation token to access data that the user/app has access to.
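A similarly hedged sketch of the delegation-token idea: after the initial Kerberos login, the NameNode issues a token derived from a master key only it knows, and the user's MapReduce tasks later present that token to authenticate as the user without carrying Kerberos credentials. All function and field names below are illustrative assumptions, not the real Hadoop API.

```python
# Hedged sketch of a delegation token: the NameNode derives a token
# password from a master key; tasks holding the token can later
# re-authenticate to the NameNode as the original user.
import hmac, hashlib, json, time

MASTER_KEY = b"namenode-master-key"   # assumed: known only to the NameNode

def issue_delegation_token(owner, renewer, max_lifetime_s=86400):
    ident = {"owner": owner, "renewer": renewer,
             "issued": int(time.time()),
             "max_date": int(time.time()) + max_lifetime_s}
    raw = json.dumps(ident, sort_keys=True).encode()
    password = hmac.new(MASTER_KEY, raw, hashlib.sha1).hexdigest()
    return ident, password            # handed to the user's jobs

def authenticate_with_token(ident, password):
    raw = json.dumps(ident, sort_keys=True).encode()
    expected = hmac.new(MASTER_KEY, raw, hashlib.sha1).hexdigest()
    return (hmac.compare_digest(password, expected)
            and ident["max_date"] > time.time())

token, pwd = issue_delegation_token(owner="alice", renewer="jobtracker")
print(authenticate_with_token(token, pwd))   # tasks can act as "alice"
```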
Core Principles Analysis: Confidentiality • Users/apps will only have access to the data blocks they should have, via block tokens → Pass
Core Principles Analysis: Integrity • Data is only available at the block level if the block token matches → Pass • There is an assumption that the data is good, because the blocks themselves are not checked → Fail
Core Principles Analysis: Availability • JobTracker and NameNode are single points of failure for the system → Fail • Tokens persist for a short period of time, so the system is resilient to short outages of the NameNode and JobTracker → Pass
Conclusion • The token method of authentication for both data and process access makes sense in a highly distributed system like Hadoop. However, the fact that tokens carry so much power and are not constantly re-checked leaves this design open to serious time-of-check-to-time-of-use (TOCTOU) attacks. • Compared to the current model (i.e., no security), this represents a major step forward.
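To make the TOCTOU concern concrete, here is a toy sketch (plain Python with a hypothetical access-control list, not Hadoop code): the permission is checked only when the token is issued, so revoking access afterwards does not stop an already-issued token until it expires.

```python
# Toy TOCTOU illustration: the ACL is consulted at token issuance
# (time of check), but not again when the token is used (time of use).
import time

acl = {"alice": {"/data/secret"}}          # hypothetical access-control list

def issue_token(user, path, lifetime_s=600):
    if path in acl.get(user, set()):       # time of check
        return {"user": user, "path": path,
                "expires": time.time() + lifetime_s}
    return None

def read_with_token(token, path):
    # time of use: only the token itself is checked, not the current ACL
    return (token is not None and token["path"] == path
            and token["expires"] > time.time())

token = issue_token("alice", "/data/secret")
acl["alice"].remove("/data/secret")        # access revoked after issuance
print(read_with_token(token, "/data/secret"))  # True: stale token still works
```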
The End Questions?