
Securing Hadoop by MapR's Senior Principal Technologist Keys Botzum


DESCRIPTION

Historically, security hasn't been a high priority for Hadoop, a reflection of the type of data and the organizations that used it. Now Hadoop is being adopted by more traditional firms with heightened security requirements. MapR's Senior Principal Technologist, Keys Botzum, explains how you can build a more secure cluster.


Page 1: Securing Hadoop by MapR's Senior Principal Technologist Keys Botzum


Securing Hadoop

Keys Botzum, Senior Principal Technologist

March 2014

Page 2: Securing Hadoop by MapR's Senior Principal Technologist Keys Botzum


Agenda

• What’s MapR

• Why Secure Hadoop

• Securing MapR Hadoop

• Security beyond the core

Page 3: Securing Hadoop by MapR's Senior Principal Technologist Keys Botzum


MapR Data Platform

[Diagram: MapR Distribution for Hadoop. The MapR Data Platform (MAPR-FS for files, MAPR-DB for tables, patent pending), with management tooling, underpins the Apache Hadoop and OSS ecosystem: Hive/Stinger/Tez, Drill, Impala, Shark, Hue, Sqoop, Flume, Mahout, Cascading, Solr, Spark, Sentry, Storm, Zookeeper, Whirr, YARN, MapReduce, Pig, HBase, Oozie, and more.]

Enterprise-grade
• High availability
• Data protection
• Disaster recovery

Inter-operability
• Standard file access
• Standard database access
• Pluggable services
• Broad developer support

Security
• Enterprise security authorization
• Wire-level authentication
• Data governance

Operational
• Ability to support predictive analytics, real-time database operations, and high arrival rate data

Multi-tenancy
• Ability to logically divide a cluster to support different use cases, job types, user groups, and administrators

Performance
• Ability to deliver 2X to 7X performance
• Consistent low latency

Page 4: Securing Hadoop by MapR's Senior Principal Technologist Keys Botzum


The Cloud Leaders Pick MapR

• Google chose MapR to provide Hadoop on Google Compute Engine

• Amazon EMR is the largest Hadoop provider in revenue and # of clusters

Page 5: Securing Hadoop by MapR's Senior Principal Technologist Keys Botzum


Why Secure Hadoop Now?

• Historically security wasn't a high priority
  – Reflection of the type of data and the type of organizations using Hadoop

• Hadoop is now being used by more traditional firms as well as organizations with high security requirements
  – Highly regulated
  – Sensitive data sets
  – People with experience with security in existing enterprise technologies (e.g., databases) are asking for the same in Hadoop

• Think for a moment and imagine the value of the data in a Hadoop cluster used as a data lake
  – Much valuable operational data about your customers, systems, sales, etc.

Page 6: Securing Hadoop by MapR's Senior Principal Technologist Keys Botzum


Typical Hadoop Deployment Weaknesses

• Client operating system is trusted to identify the user (weak authentication)
  – If I can compromise the client, I can run jobs or access HDFS as anyone (see the example after this list)
  – Think about virtual machines with root access

• Hadoop servers trust anyone that can reach them on the network
  – Could I falsify a data node, job tracker, etc.?

• Hive Server runs as a 'system' user
  – All Hive Server submitted jobs run as that 'system' user

• Intruders can see and modify all network traffic
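
To illustrate the first weakness: with stock Hadoop's default "simple" authentication, the servers trust whatever identity the client presents, so anyone with shell access to a client machine can act as another user. A minimal sketch; the path, usernames, and output are illustrative.

$ hadoop fs -ls /secure/hr-data
ls: Permission denied: user=mallory, access=READ_EXECUTE, inode="/secure/hr-data"
$ HADOOP_USER_NAME=etladmin hadoop fs -ls /secure/hr-data
Found 1 items
-rw-r-----   3 etladmin hr    1048576 2014-03-01 09:12 /secure/hr-data/salaries.csv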

Page 7: Securing Hadoop by MapR's Senior Principal Technologist Keys Botzum


Agenda

• What’s MapR

• Why Secure Hadoop

• Securing MapR Hadoop

• Security beyond the core

Page 8: Securing Hadoop by MapR's Senior Principal Technologist Keys Botzum


MapR 3.1: Securing MapR Hadoop

• Core goals
  – Authenticate network traffic
    • Users authenticate
    • Servers authenticate to each other
  – Encrypt network traffic
  – Authorization
    • Integrate with existing authorization functionality
    • Enhance MapR Tables authorization with fine-grained controls
  – Low barrier to entry
    • Low performance overhead
    • Simple and easy to administer
    • Support, but do not require, Kerberos
  – Leverage Apache Hadoop functionality

Page 9: Securing Hadoop by MapR's Senior Principal Technologist Keys Botzum


MapR Native Security

• Hadoop security without Kerberos
  – But borrows heavily from Kerberos design

• Kerberos integration if desired

Page 10: Securing Hadoop by MapR's Senior Principal Technologist Keys Botzum


Architecture

• Shared secrets, like Kerberos
  – Managed at cluster level
  – Two shared keys: cldb key and server key

• Identity represented using a ticket which is issued by MapR CLDB servers (Container Location DataBase)

Page 11: Securing Hadoop by MapR's Senior Principal Technologist Keys Botzum


Tickets

• A ticket represents a valid authenticated identity
• Contains
  – An expiration time, renewal lifetime, and creation time
  – A randomly generated secret key
  – Information about the identity: userid, group ids
• Signed and encrypted when issued by CLDB
  – CLDB key used for 'permanent' server tickets
  – Server key used for ephemeral tickets issued for users

• A client authenticates to trusted servers using the ticket

Page 12: Securing Hadoop by MapR's Senior Principal Technologist Keys Botzum


User Experience

• User invokes maprlogin
  – maprlogin connects to the CLDB (over https)
    • Provides userid & password (or Kerberos ticket) for validation by the CLDB
  – Ticket is returned and saved in a file in /tmp that is accessible only by the owning user; the file name is /tmp/maprticket_<uid> (see the check after this list)

• MapR PAM module
  – Optional MapR-provided PAM module creates MapR tickets automatically during Unix login

• All processes automatically pick up the ticket (nothing to do)
  – Java and C/C++ clients implicitly look for a valid ticket and use it
  – Clients optionally use an existing Kerberos identity to get a MapR ticket
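
A quick way to confirm the ticket was created; the file size, timestamp, and uid shown are illustrative, the path is as described above.

$ maprlogin password
[Password for user 'fred' at cluster 'my.cluster.com': ]
$ ls -l /tmp/maprticket_$(id -u)
-rw------- 1 fred fred 233 Mar 10 13:25 /tmp/maprticket_1001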

Page 13: Securing Hadoop by MapR's Senior Principal Technologist Keys Botzum


Maprlogin

• Primary user-visible security tool
• Actions are (see the example after this list)
  – password: authenticate to a MapR cluster using a valid password
  – kerberos: authenticate to a MapR cluster using Kerberos
  – print: print information on your existing credentials
  – authtest: test authentication as a generic client
  – end / logout: log out of the cluster
  – renew: renew an existing ticket

• User information is obtained using PAM and the Linux pwent APIs
  – Fully pluggable
  – MapR can authenticate using any registry that is PAM enabled and gets user information via Unix APIs, which are NSSwitch controlled
    • Basically, if it works with Linux authentication, it should work with MapR
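
For instance, after an initial login the print, renew, and end actions manage the existing ticket. A sketch; the print output shown is illustrative rather than the exact format.

$ maprlogin print
Opening keyfile /tmp/maprticket_1001
my.cluster.com: user = fred, created = Mon Mar 10 13:25:00, expires = Tue Mar 11 01:25:00
$ maprlogin renew
$ maprlogin end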

Page 14: Securing Hadoop by MapR's Senior Principal Technologist Keys Botzum


CLI Example

$ hadoop fs -ls /
Bad connection to FS. command aborted. exception: failure to login: Unable to obtain MapR credentials

$ maprlogin password
[Password for user 'fred' at cluster 'my.cluster.com': ]
MapR credentials of user 'fred' for cluster 'my.cluster.com' are written to '/tmp/maprticket_1001'

$ hadoop fs -ls /
Found 3 items
-rwxr-xr-x   3 mapr mapr   0 2013-12-10 13:25 /hbase
drwxr-xr-x   - mapr mapr   1 2013-12-10 13:25 /user
drwxr-xr-x   - mapr mapr   1 2013-12-10 13:25 /var

Page 15: Securing Hadoop by MapR's Senior Principal Technologist Keys Botzum


Maprlogin – Under the Covers

[Diagram: maprlogin client, MapR CLDB (backed by LDAP/Kerberos/NIS), and FileServer/CLDB]

1. username/password sent over https to the CLDB
2. CLDB uses PAM to authenticate (against LDAP/Kerberos/NIS)
3. ticket + user key returned
4. ticket + key saved in a file in /tmp
5. hadoop fs -ls / picks up the ticket + key from the file
6. client sends an RPC encrypted with the user key + ticket
7. server decrypts the ticket to authenticate the user and checks permissions on the ACL

Page 16: Securing Hadoop by MapR's Senior Principal Technologist Keys Botzum


Client First Contact

• Client sends the ticket and data encrypted using the secret key
• Receiving server
  – Extracts and decrypts the ticket to obtain the secret key
  – Checks expiration
  – Uses the secret key to decrypt the data
    • This proves that the client possesses the key that corresponds to the ticket
  – Extracts identity information from the ticket and uses that for authorization
  – Returns an encrypted response to the client

• MapR user identity is independent of host or operating system identity

Page 17: Securing Hadoop by MapR's Senior Principal Technologist Keys Botzum


Server First Contact

• When a trusted server starts it uses a local server ticket to authenticate to the CLDB
  – CLDB verifies the ticket's authenticity using the secret key
  – CLDB returns the server key that is used to create and validate user tickets
  – The server is now a trusted member of the cluster

Page 18: Securing Hadoop by MapR's Senior Principal Technologist Keys Botzum


Component Security

• Security between MapR-unique components (CLDB, file server, etc.) is handled via changes to the MapR RPC layer
• Apache components support pluggable security mechanisms, typically SASL
  – We are providing a new mechanism called 'maprsasl'
  – maprsasl secures communication following the same techniques as the MapR RPC layer

• Existing authorization code simply leverages the securely authenticated identity
  – File access
  – Job submission
  – Queue ACLs
  – And so on …

Page 19: Securing Hadoop by MapR's Senior Principal Technologist Keys Botzum


Example: Job Tracker Integration

The JT can create user tickets. The TT copies the ticket to a private job directory on local disk; the taskcontroller copies it to the user's private local disk directory, and tasks set MAPR_TICKET_LOCATION to that place.

[Diagram: JobClient, JobTracker, TaskTracker, and the file system]

1. JobClient copies the job conf securely to the file system (job submitted over maprsasl)
2. JobTracker creates a user ticket (job scheduled to the TaskTracker over maprsasl)
3. TaskTracker fetches the ticket
4. TaskTracker launches the job using the ticket identity

Page 20: Securing Hadoop by MapR's Senior Principal Technologist Keys Botzum


Out of the Box Defaults

• User experience
  – Users authenticate using maprlogin and passwords
  – User 'mapr' is admin as always
    • User must authenticate however; OS identity is irrelevant
  – Operating system identity (on or off cluster) no longer relevant to MapR security
    • Obviously the root user and the 'mapr' user can read/write /opt/mapr
    • We've also tightened permissions for many directories under /opt/mapr
  – Web UIs require authentication
  – MapR CLIs require authentication
    • hadoop fs/mfs/jar/job/etc.
    • maprcli
  – Any user can submit jobs, but can only admin their own jobs

Page 21: Securing Hadoop by MapR's Senior Principal Technologist Keys Botzum


Out of the Box Defaults

• Cluster operations
  – All MapR servers authenticate to each other
    • Most communication paths encrypted
  – All nodes share a common maprserverticket
    • Nodes can only join the cluster if they have the maprserverticket
  – Self-signed wildcard certificates created for HTTPS traffic (see the check after this list)
    • ssl_keystore contains the certificate and private key; ssl_truststore contains the certificate
    • We set the JVM system property javax.net.ssl.trustStore
      – Used by Web UIs, MCS, and maprlogin to CLDB
    • Uses the hostname command to get the DNS domain for the cluster and puts that into the certificate
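
To inspect the generated self-signed certificate, the standard JDK keytool can list the truststore; the output below is abbreviated and the wildcard common name is illustrative.

$ keytool -list -v -keystore /opt/mapr/conf/ssl_truststore
Enter keystore password:
Owner: CN=*.example.com
Issuer: CN=*.example.com
...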

Page 22: Securing Hadoop by MapR's Senior Principal Technologist Keys Botzum


Cryptography

• Encrypted using current NIST standards
  – AES-256 in GCM mode for encryption and integrity protection
    • http://en.wikipedia.org/wiki/Galois/Counter_Mode
    • NIST standard: http://csrc.nist.gov/publications/fips/fips140-2/fips1402annexa.pdf
  – Leverage Intel hardware encryption where available, software otherwise

• Use the open source crypto++ library for our C++ cryptography
  – http://cryptopp.com

• Random number generation
  – Use secure random number generation as documented at http://www.cryptopp.com/docs/ref/class_auto_seeded_random_pool.html#_details

Page 23: Securing Hadoop by MapR's Senior Principal Technologist Keys Botzum


Let's Build a Secure Cluster!

Node 1
  apt-get install mapr…
  configure.sh -C … -Z … -secure -genkeys
  (generates all needed keys for MapR-RPC as well as for HTTPS)

Node N
  apt-get install mapr…
  scp rootORmapr@node1:/opt/mapr/conf/{cldb.key,maprserverticket,ssl_keystore,ssl_truststore} /opt/mapr/conf
  configure.sh -C … -Z … -secure

Clients
  apt-get install mapr…
  scp anyuser@nodeN:/opt/mapr/conf/ssl_truststore /opt/mapr/conf
  configure.sh … -secure
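
A quick sanity check once the secure cluster is up, assuming the defaults described earlier (the 'mapr' admin user and the maprcli tool on the PATH):

$ maprlogin password
[Password for user 'mapr' at cluster 'my.cluster.com': ]
$ maprcli node list
$ hadoop fs -ls /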

Page 24: Securing Hadoop by MapR's Senior Principal Technologist Keys Botzum


Kerberos

• Not required, but can be used
• Kerberos SSO
  – Explicitly, using 'maprlogin kerberos' (see the example after this list)
  – Implicitly
    • If no MapR ticket is available, the client automatically detects an existing Kerberos ticket and uses it to obtain a MapR ticket

• Kerberos SSO requires only
  – Kerberos client on CLDB and client machines
  – Kerberos identity only for the CLDB, typically 3-5 CLDBs
    • No need to manage identities for every node
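
A sketch of the explicit flow; the principal, realm, and cluster names are placeholders and the maprlogin output format is illustrative.

$ kinit fred@EXAMPLE.COM
Password for fred@EXAMPLE.COM:
$ maprlogin kerberos
MapR credentials of user 'fred' for cluster 'my.cluster.com' are written to '/tmp/maprticket_1001'
$ hadoop fs -ls /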

Page 25: Securing Hadoop by MapR's Senior Principal Technologist Keys Botzum


Agenda

• What’s MapR

• Why Secure Hadoop

• Securing MapR Hadoop

• Security beyond the core

Page 26: Securing Hadoop by MapR's Senior Principal Technologist Keys Botzum


Hadoop Map Reduce Clients

• Many components simply generate Map Reduce jobs, and as such they implicitly leverage the security we've defined for Map Reduce previously. They are:
  – Hive (except Hive Server)
  – Pig
  – Mahout
  – Sqoop

Page 27: Securing Hadoop by MapR's Senior Principal Technologist Keys Botzum


Ecosystem Security

• All ecosystem components run securely as well in a secure MapR cluster
  – Some by default
  – Some with minor configuration

• Most Web UIs enhanced to use userid & password authentication and HTTPS
  – Can configure Kerberos SPNEGO, same as from Apache

Page 28: Securing Hadoop by MapR's Senior Principal Technologist Keys Botzum


MapR Ecosystem Security – by Default

• By default, out of the box when security is enabled
  – Hive Server 2 supports password authentication (see the example after this list)
    • Can configure Kerberos and SSL, same as from Apache, including secure impersonation
  – Oozie supports MapR ticket authentication
    • Can configure Kerberos and SSL, same as from Apache, including secure impersonation

• HBase and the Hive Metastore server require Kerberos to be secured
• MapR Tables (HBase APIs) use native MapR security; no configuration needed
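
For example, with password authentication on Hive Server 2, a standard Beeline client can connect with a userid and password; the host, port, and credentials below are placeholders.

$ beeline -u "jdbc:hive2://hs2node.example.com:10000/default" -n fred -p secret
Connecting to jdbc:hive2://hs2node.example.com:10000/default
0: jdbc:hive2://hs2node.example.com:10000/default> show tables;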

Page 29: Securing Hadoop by MapR's Senior Principal Technologist Keys Botzum


MapR Tables Authorization

• Boolean logic constraints on access to M7 tables
  – Uses user & group information
  – Very powerful
    • (u:bob | g:admins)
    • (g:managers & !g:restricted)
    • (g:managers & g:businessunity) | g:executives
  – Settable at the table, column, and column family level for various actions (a sketch of setting one follows)
  – Queries silently hide data you are not authorized to see
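
A hedged sketch of applying such an expression with maprcli; the subcommand and flag names are assumptions for illustration rather than verified syntax, and the table path and group names are made up.

$ maprcli table cf edit -path /user/fred/mytable -cfname cf1 \
    -readperm "u:bob | g:admins" \
    -writeperm "g:managers & !g:restricted"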

Page 30: Securing Hadoop by MapR's Senior Principal Technologist Keys Botzum


MapR Hadoop Advantage

• Vastly simpler
  – Core secured by default in one step
  – No requirement for Kerberos in the core, and none of its associated complexity

• Easier integration
  – Leverage existing Linux authentication (PAM and NSSwitch)

• Faster
  – Leverage Intel AES hardware cryptography

Page 32: Securing Hadoop by MapR's Senior Principal Technologist Keys Botzum


Q & A

@mapr maprtech

[email protected]

Engage with us!

MapR

maprtech

mapr-technologies

Page 33: Securing Hadoop by MapR's Senior Principal Technologist Keys Botzum


Appendix

Page 34: Securing Hadoop by MapR's Senior Principal Technologist Keys Botzum


Encrypted Shuffle (?)

• No need to special-case encrypting the shuffle
• MapR-FS is the store for Map output
  – The shuffle inherits the same encryption, authentication, and authorization functionality as the rest of MapR-FS

Page 35: Securing Hadoop by MapR's Senior Principal Technologist Keys Botzum


Persistent Keys and Tickets

[Diagram: persistent keys and tickets; the key K is held on CLDB/ZK 1 through CLDB/ZK N, with Node 1, Node 2, … Node N below]

Page 36: Securing Hadoop by MapR's Senior Principal Technologist Keys Botzum


Apache Hadoop Security

• Kerberos as the core authentication technology
  – Kerberos to access HDFS, JT, Oozie, etc.
  – Kerberos for server-to-server traffic

• But Kerberos doesn't fit perfectly with the Hadoop model
  – Introduces delegation tokens for carrying identity in many scenarios

• Kerberos is complicated
  – Need a Kerberos identity for every server in the cluster
    • Lots to manage!
  – Every user needs a Kerberos identity to access the cluster, Web UIs, etc.
  – Lots of steps
    • http://www.cloudera.com/content/cloudera-content/cloudera-docs/CDH4/4.3.0/CDH4-Security-Guide/cdh4sg_topic_3.html

Page 37: Securing Hadoop by MapR's Senior Principal Technologist Keys Botzum


Key Design Elements

• User authentication and authorization information obtained using standard operating system information: PAM and nsswitch

• MapR-specific shared secret keys
  – Easier to manage
  – No dependencies on complex external security systems
  – Better performance

• MapR servers (running as 'mapr') have access to the maprserverticket and are therefore privileged processes

• MapR-RPC altered to encrypt and authenticate traffic
• Maprsasl created for Apache Java code to leverage similar security
  – Leverages the same keys, authentication model, etc.
  – Reuses the C/C++ code via JNI