.. _hadoop_server_requires_kerberos_authentication:
*********************************************
Hadoop Requires Kerberos Authentication
*********************************************
If your Hadoop server requires Kerberos authentication, follow the steps below. In the commands that follow, ``<REALM>`` stands for your Kerberos realm and ``<host>`` for a Hadoop host name.

1. Run ``kadmin -p root/<admin>@<REALM>`` (your Kerberos admin principal) and create a principal for the ``sqream`` user. If you do not know the Kerberos root credentials, connect to the Kerberos server as root over SSH and run ``kadmin.local`` instead; it will not ask you for a password. At the kadmin prompt, run::

      addprinc sqream@<REALM>

2. If ``addprinc`` did not ask for a password, set one explicitly::

      change_password sqream@<REALM>
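   Optionally, you can confirm the principal was created from the same kadmin prompt (``getprinc`` is a standard kadmin command that prints the principal's attributes)::

      getprinc sqream@<REALM>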
3. Connect to the Hadoop name node with SSH.

4. Change to the Cloudera agent's process directory::

      cd /var/run/cloudera-scm-agent/process

5. List the directory so that the most recently updated entries appear last, and note the name of the most recently updated folder; it looks like ``<number>-hdfs-<something>``::

      ls -lrt

6. Change into that folder::

      cd <number>-hdfs-<something>
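   If several folders match, a possible one-liner that changes into the most recently modified one (a convenience sketch, assuming the layout above)::

      cd "$(ls -dt *hdfs* | head -1)"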
7. In this folder you should see an ``hdfs.keytab`` file, or some other ``.keytab`` file.
8. Copy the keytab file to the home directory of the ``sqream`` user on every remote machine from which you plan to work with Hadoop.
9. Copy ``core-site.xml`` and ``hdfs-site.xml`` to the SQream server, into ``<sqream folder>/hdfs/hadoop/etc/hadoop``.
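   A minimal sketch of both copies using ``scp`` (``sqream-server`` is a hypothetical host name, and ``<sqream folder>`` is a placeholder for your actual installation path)::

      # keytab to the sqream user's home directory
      scp hdfs.keytab sqream@sqream-server:/home/sqream/

      # Hadoop client configuration to the SQream Hadoop directory
      scp core-site.xml hdfs-site.xml "sqream@sqream-server:<sqream folder>/hdfs/hadoop/etc/hadoop/"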
10. Connect to the SQream server and make sure that the keytab file is owned by the ``sqream`` user and has the correct permissions::

       sudo chown sqream:sqream /home/sqream/hdfs.keytab
       sudo chmod 600 /home/sqream/hdfs.keytab
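    To verify, list the file; the owner, group, and mode should match the comment below::

       ls -l /home/sqream/hdfs.keytab
       # -rw------- 1 sqream sqream ... /home/sqream/hdfs.keytab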
11. On the SQream server, as the ``sqream`` user, check from the home directory which Kerberos principals this keytab represents::

       klist -kt hdfs.keytab

    You should get something similar to this::

       sqream@Host-121 ~ $ klist -kt hdfs.keytab
       Keytab name: FILE:hdfs.keytab
       KVNO Timestamp           Principal
       ---- ------------------- ------------------------------------------------------
          5 09/15/2020 18:03:05 HTTP/<host>@<REALM>
          5 09/15/2020 18:03:05 HTTP/<host>@<REALM>
          5 09/15/2020 18:03:05 HTTP/<host>@<REALM>
          5 09/15/2020 18:03:05 HTTP/<host>@<REALM>
          5 09/15/2020 18:03:05 HTTP/<host>@<REALM>
          5 09/15/2020 18:03:05 HTTP/<host>@<REALM>
          5 09/15/2020 18:03:05 HTTP/<host>@<REALM>
          5 09/15/2020 18:03:05 HTTP/<host>@<REALM>
          5 09/15/2020 18:03:05 hdfs/<host>@<REALM>
          5 09/15/2020 18:03:05 hdfs/<host>@<REALM>
          5 09/15/2020 18:03:05 hdfs/<host>@<REALM>
          5 09/15/2020 18:03:05 hdfs/<host>@<REALM>
          5 09/15/2020 18:03:05 hdfs/<host>@<REALM>
          5 09/15/2020 18:03:05 hdfs/<host>@<REALM>
          5 09/15/2020 18:03:05 hdfs/<host>@<REALM>
          5 09/15/2020 18:03:05 hdfs/<host>@<REALM>
12. Note the name of the ``hdfs`` service principal in the list, here ``hdfs/<host>@<REALM>``, and obtain a ticket for it::

       kinit -kt hdfs.keytab hdfs/<host>@<REALM>
13. Run ``klist`` to check the result. You should see something like this::

       Ticket cache: FILE:/tmp/krb5cc_1000
       Default principal: hdfs/<host>@<REALM>

       Valid starting       Expires              Service principal
       09/16/2020 13:44:18  09/17/2020 13:44:18  krbtgt/<REALM>@<REALM>
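    Note the ``Expires`` column: the ticket is time-limited, so if HDFS access later starts failing with authentication errors, re-run the ``kinit`` command from step 12.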
14. List the HDFS root directory::

       hadoop fs -ls hdfs://<hadoop server name or ip>:8020/

    If you get the listing, continue to the next step.
    If not, move on to troubleshooting.

15. Troubleshooting:

    * Make sure your environment is set correctly. If any of the following are empty, you missed step #2 (a sketch of the expected variables follows this list)::

         echo $JAVA_HOME
         echo $SQREAM_HOME
         echo $CLASSPATH
         echo $HADOOP_COMMON_LIB_NATIVE_DIR
         echo $LD_LIBRARY_PATH
         echo $PATH

    * Make sure you copied the correct keytab file.
    * Go through this guide again and check whether you missed any steps.
    * Cry for help.
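    A minimal sketch of what those variables might look like (the paths below are hypothetical; the real values come from step #2 and your actual install locations)::

       export JAVA_HOME=/usr/lib/jvm/java                  # hypothetical path
       export SQREAM_HOME=/home/sqream/sqream              # hypothetical path
       export HADOOP_COMMON_LIB_NATIVE_DIR=$SQREAM_HOME/hdfs/hadoop/lib/native
       export CLASSPATH=$($SQREAM_HOME/hdfs/hadoop/bin/hadoop classpath --glob)
       export LD_LIBRARY_PATH=$HADOOP_COMMON_LIB_NATIVE_DIR:$LD_LIBRARY_PATH
       export PATH=$SQREAM_HOME/hdfs/hadoop/bin:$PATH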