This Hadoop cluster consists of one master node and three worker nodes, and every node needs to be able to log in to the others over SSH without a password. This article walks through how to set that up.
I. Hadoop Cluster Environment
The cluster consists of four hosts: one master (hadoop01) and three workers (hadoop02, hadoop03 and hadoop04); their IP-to-hostname mappings are listed in step 1 below.
II. How Passwordless Login Works
A host can log in without a password to any machine whose authorized_keys file contains that host's SSH public key. So all that is needed is for every host's authorized_keys file to contain the public keys of all the other hosts that should be able to log in without a password.
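Concretely, authorized_keys is a plain-text file with one public key per line. A file holding all four nodes' keys would look something like this (illustrative only, key bodies truncated):

```
ssh-rsa AAAAB3NzaC1yc2E... root@hadoop01
ssh-rsa AAAAB3NzaC1yc2E... root@hadoop02
ssh-rsa AAAAB3NzaC1yc2E... root@hadoop03
ssh-rsa AAAAB3NzaC1yc2E... root@hadoop04
```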
III. Setup Steps
1. Configure the hosts file on every node
#vim /etc/hosts
192.168.44.3 hadoop01
192.168.44.4 hadoop02
192.168.44.5 hadoop03
192.168.44.6 hadoop04
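Before copying the mapping to all four nodes, a quick format check can catch typos. This is a minimal sketch (the scratch file path and the awk check are just for illustration):

```shell
# Write the mapping to a scratch file and check each line is "IP hostname"
cat > /tmp/hosts.snippet <<'EOF'
192.168.44.3 hadoop01
192.168.44.4 hadoop02
192.168.44.5 hadoop03
192.168.44.6 hadoop04
EOF
valid=$(awk 'NF==2 && $1 ~ /^[0-9]+\.[0-9]+\.[0-9]+\.[0-9]+$/ {n++} END {print n+0}' /tmp/hosts.snippet)
echo "$valid of 4 entries are well-formed"
```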
2. Generate an SSH key pair on every node
[root@hadoop01 ~]# ssh-keygen -t rsa
Generating public/private rsa key pair.
Enter file in which to save the key (/root/.ssh/id_rsa):
Created directory '/root/.ssh'.
Enter passphrase (empty for no passphrase):
Enter same passphrase again:
Your identification has been saved in /root/.ssh/id_rsa.
Your public key has been saved in /root/.ssh/id_rsa.pub.
...
[root@hadoop01 .ssh]# ls
id_rsa id_rsa.pub
Running the command creates a .ssh directory under ~, containing the private key id_rsa and the public key id_rsa.pub.
Note: ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa generates the key without any of the interactive prompts above.
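The non-interactive form can be tried out without touching a real node's ~/.ssh by pointing it at a scratch directory (the directory is arbitrary; -N '' is the documented flag for an empty passphrase, equivalent here to the -P '' shown above):

```shell
# -q silences output, -N '' sets an empty passphrase, -f picks the key path
keydir=$(mktemp -d)
ssh-keygen -q -t rsa -N '' -f "$keydir/id_rsa"
ls "$keydir"   # should list id_rsa and id_rsa.pub
```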
3. On the master node, copy the public key into the dedicated file authorized_keys.
[root@hadoop01 ~]# cd .ssh
[root@hadoop01 .ssh]# ls
id_rsa id_rsa.pub
[root@hadoop01 .ssh]# cp id_rsa.pub authorized_keys
[root@hadoop01 .ssh]# ls
authorized_keys id_rsa id_rsa.pub
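One caveat worth checking at this step: with sshd's default StrictModes yes, the key is silently ignored if ~/.ssh or authorized_keys is group- or world-writable. A sketch of the expected permissions, demonstrated on a scratch directory standing in for /root/.ssh:

```shell
sshdir=$(mktemp -d)                    # stand-in for /root/.ssh
touch "$sshdir/authorized_keys"
chmod 700 "$sshdir"                    # rwx for the owner only
chmod 600 "$sshdir/authorized_keys"    # rw for the owner only
stat -c '%a %n' "$sshdir" "$sshdir/authorized_keys"
```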
4. Copy authorized_keys to the next node, then append that node's own public key (id_rsa.pub) to the file.
#On hadoop01, copy the file to the remote host with scp
[root@hadoop01 .ssh]# scp authorized_keys root@hadoop02:/root/.ssh/
The authenticity of host 'hadoop02 (192.168.44.11)' can't be established.
ECDSA key fingerprint is SHA256:MyB1zs0E3J/fm8pC0AN8ycsgEIBNHtUqd9xS0WAyv3s.
ECDSA key fingerprint is MD5:88:48:3a:ba:3e:14:a7:d7:86:f6:51:74:00:10:f9:00.
Are you sure you want to continue connecting (yes/no)? yes
Warning: Permanently added 'hadoop02,192.168.44.11' (ECDSA) to the list of known hosts.
root@hadoop02's password:
authorized_keys 100% 395 306.2KB/s 00:00
#Log in to hadoop02
[root@hadoop02 ~]# cd .ssh/
[root@hadoop02 .ssh]# ls
authorized_keys id_rsa id_rsa.pub
[root@hadoop02 .ssh]# cat id_rsa.pub >> authorized_keys #append with >>, don't overwrite
5. Repeat step 4 down the chain: append hadoop03's and hadoop04's public keys to authorized_keys in turn, then copy the finished file from hadoop04 back to the other three nodes (hadoop01, hadoop02 and hadoop03).
#On hadoop02, pass the accumulated file on to hadoop03
[root@hadoop02 .ssh]# scp authorized_keys root@hadoop03:/root/.ssh/
#Log in to hadoop03, append its public key, and pass the file on
[root@hadoop03 .ssh]# cat id_rsa.pub >> authorized_keys
[root@hadoop03 .ssh]# scp authorized_keys root@hadoop04:/root/.ssh/
#Log in to hadoop04 and append its public key
[root@hadoop04 .ssh]# cat id_rsa.pub >> authorized_keys
#Copy the final authorized_keys to hadoop01, hadoop02 and hadoop03
[root@hadoop04 .ssh]# scp authorized_keys root@hadoop01:/root/.ssh/
[root@hadoop04 .ssh]# scp authorized_keys root@hadoop02:/root/.ssh/
[root@hadoop04 .ssh]# scp authorized_keys root@hadoop03:/root/.ssh/
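Steps 4 and 5 amount to accumulating every node's public key into one file and then fanning that file back out. The file flow can be sketched locally, with one directory per node standing in for the four hosts and cp standing in for scp (the FAKEKEY strings are placeholders, not real keys):

```shell
work=$(mktemp -d)
nodes="hadoop01 hadoop02 hadoop03 hadoop04"

# Each "node" starts with only its own public key
for n in $nodes; do
  mkdir "$work/$n"
  echo "ssh-rsa FAKEKEY_$n root@$n" > "$work/$n/id_rsa.pub"
done

# Accumulate: pass the file down the chain, appending each node's key
: > "$work/authorized_keys"
for n in $nodes; do
  cat "$work/$n/id_rsa.pub" >> "$work/authorized_keys"
done

# Fan out: every node ends up with the complete file
for n in $nodes; do
  cp "$work/authorized_keys" "$work/$n/authorized_keys"
done

wc -l < "$work/hadoop02/authorized_keys"   # 4 lines, one key per node
```

On a real cluster, ssh-copy-id root@host is a convenient shortcut for the same append step, though it prompts for each host's password once.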
6. Verify passwordless login
Use ssh user@hostname (or ssh ip-address) to confirm that no password prompt appears.
[root@hadoop01 .ssh]# ssh root@hadoop02
Last login: Tue Feb 12 03:59:46 2019 from 192.168.44.1
[root@hadoop02 .ssh]# ssh root@hadoop01
Last login: Tue Feb 12 21:27:24 2019 from hadoop04
[root@hadoop03 .ssh]# ssh root@hadoop04
Last login: Tue Feb 12 04:00:47 2019 from 192.168.44.1
[root@hadoop04 .ssh]# ssh root@hadoop01
Last login: Tue Feb 12 21:26:44 2019 from hadoop02
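With four nodes, checking every pair by hand gets tedious. A loop with BatchMode=yes (which makes ssh fail instead of falling back to a password prompt) reports any node where key login is not yet working; this is only a sketch, and on a machine that cannot reach these hosts it simply reports FAILED for each:

```shell
results=""
for n in hadoop01 hadoop02 hadoop03 hadoop04; do
  # BatchMode=yes: never ask for a password; ConnectTimeout keeps it quick
  if ssh -o BatchMode=yes -o ConnectTimeout=5 "root@$n" true 2>/dev/null; then
    results="$results $n:OK"
  else
    results="$results $n:FAILED"
  fi
done
echo "$results"
```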