Prepare SSH for the fully distributed hosts
1. On every host, remove the old keys under /home/centos/.ssh/
$>cd ~/.ssh
$>rm -rf *
2. Generate a key pair on the s200 host
$>ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
3. Copy s200's public key file id_rsa.pub to hosts 200 ~ 203 (including s200 itself, so it can ssh into itself),
installing it as /home/centos/.ssh/authorized_keys
$>scp id_rsa.pub centos@mini_200:/home/centos/.ssh/authorized_keys
$>scp id_rsa.pub centos@mini_201:/home/centos/.ssh/authorized_keys
$>scp id_rsa.pub centos@mini_202:/home/centos/.ssh/authorized_keys
$>scp id_rsa.pub centos@mini_203:/home/centos/.ssh/authorized_keys
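The four scp commands above can be collapsed into one loop. A dry-run sketch that only prints the commands (remove the `echo` to actually copy the key; hostnames are taken from the commands above):

```shell
# Dry-run: print the key-distribution commands from step 3.
# Remove `echo` to actually install the public key on each host.
for host in mini_200 mini_201 mini_202 mini_203; do
  echo scp ~/.ssh/id_rsa.pub "centos@${host}:/home/centos/.ssh/authorized_keys"
done
```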
4. Configure fully distributed mode (/soft/hadoop/etc/hadoop/)
[core-site.xml]
fs.defaultFS
hdfs://s250/
[hdfs-site.xml]
dfs.replication
3
[mapred-site.xml]
unchanged
[yarn-site.xml]
yarn.resourcemanager.hostname
s250
yarn.nodemanager.aux-services
mapreduce_shuffle
[slaves]
s251
s252
s253
[hadoop-env.sh]
...
export JAVA_HOME=/soft/jdk
...
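The property list above maps onto the standard Hadoop `<property>` XML layout. A sketch of core-site.xml and hdfs-site.xml with the values from step 4 (the other files follow the same pattern):

```xml
<!-- core-site.xml: NameNode address from the list above -->
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://s250/</value>
  </property>
</configuration>

<!-- hdfs-site.xml: replication factor from the list above -->
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>3</value>
  </property>
</configuration>
```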
5. Distribute the configuration
$>cd /soft/hadoop/etc/
$>scp -r full centos@s251:/soft/hadoop/etc/
$>scp -r full centos@s252:/soft/hadoop/etc/
$>scp -r full centos@s253:/soft/hadoop/etc/
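The three distribution commands can likewise be run as a loop; a dry-run sketch (prints only; drop the `echo` to run):

```shell
# Dry-run: print the config-distribution commands from step 5.
for host in s251 s252 s253; do
  echo scp -r full "centos@${host}:/soft/hadoop/etc/"
done
```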
6. Remove the symlinks
$>cd /soft/hadoop/etc
$>rm hadoop
$>ssh s251 rm /soft/hadoop/etc/hadoop
$>ssh s252 rm /soft/hadoop/etc/hadoop
$>ssh s253 rm /soft/hadoop/etc/hadoop
7. Create the symlinks
$>cd /soft/hadoop/etc/
$>ln -s full hadoop
$>ssh s251 ln -s /soft/hadoop/etc/full /soft/hadoop/etc/hadoop
$>ssh s252 ln -s /soft/hadoop/etc/full /soft/hadoop/etc/hadoop
$>ssh s253 ln -s /soft/hadoop/etc/full /soft/hadoop/etc/hadoop
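Steps 6 and 7 can be folded into one pass: `ln -sfn` replaces an existing symlink in place, so the separate `rm` is not needed. A dry-run sketch (prints only):

```shell
# Dry-run: print the symlink-repoint commands for the workers.
# `ln -sfn` overwrites an existing link, covering steps 6 and 7 together.
for host in s251 s252 s253; do
  echo ssh "$host" ln -sfn /soft/hadoop/etc/full /soft/hadoop/etc/hadoop
done
```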
8. Remove the temporary files (in /tmp)
$>cd /tmp
$>sudo rm -rf *
$>ssh s251 rm -rf /tmp/*
$>ssh s252 rm -rf /tmp/*
$>ssh s253 rm -rf /tmp/*
9. Remove the hadoop logs
$>cd /soft/hadoop/logs
$>rm -rf *
$>ssh s251 rm -rf /soft/hadoop/logs/*
$>ssh s252 rm -rf /soft/hadoop/logs/*
$>ssh s253 rm -rf /soft/hadoop/logs/*
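Steps 8 and 9 perform the same cleanup on every worker; a dry-run loop (prints only, drop the `echo` to run):

```shell
# Dry-run: print the per-node cleanup commands from steps 8-9.
# Single quotes keep the globs from expanding on the local host.
for host in s251 s252 s253; do
  echo ssh "$host" 'rm -rf /tmp/* /soft/hadoop/logs/*'
done
```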
10. Format the filesystem
$>hadoop namenode -format
11. Start the hadoop daemons
$>start-all.sh
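After start-all.sh, it is worth confirming the daemons actually came up on each node with `jps`. A dry-run sketch (assumes s250 is the master per the config above; drop the `echo` to run the checks):

```shell
# Dry-run: print the per-node jps checks. On the master (s250) expect
# NameNode/SecondaryNameNode/ResourceManager; on workers, DataNode/NodeManager.
for host in s250 s251 s252 s253; do
  echo ssh "$host" jps
done
```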