1. The three HDFS processes are NameNode, DataNode, and SecondaryNameNode. HDFS is one component of Hadoop; Hadoop has three components in total: HDFS, YARN, and MapReduce, responsible for storage, resource scheduling, and computation respectively.
To have all three HDFS processes start on hadoop002, the configuration files must be edited. The steps are as follows:
The configuration files live under etc/hadoop of the Hadoop home directory:
[hadoop@hadoop001 hadoop]$ pwd
/home/hadoop/app/hadoop-2.6.0-cdh5.7.0/etc/hadoop
[hadoop@hadoop002 hadoop]$ ll
total 140
-rw-r--r-- 1 hadoop hadoop   884 Feb 13 22:34 core-site.xml            # core configuration: settings shared by hdfs, mapreduce and yarn
-rw-r--r-- 1 hadoop hadoop  4294 Feb 13 22:30 hadoop-env.sh            # JDK directory and hadoop home directory
-rw-r--r-- 1 hadoop hadoop   867 Feb 13 22:34 hdfs-site.xml
-rw-r--r-- 1 hadoop hadoop 11291 Mar 24  2016 log4j.properties
-rw-r--r-- 1 hadoop hadoop   758 Mar 24  2016 mapred-site.xml.template
-rw-r--r-- 1 hadoop hadoop    10 Mar 24  2016 slaves
-rw-r--r-- 1 hadoop hadoop   690 Mar 24  2016 yarn-site.xml
Files with the .cmd suffix are for Windows deployment and have already been deleted here.
Production and learning environments alike: do not deploy with IP addresses; deploy uniformly by hostname. Then only /etc/hosts needs editing (do not delete its first and second lines, the default localhost entries). The reason: if you deploy by IP and the network segment changes, redeploying everything is painful; if you deploy by hostname, you only have to update the IP-to-hostname mappings in the hosts file.
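As a concrete illustration, an /etc/hosts file for this kind of hostname-based deployment might look like the sketch below (the IP addresses are made-up placeholders; the first two localhost lines stay untouched):

```
127.0.0.1   localhost localhost.localdomain
::1         localhost localhost.localdomain
192.168.1.101   hadoop001
192.168.1.102   hadoop002
192.168.1.103   hadoop003
```

If the network segment later changes, only the addresses in the last three lines change; every Hadoop config file that references hadoop001/002/003 stays as-is.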
NameNode process: [hadoop@hadoop002 hadoop]$ vi core-site.xml
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://hadoop002:9000</value>   <!-- the hostname goes here -->
  </property>
</configuration>
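As a quick sanity check of the value set above, the fs.defaultFS entry can be read back out of a core-site.xml with plain POSIX tools; a minimal sketch (the /tmp path is just an example, not a real cluster location):

```shell
# Write a minimal core-site.xml like the one above to a scratch location
cat > /tmp/core-site-demo.xml <<'EOF'
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://hadoop002:9000</value>
  </property>
</configuration>
EOF

# Print the <value> that follows the fs.defaultFS <name> line
grep -A1 '<name>fs.defaultFS</name>' /tmp/core-site-demo.xml \
  | sed -n 's:.*<value>\(.*\)</value>.*:\1:p'
# → hdfs://hadoop002:9000
```

On a live installation, `hdfs getconf -confKey fs.defaultFS` reports the same value directly.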
DataNode process: [hadoop@hadoop002 hadoop]$ vi slaves
The slaves file lists the hostnames of the machines on which DataNodes start, one hostname per line; it can hold many machines, e.g. hadoop001, hadoop002 and hadoop003.
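A three-machine slaves file following the convention above would simply be:

```
hadoop001
hadoop002
hadoop003
```

start-dfs.sh reads this file and starts one DataNode on each listed machine.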
SecondaryNameNode process:
[hadoop@hadoop001 hadoop]$ vi hdfs-site.xml
<property>
  <name>dfs.namenode.secondary.http-address</name>
  <value>hadoop002:50090</value>
</property>
<property>
  <name>dfs.namenode.secondary.https-address</name>
  <value>hadoop002:50091</value>
</property>
These property names come from the official defaults list at http://hadoop.apache.org/docs/stable/hadoop-project-dist/hadoop-hdfs/hdfs-default.xml (open hadoop.apache.org in a browser, find hdfs-default.xml in the left-hand column, click through, then search for "secondary" to locate the matching names).
[hadoop@hadoop002 hadoop-2.6.0-cdh5.7.0]$ cd
[hadoop@hadoop002 ~]$ ll
total 8
drwxrwxr-x 3 hadoop hadoop 4096 Feb 13 22:21 app
drwxrwxr-x 8 hadoop hadoop 4096 Feb 13 20:46 d5
[hadoop@hadoop002 ~]$ ll -a
total 68
drwx------ 7 hadoop hadoop  4096 Feb 16 20:25 .
drwxr-xr-x. 5 root   root   4096 Oct 22 10:45 ..
drwxrwxr-x 3 hadoop hadoop  4096 Feb 13 22:21 app
-rw------- 1 hadoop hadoop 15167 Feb 13 23:29 .bash_history
-rw-r--r-- 1 hadoop hadoop    18 Mar 23  2017 .bash_logout
-rw-r--r-- 1 hadoop hadoop   293 Sep 19 23:22 .bash_profile
-rw-r--r-- 1 hadoop hadoop   124 Mar 23  2017 .bashrc
drwxrwxr-x 8 hadoop hadoop  4096 Feb 13 20:46 d5
drwxrw---- 3 hadoop hadoop  4096 Sep 19 17:01 .pki
drwx------ 2 hadoop hadoop  4096 Feb 13 22:39 .ssh
drwxr-xr-x 2 hadoop hadoop  4096 Oct 14 20:57 .vim
-rw------- 1 hadoop hadoop  8995 Feb 16 20:25 .viminfo
[hadoop@hadoop002 ~]$ ll .ssh
total 16
-rw------- 1 hadoop hadoop  398 Feb 13 22:37 authorized_keys
-rw------- 1 hadoop hadoop 1675 Feb 13 22:36 id_rsa
-rw-r--r-- 1 hadoop hadoop  398 Feb 13 22:36 id_rsa.pub
-rw-r--r-- 1 hadoop hadoop  780 Feb 13 22:49 known_hosts
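The .ssh listing above is the end result of passwordless-SSH setup, which start-dfs.sh needs in order to log in to the machines listed in slaves. A minimal sketch of producing such a key pair, done here in a scratch directory so the real ~/.ssh is untouched:

```shell
# Generate an RSA key pair with no passphrase in a scratch directory
mkdir -p /tmp/ssh-demo && cd /tmp/ssh-demo
ssh-keygen -t rsa -f id_rsa -N "" -q

# Authorizing a key means appending its public half to authorized_keys
cat id_rsa.pub >> authorized_keys
chmod 600 authorized_keys   # sshd rejects group/world-readable key files

ls -l
```

On a real cluster the files live in ~/.ssh, and each machine's id_rsa.pub must appear in the authorized_keys of every machine it needs to reach.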
Reposted from: https://www.cnblogs.com/xuziyu/p/10403353.html