The root directory for this setup is D:\hadoop.
Unpack Hadoop 2.7.1 and hadooponwindows-master, then copy the files from hadooponwindows-master into the hadoop-2.7.1 directory, overwriting any conflicts.
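The unpack-and-copy step can be sketched in a Windows command prompt as follows; the extraction paths under D:\hadoop are assumptions, so adjust them to where you actually unpacked the two archives:

```shell
:: Copy everything from hadooponwindows-master (winutils.exe, hadoop.dll, etc.)
:: into the Hadoop 2.7.1 tree; /E copies subdirectories, /Y overwrites without prompting.
:: Paths assume both archives were extracted directly under D:\hadoop.
xcopy /E /Y D:\hadoop\hadooponwindows-master\* D:\hadoop\hadoop-2.7.1\
```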
Edit the etc/hadoop/core-site.xml file:
<configuration>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/D:/hadoop/data/tmp</value>
  </property>
  <property>
    <name>dfs.name.dir</name>
    <value>/D:/hadoop/data/name</value>
  </property>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
Edit the etc/hadoop/hdfs-site.xml file:
<configuration>
  <!-- Replication is set to 1 because this is a single-node Hadoop setup -->
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
  <property>
    <name>dfs.data.dir</name>
    <value>/D:/hadoop/data/data</value>
  </property>
</configuration>
Format the NameNode: bin/hadoop namenode -format
Start the daemons with sbin/start-all.cmd, which produces output like:
This script is Deprecated. Instead use start-dfs.cmd and start-yarn.cmd
Error: JAVA_HOME is incorrectly set.
Please update D:\hadoop\hadoop-2.7.1\conf\hadoop-env.cmd
To fix this, edit the etc/hadoop/hadoop-env.cmd file and set the JAVA_HOME variable to the correct JDK installation directory.
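A minimal sketch of the relevant line in hadoop-env.cmd; the JDK path shown is a hypothetical example, so substitute your actual installation directory. Note that a JAVA_HOME containing spaces (such as C:\Program Files) is known to break this script, and the 8.3 short name PROGRA~1 is a common workaround:

```shell
:: In etc\hadoop\hadoop-env.cmd -- point JAVA_HOME at the JDK root (example path, adjust to yours):
set JAVA_HOME=D:\Java\jdk1.8.0_181
:: If the JDK lives under "C:\Program Files", avoid the space via the 8.3 short name:
:: set JAVA_HOME=C:\PROGRA~1\Java\jdk1.8.0_181
```

After correcting JAVA_HOME, rerun sbin/start-all.cmd (or start-dfs.cmd and start-yarn.cmd); once HDFS is up, the NameNode web UI should be reachable at http://localhost:50070.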