I downloaded the Hadoop source code from GitHub and compiled it with the native option:
mvn package -Pdist,native -DskipTests -Dtar -Dmaven.javadoc.skip=true
I then copied the .dylib files to $HADOOP_HOME/lib:
cp -p hadoop-common-project/hadoop-common/target/hadoop-common-2.7.1/lib/native/*.dylib /usr/local/Cellar/hadoop/2.7.2/libexec/share/hadoop/lib
The LD_LIBRARY_PATH was updated and hdfs was restarted:
echo $LD_LIBRARY_PATH
/usr/local/Cellar/hadoop/2.7.2/libexec/lib:
/usr/local/Cellar/hadoop/2.7.2/libexec/share/hadoop/common/lib:/Library/Java/JavaVirtualMachines/jdk1.8.0_92.jdk/Contents/Home//jre/lib
(Note: this also means that the answer to "Hadoop 'Unable to load native-hadoop library for your platform' error on docker-spark?" does not work for me.)
But checknative still returns uniformly false:
$stop-dfs.sh && start-dfs.sh && hadoop checknative
16/06/13 16:12:32 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Stopping namenodes on [sparkbook]
sparkbook: stopping namenode
localhost: stopping datanode
Stopping secondary namenodes [0.0.0.0]
0.0.0.0: stopping secondarynamenode
16/06/13 16:12:50 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/06/13 16:12:50 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [sparkbook]
sparkbook: starting namenode, logging to /usr/local/Cellar/hadoop/2.7.2/libexec/logs/hadoop-macuser-namenode-sparkbook.out
localhost: starting datanode, logging to /usr/local/Cellar/hadoop/2.7.2/libexec/logs/hadoop-macuser-datanode-sparkbook.out
Starting secondary namenodes [0.0.0.0]
0.0.0.0: starting secondarynamenode, logging to /usr/local/Cellar/hadoop/2.7.2/libexec/logs/hadoop-macuser-secondarynamenode-sparkbook.out
16/06/13 16:13:05 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/06/13 16:13:05 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Native library checking:
hadoop: false
zlib: false
snappy: false
lz4: false
bzip2: false
openssl: false
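When checknative reports false across the board, a useful next step is to raise Hadoop's console log level so NativeCodeLoader prints details about the paths it searched. HADOOP_ROOT_LOGGER is a standard Hadoop knob; the guard below is only there so the sketch runs even where hadoop is not on PATH:

```shell
# Raise Hadoop's console log level so NativeCodeLoader reports why the
# native library failed to load (wrong path vs. wrong architecture).
export HADOOP_ROOT_LOGGER=DEBUG,console
if command -v hadoop >/dev/null 2>&1; then
  hadoop checknative -a 2>&1 | grep -i 'library'
else
  echo "hadoop not on PATH"
fi
```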
To get this working on a fresh install of macOS 10.12, I had to do the following:
Install build dependencies using Homebrew:
brew install cmake maven openssl protobuf@2.5 snappy
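Before running Maven, it is worth confirming the toolchain, since the Hadoop 2.x build requires protobuf 2.5.0 exactly. A small check (my own sketch; it only inspects what is on PATH):

```shell
# Hadoop 2.x's build requires protoc 2.5.0 exactly; check before running mvn.
if command -v protoc >/dev/null 2>&1; then
  protoc --version          # should print "libprotoc 2.5.0"
else
  echo "protoc not on PATH"
fi
```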
Check out the Hadoop source code:
git clone https://github.com/apache/hadoop.git
cd hadoop
git checkout rel/release-2.7.3
Apply the patch below to the build:
diff --git a/hadoop-common-project/hadoop-common/src/CMakeLists.txt b/hadoop-common-project/hadoop-common/src/CMakeLists.txt
index 942b19c..8b34881 100644
--- a/hadoop-common-project/hadoop-common/src/CMakeLists.txt
+++ b/hadoop-common-project/hadoop-common/src/CMakeLists.txt
@@ -16,6 +16,8 @@
# limitations under the License.
#
+SET(CUSTOM_OPENSSL_PREFIX /usr/local/opt/openssl)
+
cmake_minimum_required(VERSION 2.6 FATAL_ERROR)
# Default to release builds
@@ -116,8 +118,8 @@ set(T main/native/src/test/org/apache/hadoop)
GET_FILENAME_COMPONENT(HADOOP_ZLIB_LIBRARY ${ZLIB_LIBRARIES} NAME)
SET(STORED_CMAKE_FIND_LIBRARY_SUFFIXES ${CMAKE_FIND_LIBRARY_SUFFIXES})
-set_find_shared_library_version("1")
-find_package(BZip2 QUIET)
+set_find_shared_library_version("1.0")
+find_package(BZip2 REQUIRED)
if (BZIP2_INCLUDE_DIR AND BZIP2_LIBRARIES)
GET_FILENAME_COMPONENT(HADOOP_BZIP2_LIBRARY ${BZIP2_LIBRARIES} NAME)
set(BZIP2_SOURCE_FILES
diff --git a/hadoop-common-project/hadoop-common/src/main/conf/core-site.xml b/hadoop-common-project/hadoop-common/src/main/conf/core-site.xml
index d2ddf89..ac8e351 100644
--- a/hadoop-common-project/hadoop-common/src/main/conf/core-site.xml
+++ b/hadoop-common-project/hadoop-common/src/main/conf/core-site.xml
@@ -17,4 +17,8 @@
<!-- Put site-specific property overrides in this file. -->
<configuration>
+<property>
+<name>io.compression.codec.bzip2.library</name>
+<value>libbz2.dylib</value>
+</property>
</configuration>
diff --git a/hadoop-tools/hadoop-pipes/pom.xml b/hadoop-tools/hadoop-pipes/pom.xml
index 34c0110..70f23a4 100644
--- a/hadoop-tools/hadoop-pipes/pom.xml
+++ b/hadoop-tools/hadoop-pipes/pom.xml
@@ -52,7 +52,7 @@
<mkdir dir="${project.build.directory}/native"/>
<exec executable="cmake" dir="${project.build.directory}/native"
failonerror="true">
- <arg line="${basedir}/src/ -DJVM_ARCH_DATA_MODEL=${sun.arch.data.model}"/>
+ <arg line="${basedir}/src/ -DJVM_ARCH_DATA_MODEL=${sun.arch.data.model} -DOPENSSL_ROOT_DIR=/usr/local/opt/openssl"/>
</exec>
<exec executable="make" dir="${project.build.directory}/native" failonerror="true">
<arg line="VERBOSE=1"/>
Build Hadoop from source:
mvn package -Pdist,native -DskipTests -Dtar -Dmaven.javadoc.skip=true
Specify JAVA_LIBRARY_PATH when running hadoop:
$ JAVA_LIBRARY_PATH=/usr/local/opt/openssl/lib:/opt/local/lib:/usr/lib hadoop-dist/target/hadoop-2.7.3/bin/hadoop checknative -a
16/10/14 20:16:32 INFO bzip2.Bzip2Factory: Successfully loaded & initialized native-bzip2 library libbz2.dylib
16/10/14 20:16:32 INFO zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
Native library checking:
hadoop: true /Users/admin/Desktop/hadoop/hadoop-dist/target/hadoop-2.7.3/lib/native/libhadoop.dylib
zlib: true /usr/lib/libz.1.dylib
snappy: true /usr/local/lib/libsnappy.1.dylib
lz4: true revision:99
bzip2: true /usr/lib/libbz2.1.0.dylib
openssl: true /usr/local/opt/openssl/lib/libcrypto.dylib
There are some missing steps in @andrewdotn's response above:
1) For step (3), create the patch by pasting the posted text into a file, e.g. "patch.txt", and then run "git apply patch.txt".
2) In addition to copying the files as directed by javadba, certain applications also require that you set:
export HADOOP_OPTS="-Djava.library.path=${HADOOP_HOME}/lib/native"
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:${HADOOP_HOME}/lib/native
export JAVA_LIBRARY_PATH=$JAVA_LIBRARY_PATH:${HADOOP_HOME}/lib/native
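The patch step in (1) can be sketched end to end on a throwaway repo. The repo, file name, and diff below are invented purely to demonstrate the `git apply --check` / `git apply` flow:

```shell
# Demo of the patch workflow on a scratch repo: save the diff as patch.txt,
# dry-run with --check, then apply for real.
tmp=$(mktemp -d)
cd "$tmp"
git init -q repo
cd repo
printf 'hello\n' > greeting.txt
git add greeting.txt
git -c user.email=me@example.com -c user.name=me commit -qm init
cat > patch.txt <<'EOF'
diff --git a/greeting.txt b/greeting.txt
--- a/greeting.txt
+++ b/greeting.txt
@@ -1 +1 @@
-hello
+hello world
EOF
git apply --check patch.txt   # exits non-zero if the patch would not apply
git apply patch.txt
cat greeting.txt              # now reads "hello world"
```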
The needed step is to copy the *.dylib files from the git source build dir into the $HADOOP_HOME/<common dir>/lib dir for your platform. For OS X installed via brew it is:
cp /git/hadoop/hadoop-dist/target/hadoop-2.7.1/lib/native/*.dylib /usr/local/Cellar/hadoop/2.7.2/libexec/share/hadoop/common/
We can see the required libs there now:
$ll /usr/local/Cellar/hadoop/2.7.2/libexec/share/hadoop/common/*.dylib
-rwxr-xr-x 1 macuser staff 149100 Jun 13 13:44 /usr/local/Cellar/hadoop/2.7.2/libexec/share/hadoop/common/libhadoop.dylib
-rwxr-xr-x 1 macuser staff 149100 Jun 13 13:44 /usr/local/Cellar/hadoop/2.7.2/libexec/share/hadoop/common/libhadoop.1.0.0.dylib
And now the hadoop checknative command works:
$hadoop checknative
16/06/15 09:10:59 INFO zlib.ZlibFactory: Successfully loaded & initialized native-zlib library
Native library checking:
hadoop: true /usr/local/Cellar/hadoop/2.7.2/libexec/share/hadoop/common/libhadoop.dylib
zlib: true /usr/lib/libz.1.dylib
snappy: false
lz4: true revision:99
bzip2: false
openssl: false build does not support openssl.
As an update to @andrewdotn's answer, here is the patch.txt file to be used with Hadoop 2.8.1:
diff --git a/hadoop-common-project/hadoop-common/src/CMakeLists.txt b/hadoop-common-project/hadoop-common/src/CMakeLists.txt
index c93bfe78546..e8918f9ca29 100644
--- a/hadoop-common-project/hadoop-common/src/CMakeLists.txt
+++ b/hadoop-common-project/hadoop-common/src/CMakeLists.txt
@@ -20,6 +20,8 @@
# CMake configuration.
#
+SET(CUSTOM_OPENSSL_PREFIX /usr/local/opt/openssl)
+
cmake_minimum_required(VERSION 2.6 FATAL_ERROR)
list(APPEND CMAKE_MODULE_PATH ${CMAKE_SOURCE_DIR}/..)
@@ -50,8 +52,8 @@ get_filename_component(HADOOP_ZLIB_LIBRARY ${ZLIB_LIBRARIES} NAME)
# Look for bzip2.
set(STORED_CMAKE_FIND_LIBRARY_SUFFIXES ${CMAKE_FIND_LIBRARY_SUFFIXES})
-hadoop_set_find_shared_library_version("1")
-find_package(BZip2 QUIET)
+hadoop_set_find_shared_library_version("1.0")
+find_package(BZip2 REQUIRED)
if(BZIP2_INCLUDE_DIR AND BZIP2_LIBRARIES)
get_filename_component(HADOOP_BZIP2_LIBRARY ${BZIP2_LIBRARIES} NAME)
set(BZIP2_SOURCE_FILES
diff --git a/hadoop-common-project/hadoop-common/src/main/conf/core-site.xml b/hadoop-common-project/hadoop-common/src/main/conf/core-site.xml
index d2ddf893e49..ac8e351f1c8 100644
--- a/hadoop-common-project/hadoop-common/src/main/conf/core-site.xml
+++ b/hadoop-common-project/hadoop-common/src/main/conf/core-site.xml
@@ -17,4 +17,8 @@
<!-- Put site-specific property overrides in this file. -->
<configuration>
+<property>
+<name>io.compression.codec.bzip2.library</name>
+<value>libbz2.dylib</value>
+</property>
</configuration>
diff --git a/hadoop-tools/hadoop-pipes/pom.xml b/hadoop-tools/hadoop-pipes/pom.xml
index 8aafad0f7eb..d4832542265 100644
--- a/hadoop-tools/hadoop-pipes/pom.xml
+++ b/hadoop-tools/hadoop-pipes/pom.xml
@@ -55,7 +55,7 @@
<mkdir dir="${project.build.directory}/native"/>
<exec executable="cmake" dir="${project.build.directory}/native"
failonerror="true">
- <arg line="${basedir}/src/ -DJVM_ARCH_DATA_MODEL=${sun.arch.data.model}"/>
+ <arg line="${basedir}/src/ -DJVM_ARCH_DATA_MODEL=${sun.arch.data.model} -DOPENSSL_ROOT_DIR=/usr/local/opt/openssl"/>
</exec>
<exec executable="make" dir="${project.build.directory}/native" failonerror="true">
<arg line="VERBOSE=1"/>
I'm using a Mac, and my Java version is:
$java -version
java version "1.8.0_111"
Java(TM) SE Runtime Environment (build 1.8.0_111-b14)
Java HotSpot(TM) 64-Bit Server VM (build 25.111-b14, mixed mode)
I followed this link: https://dtflaneur.wordpress.com/2015/10/02/installing-hadoop-on-mac-osx-el-capitan/
I first ran brew install hadoop, configured the ssh connection and the xml files as required, and then ran:
start-dfs.sh
start-yarn.sh
The screen output is like this:
$start-dfs.sh
17/05/06 09:58:32 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [localhost]
localhost: namenode running as process 74213. Stop it first.
localhost: starting datanode, logging to /usr/local/Cellar/hadoop/2.7.3/libexec/logs/hadoop-x-datanode-xdeMacBook-Pro.local.out
Starting secondary namenodes [0.0.0.0]
0.0.0.0: secondarynamenode running as process 74417. Stop it first.
17/05/06 09:58:39 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Then with jps I cannot see "DataNode". I suppose DataNode is an hdfs module and ResourceManager is a yarn module:
$jps
74417 SecondaryNameNode
75120 Jps
74213 NameNode
74539 ResourceManager
74637 NodeManager
I can list hdfs files:
$hdfs dfs -ls /
17/05/06 09:58:59 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Found 1 items
drwxr-xr-x - x supergroup 0 2017-05-05 23:50 /user
But running the pi example throws an exception:
$hadoop jar /usr/local/Cellar/hadoop/2.7.3/libexec/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.7.3.jar pi 2 5
Number of Maps = 2
Samples per Map = 5
17/05/06 10:19:48 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/05/06 10:19:49 WARN hdfs.DFSClient: DataStreamer Exception
org.apache.hadoop.ipc.RemoteException(java.io.IOException): File /user/x/QuasiMonteCarlo_1494037188550_135794067/in/part0 could only be replicated to 0 nodes instead of minReplication (=1). There are 0 datanode(s) running and no node(s) are excluded in this operation.
I wonder if I missed any configuration. How can I make sure the daemons run successfully, and how do I check or troubleshoot possible failure causes?
Thanks.
I am still in the learning phase too, but this error occurs when there is no DataNode available to read/write.
You can check the NameNode web UI at http://localhost:50070 to see whether any DataNode is running.
For troubleshooting, check the logs generated under the Hadoop installation directory. If you can share those logs, I can try to help.
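For the missing-DataNode case specifically, the log line to look for is often "Incompatible clusterIDs", a common aftermath of re-running namenode -format. A hedged sketch, assuming the Homebrew log directory used elsewhere in this thread:

```shell
# Scan the newest DataNode log for typical failure signatures.
LOGDIR=/usr/local/Cellar/hadoop/2.7.3/libexec/logs
latest=$(ls -t "$LOGDIR"/hadoop-*-datanode-*.log 2>/dev/null | head -n 1)
if [ -n "$latest" ]; then
  grep -nE 'ERROR|Incompatible clusterIDs' "$latest" | tail -n 20
else
  echo "no datanode log found under $LOGDIR"
fi
```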
I have built Hadoop 2.7.3 from source and everything succeeded. I am using a prebuilt Spark 2.0 binary with Hadoop 2.7 support. When I start spark-shell, I get this warning:
16/09/23 14:53:24 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
hadoop checknative -a gives me
16/09/23 14:59:47 WARN bzip2.Bzip2Factory: Failed to load/initialize native-bzip2 library system-native, will use pure-Java version
16/09/23 14:59:47 WARN zlib.ZlibFactory: Failed to load/initialize native-zlib library
Native library checking:
hadoop: true D:\hadoop-2.7.3\bin\hadoop.dll
zlib: false
snappy: false
lz4: true revision:99
bzip2: false
openssl: false build does not support openssl.
winutils: true D:\hadoop-2.7.3\bin\winutils.exe
16/09/23 14:59:47 INFO util.ExitUtil: Exiting with status 1
Do I have to build native versions of all the libraries? I checked the Hadoop build instructions and could not find any information about building the other libraries.
Or maybe there is some misconfiguration in my Spark setup, but I could not figure out what. I have these environment variables set for my Spark:
set HADOOP_HOME=D:/hadoop-2.7.3
set HADOOP_CONF_DIR=%HADOOP_HOME%/etc/hadoop
set SPARK_HOME=D:/spark-2.0.0-bin-hadoop2.7
set HADOOP_COMMON_LIB_NATIVE_DIR=%HADOOP_HOME%/bin
set SPARK_LOCAL_IP=
alpesh#alpesh-Inspiron-3647:~/hadoop-2.7.2/sbin$ hadoop fs -ls
16/07/05 13:59:17 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
It also shows me the output below:
$ hadoop checknative -a
16/07/05 14:00:42 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Native library checking:
hadoop: false
zlib: false
snappy: false
lz4: false
bzip2: false
openssl: false
16/07/05 14:00:42 INFO util.ExitUtil: Exiting with status 1
Please help me solve this.
The library you are using was compiled for 32-bit, but you are running a 64-bit JVM. Open the .bashrc file where your Hadoop configuration lives and go to this line:
export HADOOP_OPTS="-Djava.library.path=$HADOOP_INSTALL/lib"
and replace it with
export HADOOP_OPTS="-Djava.library.path=$HADOOP_INSTALL/lib/native"
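You can confirm the mismatch before touching java.library.path: `file` reports the word size a binary or shared object was built for. The helper below is a sketch, demonstrated on /bin/sh so it is self-contained; substitute the libhadoop path for the real check:

```shell
# Report the word size (32-bit or 64-bit) a binary or shared object was built for.
check_bits() { file -L "$1" | grep -Eo '(32|64)-bit' | head -n 1; }
check_bits /bin/sh
# check_bits "$HADOOP_INSTALL/lib/native/libhadoop.so.1.0.0"
```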
To get rid of this error:
Suppose the jar file is at /home/cloudera/test.jar and the class file is at /home/cloudera/workspace/MapReduce/bin/mapreduce/WordCount, where mapreduce is the package name.
The input file mytext.txt is at /user/process/mytext.txt and the output location is /user/out.
We should run this mapreduce program in the following way:
$hadoop jar /home/cloudera/test.jar mapreduce.WordCount /user/process /user/out
Add these properties to the Hadoop user's bash profile and the issue will be solved:
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_COMMON_LIB_NATIVE_DIR"
It's just a warning: Hadoop cannot load the native library, either because it was not compiled for your platform or because it does not exist, so it falls back to the builtin Java classes.
If I were you, I would simply silence it.
To do that, add the following line to the corresponding log4j configuration file:
log4j.logger.org.apache.hadoop.util.NativeCodeLoader=ERROR
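If helpful, the line can be appended from the shell. $HADOOP_CONF_DIR below is an assumption about where your log4j.properties lives (the sketch falls back to /tmp purely so it runs anywhere):

```shell
# Append the silencing rule to Hadoop's log4j.properties (path assumed).
CONF_DIR="${HADOOP_CONF_DIR:-/tmp}"
echo 'log4j.logger.org.apache.hadoop.util.NativeCodeLoader=ERROR' \
  >> "$CONF_DIR/log4j.properties"
tail -n 1 "$CONF_DIR/log4j.properties"
```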
I am running into the following issue after using Homebrew to install Hadoop. I followed the guide here:
http://glebche.appspot.com/static/hadoop-ecosystem/hadoop-hive-tutorial.html
I set the following environment variables in bashrc:
export JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk1.7.0_55.jdk/Contents/Home
export HADOOP_INSTALL=/usr/local/Cellar/hadoop/2.3.0
export HADOOP_HOME=$HADOOP_INSTALL
export PATH=$PATH:$HADOOP_INSTALL/bin
export PATH=$PATH:$HADOOP_INSTALL/sbin
export HADOOP_MAPRED_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_HOME=$HADOOP_INSTALL
export HADOOP_HDFS_HOME=$HADOOP_INSTALL
export YARN_HOME=$HADOOP_INSTALL
After running hadoop namenode -format, I attempt to run start-dfs.sh and get the following:
14/05/05 21:19:27 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [localhost]
localhost: set hadoop variables
localhost: starting namenode, logging to /usr/local/Cellar/hadoop/2.3.0/libexec/logs/mynotebook.local.out
localhost: Error: Could not find or load main class org.apache.hadoop.hdfs.server.namenode.NameNode
localhost: set hadoop variables
localhost: starting datanode, logging to /usr/local/Cellar/hadoop/2.3.0/libexec/logs/mynotebook.local.out
localhost: Error: Could not find or load main class org.apache.hadoop.hdfs.server.datanode.DataNode
Starting secondary namenodes [0.0.0.0]
0.0.0.0: set hadoop variables
0.0.0.0: secondarynamenode running as process 12747. Stop it first.
14/05/05 21:19:37 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
How do I get around this issue?
Based on the first line of the second message,
"14/05/05 21:19:27 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable"
I suppose you're running Hadoop on a 64-bit operating system. The prebuilt Hadoop binaries are built for a 32-bit system by default; I had the same issue and the same message. What you have to do is rebuild Hadoop from source on your system.
I suggest the guide below; it's written for version 2.2, but it works for 2.3 too:
http://csrdu.org/nauman/2014/01/23/geting-started-with-hadoop-2-2-0-building/
Or the official guide
http://hadoop.apache.org/docs/r2.3.0/hadoop-project-dist/hadoop-common/NativeLibraries.html#Build
I want to run Hadoop on my Arch Linux machine, but I get the error below. How can I fix it?
[]# . /usr/lib/hadoop-2.2.0/sbin/start-all.sh
This script is Deprecated. Instead use start-dfs.sh and start-yarn.sh
Incorrect configuration: namenode address dfs.namenode.servicerpc-address or dfs.namenode.rpc-address is not configured.
Starting namenodes on [OpenJDK 64-Bit Server VM warning: You have loaded library /usr/lib/hadoop-2.2.0/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now.
It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'.
2013-12-10 23:21:42,602 WARN [main] util.NativeCodeLoader (NativeCodeLoader.java:<clinit>(62)) - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable]
Error: Please specify one of --hosts or --hostnames options and not both.
cat: /etc/hadoop/slaves: No such file or directory
Starting secondary namenodes [OpenJDK 64-Bit Server VM warning: You have loaded library /usr/lib/hadoop-2.2.0/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now.
It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'.
2013-12-10 23:21:44,192 WARN [main] util.NativeCodeLoader (NativeCodeLoader.java:<clinit>(62)) - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
0.0.0.0]
Error: Please specify one of --hosts or --hostnames options and not both.
starting yarn daemons
starting resourcemanager, logging to /usr/lib/hadoop-2.2.0/logs/yarn-vahid-resourcemanager-kharazi.out
2013-12-10 23:21:47,901 INFO [main] resourcemanager.ResourceManager (StringUtils.java:startupShutdownMessage(601)) - STARTUP_MSG:
/************************************************************
STARTUP_MSG: Starting ResourceManager
STARTUP_MSG: host = kharazi/192.168.1.3
STARTUP_MSG: args = []
STARTUP_MSG: version = 2.2.0
STARTUP_MSG: classpath = /etc/hadoop:/usr/lib/hadoop-2.2.0/share/hadoop/... (long classpath elided)
STARTUP_MSG: build = https://svn.apache.org/repos/asf/hadoop/common -r 1529768; compiled by 'hortonmu' on 2013-10-07T06:28Z
STARTUP_MSG: java = 1.7.0_45
************************************************************/
cat: /etc/hadoop/slaves: No such file or directory
cat: /etc/hadoop/slaves: No such file or directory
You need to fill in the /etc/hadoop/slaves file with the hostnames of all your slave nodes, one per line. Example:
host1
host2
host3
Make sure you can ssh into these nodes without a password.
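The two steps above can be sketched like this. host1..host3 are placeholders, and the script only prints the ssh-copy-id commands rather than contacting any host; run the printed commands yourself once the hostnames are real:

```shell
# Write the slaves file (one hostname per line) and print the commands that
# would distribute your public key to each slave.
SLAVES=/tmp/slaves.example    # stand-in for /etc/hadoop/slaves
printf '%s\n' host1 host2 host3 > "$SLAVES"
while read -r h; do
  echo "ssh-copy-id $h"
done < "$SLAVES"
```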