I installed Hadoop on my Mac and tried to start it using start-dfs.sh. However, I keep getting this warning: WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable. Any suggestions on what files to modify?
Ms-MacBook-Pro% ./start-dfs.sh
Starting namenodes on [localhost]
Starting datanodes
localhost: datanode is running as process 48267. Stop it first.
Starting secondary namenodes [Ms-MacBook-Pro.local]
Ms-MacBook-Pro.local: secondarynamenode is running as process 48390. Stop it first.
2019-08-31 22:39:06,151 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
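Note: the warning by itself does not stop HDFS; Hadoop simply falls back to the built-in Java implementations, exactly as the message says. If you want to see which native libraries are missing, hadoop checknative lists them. The command below is a suggestion, not output from the setup above:
hadoop checknative -a   # reports which native libraries Hadoop could load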
I'm on a Mac with this Java version:
$java -version
java version "1.8.0_111"
Java(TM) SE Runtime Environment (build 1.8.0_111-b14)
Java HotSpot(TM) 64-Bit Server VM (build 25.111-b14, mixed mode)
I followed this guide: https://dtflaneur.wordpress.com/2015/10/02/installing-hadoop-on-mac-osx-el-capitan/
I first ran brew install hadoop, configured the SSH connection and the XML files as required, and then ran:
start-dfs.sh
start-yarn.sh
The screen output is like this:
$start-dfs.sh
17/05/06 09:58:32 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [localhost]
localhost: namenode running as process 74213. Stop it first.
localhost: starting datanode, logging to /usr/local/Cellar/hadoop/2.7.3/libexec/logs/hadoop-x-datanode-xdeMacBook-Pro.local.out
Starting secondary namenodes [0.0.0.0]
0.0.0.0: secondarynamenode running as process 74417. Stop it first.
17/05/06 09:58:39 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Then, using jps, I cannot see "DataNode". I suppose DataNode is the HDFS module and ResourceManager is the YARN module:
$jps
74417 SecondaryNameNode
75120 Jps
74213 NameNode
74539 ResourceManager
74637 NodeManager
I can list hdfs files:
$hdfs dfs -ls /
17/05/06 09:58:59 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Found 1 items
drwxr-xr-x - x supergroup 0 2017-05-05 23:50 /user
But running the pi example throws an exception:
$hadoop jar /usr/local/Cellar/hadoop/2.7.3/libexec/share/hadoop/mapreduce/hadoop-mapreduce-examples-2.7.3.jar pi 2 5
Number of Maps = 2
Samples per Map = 5
17/05/06 10:19:48 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/05/06 10:19:49 WARN hdfs.DFSClient: DataStreamer Exception
org.apache.hadoop.ipc.RemoteException(java.io.IOException): File /user/x/QuasiMonteCarlo_1494037188550_135794067/in/part0 could only be replicated to 0 nodes instead of minReplication (=1). There are 0 datanode(s) running and no node(s) are excluded in this operation.
I wonder if I missed any configuration. How can I make sure that the daemons run successfully, and how can I check or troubleshoot the possible causes of failure?
Thanks.
I am still in the learning phase too. This error occurs when there is no DataNode available to read from or write to.
You can open the NameNode web UI at http://localhost:50070 and check whether any DataNode is listed as live.
For troubleshooting, you can check the logs generated under the Hadoop installation directory. If you can share those logs, I can try to help.
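For instance, something along these lines (paths are a sketch; on a brew install the logs sit under /usr/local/Cellar/hadoop/<version>/libexec/logs, as in the output above):
hdfs dfsadmin -report                                   # shows how many live DataNodes the NameNode can see
tail -n 100 $HADOOP_HOME/logs/hadoop-*-datanode-*.log   # DataNode startup errors usually end up here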
I am installing Hadoop 2.7.3 on my Ubuntu 16.04 system. I am getting the following error while trying to execute start-dfs.sh. I have checked all the configuration files.
node#hellbot:~$ start-dfs.sh
17/01/28 20:46:26 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [localhost]
localhost: starting namenode, logging to /usr/local/hadoop/logs/hadoop-node-namenode-hellbot.out
localhost: starting datanode, logging to /usr/local/hadoop/logs/hadoop-node-datanode-hellbot.out
Starting secondary namenodes [0.0.0.0]
0.0.0.0: starting secondarynamenode, logging to /usr/local/hadoop/logs/hadoop-node-secondarynamenode-hellbot.out
17/01/28 20:46:42 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Thanks in advance
I'm getting an error while running start-dfs.sh:
start-dfs.sh
16/10/02 23:10:54 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [localhost]
localhost: starting namenode, logging to /opt/hadoop/logs/hadoop-root-namenode-Web.out
localhost: nice: /home/hadoop/hadoop/bin/hdfs: No such file or directory
localhost: starting datanode, logging to /opt/hadoop/logs/hadoop-root-datanode-Web.out
localhost: nice: /home/hadoop/hadoop/bin/hdfs: No such file or directory
Starting secondary namenodes [0.0.0.0]
0.0.0.0: starting secondarynamenode, logging to /opt/hadoop/logs/hadoop-root-secondarynamenode-Web.out
0.0.0.0: nice: /home/hadoop/hadoop/bin/hdfs: No such file or directory
16/10/02 23:11:12 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
It looks like you are missing the HADOOP_HOME environment variable.
export HADOOP_HOME=/opt/hadoop
then check whether anything works:
hadoop version
Issues like yours usually come down to missing environment variables.
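A minimal sketch of what that could look like in your shell profile, assuming Hadoop really lives under /opt/hadoop as the log paths above suggest (the PATH line is an addition so that hdfs and the start scripts resolve):
export HADOOP_HOME=/opt/hadoop
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin
hadoop version   # should print the Hadoop build details if the paths are right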
I am using Rancher to manage an environment, running Hadoop + YARN (Experimental) for Flink, plus ZooKeeper, in Rancher.
I am trying to configure HDFS in flink-conf.yaml.
These are the changes I made related to HDFS:
fs.hdfs.hadoopconf: /etc/hadoop
recovery.zookeeper.storageDir: hdfs://:8020/flink/recovery
state.backend.fs.checkpointdir: hdfs://:8020/flink/checkpoints
And I get an error that says:
2016-09-06 14:10:44,965 WARN org.apache.hadoop.util.NativeCodeLoader - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
What did I do wrong?
Best regards
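For comparison, a full HDFS URI includes the NameNode host and port; a hedged sketch of those keys with a hypothetical hostname (namenode-host is a placeholder for whatever resolves to your HDFS NameNode inside Rancher):
fs.hdfs.hadoopconf: /etc/hadoop
recovery.zookeeper.storageDir: hdfs://namenode-host:8020/flink/recovery      # namenode-host is hypothetical
state.backend.fs.checkpointdir: hdfs://namenode-host:8020/flink/checkpoints  # namenode-host is hypothetical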
I am running into the following issues after using Homebrew to install Hadoop. I followed the guide here:
http://glebche.appspot.com/static/hadoop-ecosystem/hadoop-hive-tutorial.html
I set the following environment variables in .bashrc:
export JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk1.7.0_55.jdk/Contents/Home
export HADOOP_INSTALL=/usr/local/Cellar/hadoop/2.3.0
export HADOOP_HOME=$HADOOP_INSTALL
export PATH=$PATH:$HADOOP_INSTALL/bin
export PATH=$PATH:$HADOOP_INSTALL/sbin
export HADOOP_MAPRED_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_HOME=$HADOOP_INSTALL
export HADOOP_HDFS_HOME=$HADOOP_INSTALL
export YARN_HOME=$HADOOP_INSTALL
After running hadoop namenode -format, I attempt to run start-dfs.sh and get the following:
14/05/05 21:19:27 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Starting namenodes on [localhost]
localhost: set hadoop variables
localhost: starting namenode, logging to /usr/local/Cellar/hadoop/2.3.0/libexec/logs/mynotebook.local.out
localhost: Error: Could not find or load main class org.apache.hadoop.hdfs.server.namenode.NameNode
localhost: set hadoop variables
localhost: starting datanode, logging to /usr/local/Cellar/hadoop/2.3.0/libexec/logs/mynotebook.local.out
localhost: Error: Could not find or load main class org.apache.hadoop.hdfs.server.datanode.DataNode
Starting secondary namenodes [0.0.0.0]
0.0.0.0: set hadoop variables
0.0.0.0: secondarynamenode running as process 12747. Stop it first.
14/05/05 21:19:37 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
How do I get around this issue?
Based on the first line of the second message,
"14/05/05 21:19:27 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable"
I suppose that you're running Hadoop on a 64-bit operating system. The pre-built Hadoop binaries are compiled for 32-bit by default; I had the same issue and the same message. What you have to do is rebuild Hadoop from source on your system.
I suggest you use the guide below; it's written for version 2.2 but it works for 2.3 too:
http://csrdu.org/nauman/2014/01/23/geting-started-with-hadoop-2-2-0-building/
Or the official guide:
http://hadoop.apache.org/docs/r2.3.0/hadoop-project-dist/hadoop-common/NativeLibraries.html#Build
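Assuming the prerequisites from those guides are in place (JDK, Maven, protobuf 2.5.0, cmake, zlib), the native build from the Hadoop source tree boils down to roughly the following; the resulting lib/native directory then replaces the one in your installation:
mvn package -Pdist,native -DskipTests -Dtar   # builds the distribution together with the native libraries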