Hadoop/HDFS: "No such file or directory"

I have installed Hadoop 2.2 on a single machine using this tutorial: http://www.michael-noll.com/tutorials/running-hadoop-on-ubuntu-linux-single-node-cluster/
I changed a few details - for example, I used Java 8 and /hadoop as the root directory. Users, SSH setup, and config keys are the same.
Namenode was successfully formatted:
13/12/22 05:42:31 INFO common.Storage: Storage directory /hadoop/tmp/dfs/name has been successfully formatted.
13/12/22 05:42:31 INFO namenode.FSImage: Saving image file /hadoop/tmp/dfs/name/current/fsimage.ckpt_0000000000000000000 using no compression
13/12/22 05:42:32 INFO namenode.FSImage: Image file /hadoop/tmp/dfs/name/current/fsimage.ckpt_0000000000000000000 of size 198 bytes saved in 0 seconds.
13/12/22 05:42:32 INFO namenode.NNStorageRetentionManager: Going to retain 1 images with txid >= 0
13/12/22 05:42:32 INFO util.ExitUtil: Exiting with status 0
13/12/22 05:42:32 INFO namenode.NameNode: SHUTDOWN_MSG:
However, neither the 'mkdir' nor even the 'ls' command works:
$ /hadoop/hadoop/bin/hadoop fs -ls
Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library /hadoop/hadoop-2.2.0/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now.
It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'.
13/12/22 05:39:33 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
ls: `.': No such file or directory
Thanks for any help, guys.

Try
hadoop fs -ls /
Tested on Hadoop 2.4.
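If the root listing works but the plain `hadoop fs -ls` still fails, the usual follow-up is to create your HDFS home directory, since that is what the path-less form resolves against. A minimal sketch, assuming your Unix user name is hduser:
hadoop fs -mkdir -p /user/hduser
hadoop fs -ls
With /user/hduser in place, the bare ls has something to resolve `.` against.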

In Hadoop 2.4
hdfs dfs -mkdir /input
hdfs dfs -ls /

This worked in my case:
First, get the Hadoop installation path:
echo ${HADOOP_INSTALL}    # in my case the output is /user/local/hadoop
Then create a directory under your Hadoop installation path (if you already know the installation directory, skip the command above):
hadoop fs -mkdir -p /user/local/hadoop/your_directory
Here hadoop is a directory in the path.
Tested on Hadoop 2.4.

I have verified this works in Hadoop 2.5:
hdfs dfs -mkdir /input
(where /input is the HDFS directory)
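To sanity-check the new directory, you can push a local file into it and list it back. A small sketch, where sample.txt is any local file (hypothetical name):
echo "hello hdfs" > sample.txt
hdfs dfs -put sample.txt /input/
hdfs dfs -ls /input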

Related

hadoop error: util.NativeCodeLoader (hdfs dfs -ls does not work!)

I have seen a lot of folks having problems with their Hadoop installation. I went through all the related Stack Overflow questions, but could not fix mine.
The problem is:
hdfs dfs -ls
16/09/27 09:43:42 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
ls: `.': No such file or directory
I am using Ubuntu 16.04 and downloaded the stable Hadoop version 2.7.2 from an Apache mirror:
http://apache.spinellicreations.com/hadoop/common/
I have installed java and ssh already.
which java
java is /usr/bin/java
which javac
javac is /usr/bin/javac
which ssh
ssh is /usr/bin/ssh
echo $JAVA_HOME
/usr/lib/jvm/java-9-openjdk-amd64
Note:
sudo update-alternatives --config java
There are 2 choices for the alternative java (providing /usr/bin/java).
Selection Path Priority Status
------------------------------------------------------------
* 0 /usr/lib/jvm/java-9-openjdk-amd64/bin/java 1091 auto mode
1 /usr/lib/jvm/java-8-openjdk-amd64/jre/bin/java 1081 manual mode
2 /usr/lib/jvm/java-9-openjdk-amd64/bin/java 1091 manual mode
Press <enter> to keep the current choice[*], or type selection number:
Hadoop environment variables in ~/.bashrc:
export JAVA_HOME=/usr/lib/jvm/java-9-openjdk-amd64
export HADOOP_INSTALL=/home/bhishan/hadoop-2.7.2
export PATH=$PATH:$HADOOP_INSTALL/bin
export PATH=$PATH:$HADOOP_INSTALL/sbin
export HADOOP_MAPRED_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_HOME=$HADOOP_INSTALL
export HADOOP_HDFS_HOME=$HADOOP_INSTALL
export YARN_HOME=$HADOOP_INSTALL
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_INSTALL/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_INSTALL/lib"
export PATH=$PATH:$HADOOP_HOME/bin
Modification of file:
/home/bhishan/hadoop-2.7.2/etc/hadoop/hadoop-env.sh
Added one line at the end:
export JAVA_HOME=/usr/lib/jvm/java-9-openjdk-amd64
The link to hadoop-env.sh in the pastebin is here:
http://pastebin.com/a3iPjB04
Then I created some empty directories:
/home/bhishan/hadoop-2.7.2/tmp
/home/bhishan/hadoop-2.7.2/etc/hadoop/hadoop_store
/home/bhishan/hadoop-2.7.2/etc/hadoop/hadoop_store/hdfs
/home/bhishan/hadoop-2.7.2/etc/hadoop/hadoop_store/hdfs/datanode
/home/bhishan/hadoop-2.7.2/etc/hadoop/hadoop_store/hdfs/namenode
Modifications to the file: /home/bhishan/hadoop-2.7.2/etc/hadoop/hdfs-site.xml
<property>
<name>dfs.replication</name>
<value>1</value>
<description>Default block replication.
The actual number of replications can be specified when the file is created.
The default is used if replication is not specified in create time.
</description>
</property>
<property>
<name>dfs.namenode.name.dir</name>
<value>file:/home/bhishan/hadoop-2.7.2/etc/hadoop/hadoop_store/hdfs/namenode</value>
</property>
<property>
<name>dfs.datanode.data.dir</name>
<value>file:/home/bhishan/hadoop-2.7.2/etc/hadoop/hadoop_store/hdfs/datanode</value>
</property>
The link in the pastebin is this:
http://pastebin.com/cha7ZBr8
Modifications to the file /home/bhishan/hadoop-2.7.2/etc/hadoop/core-site.xml are as follows:
<property>
<name>hadoop.tmp.dir</name>
<value>/home/bhishan/hadoop-2.7.2/tmp</value>
<description>A base for other temporary directories.</description>
</property>
<property>
<name>fs.default.name</name>
<value>hdfs://localhost:54310</value>
<description>The name of the default file system. A URI whose scheme and authority determine the FileSystem implementation. The uri's scheme determines the config property (fs.SCHEME.impl) naming the FileSystem implementation class. The uri's authority is used to determine the host, port, etc. for a filesystem.</description>
</property>
The link to the pastebin for core-site.xml is this:
http://pastebin.com/D184DuGB
Modifications to the file /home/bhishan/hadoop-2.7.2/etc/hadoop/mapred-site.xml are given below:
<property>
<name>mapred.job.tracker</name>
<value>localhost:54311</value>
<description>The host and port that the MapReduce job tracker runs at. If "local", then jobs are run in-process as a single map and reduce task.</description>
</property>
The pastebin link is:
http://pastebin.com/nVxs8nMm
When I type hostname in the terminal, it prints BP.
cat /etc/hosts
127.0.0.1 localhost BP
127.0.1.1 localhost
# The following lines are desirable for IPv6 capable hosts
::1 ip6-localhost ip6-loopback
fe00::0 ip6-localnet
ff00::0 ip6-mcastprefix
ff02::1 ip6-allnodes
ff02::2 ip6-allrouters
I have also disabled IPv6:
cat /etc/sysctl.conf
net.ipv6.conf.all.disable_ipv6=1
net.ipv6.conf.default.disable_ipv6=1
net.ipv6.conf.lo.disable_ipv6=1
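(Side note: entries in /etc/sysctl.conf are not applied retroactively; to load them and verify the result without a reboot, one would typically run:)
sudo sysctl -p
cat /proc/sys/net/ipv6/conf/all/disable_ipv6    # prints 1 when IPv6 is disabled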
Hadoop details:
hadoop version
Hadoop 2.7.2
which hadoop
hadoop is /home/bhishan/hadoop-2.7.2/bin/hadoop
which hdfs
hdfs is /home/bhishan/hadoop-2.7.2/bin/hdfs
Restarting Hadoop:
cd /home/bhishan/hadoop-2.7.2/sbin
stop-dfs.sh
stop-yarn.sh
cd /home/bhishan/hadoop-2.7.2/tmp && rm -Rf *
hadoop namenode -format
start-dfs.sh
start-yarn.sh
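(Side note: in Hadoop 2.x the `hadoop namenode -format` form is deprecated; the current equivalent is:)
hdfs namenode -format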
Now the error appears:
hdfs dfs -ls
16/09/26 23:53:14 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
ls: `.': No such file or directory
Checking jps:
jps
6688 sun.tools.jps.Jps
3909 SecondaryNameNode
3525 NameNode
4327 NodeManager
4184 ResourceManager
3662 DataNode
Running checknative:
hadoop checknative -a
16/09/27 09:28:18 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Native library checking:
hadoop: false
zlib: false
snappy: false
lz4: false
bzip2: false
openssl: false
16/09/27 09:28:18 INFO util.ExitUtil: Exiting with status 1
Then I installed missing libraries:
a) hadoop version reports Hadoop 2.7.2
b) sudo apt-get install --reinstall zlibc zlib1g zlib1g-dev
From the Synaptic manager I can see the following libraries installed:
zlib1g, zlib1g-dev , zlib1g:i386, zlibc
c) Installed snappy and python-snappy.
d) In the Synaptic manager I can see the lz4 packages:
liblz4-1, liblz4-tool, python-lz4, python3-lz4
e) bzip2 is already installed.
f) openssl is already installed.
All checknative results are still false and I cannot run hdfs dfs -ls.
I could not find the error so far. Any help will be appreciated.
Also, I am trying to run Hadoop on a single laptop with four cores. The version is 2.7.2. How is version 3.0? If I have to reinstall Hadoop from scratch, maybe I should go with Hadoop 3. Suggestions are welcome.
Related links:
Result of hdfs dfs -ls command
hdfs dfs ls not working after multiple nodes configured
hadoop fs -ls does not work
Namenode not getting started
No Namenode or Datanode or Secondary NameNode to stop
Hadoop 2.6.1 Warning: WARN util.NativeCodeLoader
Hadoop 2.2.0 Setup (Pseudo-Distributed Mode): ERROR// Warn util.NativeCodeLoader: unable to load native-hadoop library
Command "hadoop fs -ls ." does not work
And also:
hadoop fs -mkdir failed on connection exception
Hadoop cluster setup - java.net.ConnectException: Connection refused
Hadoop (local and host destination do not match) after installing hive
Help will be truly appreciated!
From this error:
hdfs dfs -ls
16/09/27 09:43:42 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
ls: `.': No such file or directory
Ignore the warning about the native libraries - the command should work fine even with that warning.
When you run hdfs dfs -ls with no path as you have done, it attempts to list the contents of your home directory in HDFS, which is /user/<your_user_name> by default. In this case, I suspect the issue is simply that your home directory does not exist.
Does it work OK if you run:
hadoop fs -ls /
And then do:
hadoop fs -mkdir -p /user/<your_user_name>
hadoop fs -ls
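If you also want to silence the NativeCodeLoader warning itself (it is cosmetic here), a commonly suggested tweak - assuming the bundled native binaries actually match your platform - is to point java.library.path at lib/native rather than lib (the ~/.bashrc above uses $HADOOP_INSTALL/lib):
export HADOOP_OPTS="-Djava.library.path=$HADOOP_INSTALL/lib/native"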

Folder Not Created with hadoop fs -mkdir

Hey, I am installing Hive on a Hadoop 2.0 multi-node cluster, and I am not able to create a folder using this command:
[hadoop@master ~]$ $HADOOP_HOME/bin/hadoop fs -mkdir /tmp
16/07/19 14:20:15 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
[hadoop@master ~]$ $HADOOP_HOME/bin/hadoop fs -mkdir -p /user/hive/warehouse
16/07/19 14:24:12 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Importantly, I am not able to find the created folder. Where it gets created, I am not sure. Please help.
JPS for Hadoop is working fine:
[hadoop@master ~]$ jps
2977 ResourceManager
2613 DataNode
3093 NodeManager
2822 SecondaryNameNode
2502 NameNode
5642 Jps
The warning you are getting after running the -mkdir command does not impact Hadoop functionality. It's just a warning; ignore it. See here for details.
About creating directories under the root, i.e. "/": it is a one-time activity and should be done by the superuser. Once you create root directories like "/tmp", "/user", etc., you can create user-specific folders like "/user/hduser" and own them using commands:
sudo -u hdfs hdfs dfs -mkdir /tmp
OR
sudo -u hdfs hdfs dfs -mkdir -p /user/hive/warehouse
Once you have the main folder ready, just own it with the user who will be using it:
sudo -u hdfs hdfs dfs -chown hduser:hadoop /user/hive/warehouse
If you want to find the files/directories created on HDFS, then you have to interact with the HDFS filesystem using CLI commands only, e.g.:
hdfs dfs -ls /
The data created on HDFS also has a physical location on your local filesystem, but you won't see that location as regular files and directories. Look for the dfs.namenode.name.dir and dfs.datanode.data.dir properties in hdfs-site.xml under your installation, usually located at "/usr/local/hadoop/etc/hadoop/hdfs-site.xml".
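To see which values a running installation actually resolves for these properties, one convenient option is the getconf helper:
hdfs getconf -confKey dfs.namenode.name.dir
hdfs getconf -confKey dfs.datanode.data.dir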

Create directory in hadoop filesystem

I'm new to Hadoop. I am trying to create a directory in HDFS but I am not able to.
I am logged in as "hduser", hence I assumed "/home/hduser" pre-exists as a Unix path. So I tried to create a Hadoop directory using the command below.
[hduser@Virus ~]$ hadoop fs -mkdir /home/hduser/mydata/
14/12/03 15:04:53 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
mkdir: `/home/hduser/mydata/': No such file or directory
After searching online, I thought it possible that Hadoop cannot understand "/home/hduser", or that, since I am using Hadoop 2, mkdir would not work recursively like the Unix command "mkdir -p". Hence I tried to create "/mydata", but had no luck.
[hduser@Virus ~]$ hadoop fs -mkdir /mydata
14/12/03 15:09:26 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
mkdir: Cannot create directory /mydata. Name node is in safe mode.
I tried to leave safe mode, but the issue still persists.
[hduser@Virus ~]$ hdfs dfsadmin -safemode leave
14/12/03 15:09:13 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Safe mode is OFF
I also tried "/user/mydata", as "/user" is the directory Hadoop takes as home.
[hduser@Virus ~]$ hadoop fs -mkdir /user/mydata
14/12/03 15:36:20 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
mkdir: Cannot create directory /user/mydata. Name node is in safe mode.
Now how do I debug further?
To leave safe mode, try the command below, since hadoop dfsadmin -safemode is deprecated in newer distributions of Hadoop:
hdfs dfsadmin -safemode leave
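You can confirm the current state before and after with:
hdfs dfsadmin -safemode get
which prints "Safe mode is ON" or "Safe mode is OFF".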
By default, a user's home directory in HDFS is '/user/hduser', not '/home/hduser'.
If you create a directory with a relative path, it is created under that home directory, e.g. 'sampleDir' becomes '/user/hduser/sampleDir'. To create a directory at an absolute path directly:
hadoop fs -mkdir /path/to/be/created
On HDFS, create nested directories in one step with:
hdfs dfs -mkdir -p /this/is/a/new/directory
Create a directory /user:
hadoop fs -mkdir /user
then one with your user name:
hadoop fs -mkdir /user/yourusername
Now try creating your directory.
List the root directory:
hadoop fs -ls /
Output:
Found 1 items
drwxr-xr-x - sony supergroup 0 2016-12-10 16:45 /usr
Create a directory:
hadoop fs -mkdir /app
Created successfully; check:
hadoop fs -ls /
Output:
Found 2 items
drwxr-xr-x - sony supergroup 0 2016-12-12 04:11 /usr
drwxr-xr-x - sony supergroup 0 2016-12-10 16:45 /app
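Once a per-user directory such as /user/yourusername exists, relative paths resolve against it. A quick sketch, assuming the user is sony:
hadoop fs -mkdir data    # creates /user/sony/data
hadoop fs -ls            # lists /user/sony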

Command "hadoop fs -ls ." does not work

I think I have installed Hadoop correctly. If I run jps I can see the namenode and datanode; no problem.
When I type hadoop fs -ls . I get the error:
Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library /opt/db/hadoop-2.4.1/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now.
It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'.
14/08/08 12:42:44 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
ls: '.': No such file or directory
When I type hadoop dfs -ls . I get the error:
DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.
Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library /opt/db/hadoop-2.4.1/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now.
It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'.
14/08/08 12:43:27 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
ls: '.': No such file or directory
And when I type hadoop hdfs -ls . I get the error:
Error: Could not find or load main class hdfs
This happens regardless of whether I use '.' or '/' or whatever directory I'm in.
What does this all mean? How can I get normal, expected output? What am I missing?
Use
hdfs dfs -ls ...
I don't think there is such a thing as hadoop hdfs.
Use the command as follows:
bin/hadoop fs -ls /
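In short, the working invocations in Hadoop 2.x are `hadoop fs` and `hdfs dfs`, and the bare `-ls .` only works once your HDFS home directory exists. A sketch of the full sequence, assuming you are the user who started HDFS (the superuser on a default single-node setup):
bin/hadoop fs -mkdir -p /user/$(whoami)
bin/hadoop fs -ls .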

copyFromLocal: `/user/hduser/gutenberg': No such file or directory

I have followed Michael Noll's guide so far but got stuck here.
hduser@ubuntu:/usr/local/hadoop$ bin/hadoop dfs -copyFromLocal /tmp/gutenberg /user/hduser/gutenberg
DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.
13/11/11 23:24:13 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
copyFromLocal: `/user/hduser/gutenberg': No such file or directory
hduser@ubuntu:/usr/local/hadoop$
I have tried reformatting the name node (answering 'Y'), but I get the same result every time for any arbitrary folder name.
Any ideas?
Solved it by using the command:
hduser@ubuntu:/usr/local/hadoop$ hdfs dfs -mkdir -p /user/hduser
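With the home directory in place, the original copy should then go through:
hdfs dfs -copyFromLocal /tmp/gutenberg /user/hduser/gutenberg
hdfs dfs -ls /user/hduser/gutenberg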
Hadoop 2.4.1
hadoop fs -mkdir -p /user/hduser/sample
For Hadoop 2.6.0 :
Usage: hadoop fs -mkdir <paths>
Example:
bin/hadoop fs -mkdir /user/hduser
Ref: http://hortonworks.com/hadoop-tutorial/using-commandline-manage-files-hdfs/
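Note that -mkdir also accepts several paths in one call, e.g.:
hadoop fs -mkdir /user/hduser/dir1 /user/hduser/dir2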
