Create directory in hadoop filesystem - shell

I'm new to Hadoop. I am trying to create a directory in HDFS but I am not able to.
I am logged in as "hduser", so I assumed "/home/hduser" already exists as a Unix path. I therefore tried to create a Hadoop directory with the command below.
[hduser@Virus ~]$ hadoop fs -mkdir /home/hduser/mydata/
14/12/03 15:04:53 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
mkdir: `/home/hduser/mydata/': No such file or directory
After searching online, I thought it might be that Hadoop does not understand "/home/hduser", or that in Hadoop 2 mkdir does not behave like the Unix command "mkdir -p" (creating parent directories). So I tried to create "/mydata" instead, but no luck.
[hduser@Virus ~]$ hadoop fs -mkdir /mydata
14/12/03 15:09:26 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
mkdir: Cannot create directory /mydata. Name node is in safe mode.
I tried to leave safe mode, but the issue persists.
[hduser@Virus ~]$ hdfs dfsadmin -safemode leave
14/12/03 15:09:13 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Safe mode is OFF
I also tried "/user/mydata", since "/user" is the directory Hadoop uses as home.
[hduser@Virus ~]$ hadoop fs -mkdir /user/mydata
14/12/03 15:36:20 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
mkdir: Cannot create directory /user/mydata. Name node is in safe mode.
How can I debug this further?

To leave safe mode, use the command below, since hadoop dfsadmin -safemode is deprecated in newer Hadoop distributions:
hdfs dfsadmin -safemode leave
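If the NameNode keeps switching safe mode back on after that, it is often still waiting for DataNode block reports, or it is low on disk space on its storage volume; you can check the current status before retrying:
hdfs dfsadmin -safemode get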
By default, a user's home directory in HDFS is '/user/hduser', not '/home/hduser'.
If you create a directory with a relative name it ends up under that home, e.g. as '/user/hduser/sampleDir'; to create it anywhere else, give the full path:
hadoop fs -mkdir /path/to/be/created
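For what it's worth, a minimal sketch of the usual first-time setup, assuming the hduser account from the question (once the home directory exists, relative paths resolve underneath it):
hdfs dfs -mkdir -p /user/hduser     # create the HDFS home directory once
hdfs dfs -mkdir mydata              # relative path, ends up as /user/hduser/mydata
hdfs dfs -ls /user/hduser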

On HDFS,
hdfs dfs -mkdir -p /this/is/a/new/directory
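To confirm the whole chain was created, a recursive listing of the top-level directory works (same example path as above):
hdfs dfs -ls -R /this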

Create a directory /user:
hadoop fs -mkdir /user
then one with your user name:
hadoop fs -mkdir /user/yourusername
Now try creating your directory.
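As a quick check that it worked (the directory name below is just an example):
hadoop fs -mkdir /user/yourusername/mydata
hadoop fs -ls /user/yourusername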

List the root directory:
hadoop fs -ls /
Output:
Found 1 items
drwxr-xr-x - sony supergroup 0 2016-12-10 16:45 /usr
Create a directory:
hadoop fs -mkdir /app
It is created successfully; check again:
hadoop fs -ls /
Output:
Found 2 items
drwxr-xr-x - sony supergroup 0 2016-12-10 16:45 /usr
drwxr-xr-x - sony supergroup 0 2016-12-12 04:11 /app

Related

Folder not created with $HADOOP_HOME/bin/hadoop fs -mkdir /user/hive/warehouse

Hey, I am installing Hive on a Hadoop 2.7.3 single-node cluster, and I am not able to create a folder using
$HADOOP_HOME/bin/hadoop fs -mkdir /user/hive/warehouse
16/11/11 14:43:25 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
mkdir: `/user/hive/warehouse': No such file or directory
jps shows the Hadoop daemons running fine:
jps
15411 NodeManager
15285 ResourceManager
15718 Jps
14904 DataNode
14793 NameNode
15116 SecondaryNameNode
You can pass -p, which creates the parent path:
$HADOOP_HOME/bin/hadoop fs -mkdir -p /user/hive/warehouse
Otherwise you have to create the levels one by one: to create the /user/hive/warehouse folder you first need /user, then /user/hive inside it, and only then can you create /user/hive/warehouse. You cannot create it directly in a single step without -p.
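Concretely, the level-by-level version would look like this (each step assumes the previous level does not already exist; if it does, that particular mkdir simply reports an error and can be skipped):
$HADOOP_HOME/bin/hadoop fs -mkdir /user
$HADOOP_HOME/bin/hadoop fs -mkdir /user/hive
$HADOOP_HOME/bin/hadoop fs -mkdir /user/hive/warehouse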

Folder Not Created with hadoop fs -mkdir

Hey, I am installing Hive on a Hadoop 2.0 multi-node cluster, and I am not able to create a folder using this command:
[hadoop@master ~]$ $HADOOP_HOME/bin/hadoop fs -mkdir /tmp
16/07/19 14:20:15 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
[hadoop@master ~]$ $HADOOP_HOME/bin/hadoop fs -mkdir -p /user/hive/warehouse
16/07/19 14:24:12 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Importantly, I am not able to find the created folder. Where does it get created? I am not sure. Please help.
jps shows the Hadoop daemons running fine:
[hadoop@master ~]$ jps
2977 ResourceManager
2613 DataNode
3093 NodeManager
2822 SecondaryNameNode
2502 NameNode
5642 Jps
The warning you get after running the -mkdir command does not affect Hadoop functionality; it's just a warning, so you can ignore it.
About creating directories under the root, i.e. "/": this is a one-time activity and should be done by the superuser. Once you have created the root-level directories like "/tmp" and "/user", you can create user-specific folders like "/user/hduser" and set their ownership, using commands such as:
sudo -u hdfs hdfs dfs -mkdir /tmp
OR
sudo -u hdfs hdfs dfs -mkdir -p /user/hive/warehouse
Once the main folder is ready, just change its owner to the user who will be using it:
sudo -u hdfs hdfs dfs -chown hduser:hadoop /user/hive/warehouse
If you want to find the files/directories created on HDFS, you have to interact with the HDFS filesystem through CLI commands,
e.g. hdfs dfs -ls /
The data stored in HDFS also has a physical location on your local filesystem, but you will not see it there as the same files and directories. Look for the dfs.namenode.name.dir and dfs.datanode.data.dir properties in 'hdfs-site.xml' under your installation, usually located at "/usr/local/hadoop/etc/hadoop/hdfs-site.xml".
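If you just want to see the effective values of those properties without opening the XML, hdfs getconf can print them (assuming the hdfs client is on your PATH):
hdfs getconf -confKey dfs.namenode.name.dir
hdfs getconf -confKey dfs.datanode.data.dir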

Command "hadoop fs -ls ." does not work

I think I have installed hadoop correctly. If I do jps I can see the namenode and datanode, no problem.
When I type hadoop fs -ls . I get the error:
Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library /opt/db/hadoop-2.4.1/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now.
It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'.
14/08/08 12:42:44 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
ls: '.': No such file or directory
When I type hadoop dfs -ls . I get the error:
DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.
Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library /opt/db/hadoop-2.4.1/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now.
It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'.
14/08/08 12:43:27 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
ls: '.': No such file or directory
And when I type hadoop hdfs -ls . I get the error:
Error: Could not find or load main class hdfs
This is regardless of whether I put '.' or '/' or whatever directory I'm in.
What does this all mean? How can I get normal, expected output? What am I missing?
Use
hdfs dfs -ls ...
I don't think there is such a thing as hadoop hdfs.
Use the command as follows:
bin/hadoop fs -ls /
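The `.` error usually means the per-user home directory does not exist in HDFS yet. A sketch of creating it first, assuming your user is allowed to create directories under /user (the whoami substitution is just one way to pick up the user name):
hdfs dfs -mkdir -p /user/$(whoami)
hdfs dfs -ls .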

Hadoop\HDFS: "no such file or directory"

I have installed Hadoop 2.2 on a single machine using this tutorial: http://www.michael-noll.com/tutorials/running-hadoop-on-ubuntu-linux-single-node-cluster/
Some details were changed a little bit - for example, I used Java 8, /hadoop as the root dir, etc. Users, SSH and config keys are the same.
Namenode was successfully formatted:
13/12/22 05:42:31 INFO common.Storage: Storage directory /hadoop/tmp/dfs/name has been successfully formatted.
13/12/22 05:42:31 INFO namenode.FSImage: Saving image file /hadoop/tmp/dfs/name/current/fsimage.ckpt_0000000000000000000 using no compression
13/12/22 05:42:32 INFO namenode.FSImage: Image file /hadoop/tmp/dfs/name/current/fsimage.ckpt_0000000000000000000 of size 198 bytes saved in 0 seconds.
13/12/22 05:42:32 INFO namenode.NNStorageRetentionManager: Going to retain 1 images with txid >= 0
13/12/22 05:42:32 INFO util.ExitUtil: Exiting with status 0
13/12/22 05:42:32 INFO namenode.NameNode: SHUTDOWN_MSG:
However, neither 'mkdir' nor even 'ls' worked:
$ /hadoop/hadoop/bin/hadoop fs -ls
Java HotSpot(TM) 64-Bit Server VM warning: You have loaded library /hadoop/hadoop-2.2.0/lib/native/libhadoop.so.1.0.0 which might have disabled stack guard. The VM will try to fix the stack guard now.
It's highly recommended that you fix the library with 'execstack -c <libfile>', or link it with '-z noexecstack'.
13/12/22 05:39:33 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
ls: `.': No such file or directory
Thanks for any help guys.
Try
hadoop fs -ls /
Tested on hadoop 2.4
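If you also want the path-less form (plain hadoop fs -ls) to work, the per-user home directory has to exist first; a sketch, assuming the hduser account from the tutorial:
hdfs dfs -mkdir -p /user/hduser
hdfs dfs -ls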
In Hadoop 2.4
hdfs dfs -mkdir /input
hdfs dfs -ls /
This worked in my case:
First, get the Hadoop install path:
echo ${HADOOP_INSTALL} // in my case the output is: `/user/local/hadoop`
Then create the directory under your Hadoop install path (if you already know your Hadoop install directory, ignore the command above):
hadoop fs -mkdir -p /user/local/hadoop/your_directory
Here 'hadoop' is a directory.
Tested on Hadoop 2.4
I have verified this worked in Hadoop 2.5
hdfs dfs -mkdir /input
(where /input is the HDFS directory)

copyFromLocal: `/user/hduser/gutenberg': No such file or directory

I have followed the michael-noll guide so far but got stuck here.
hduser@ubuntu:/usr/local/hadoop$ bin/hadoop dfs -copyFromLocal /tmp/gutenberg /user/hduser/gutenberg
DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.
13/11/11 23:24:13 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
copyFromLocal: `/user/hduser/gutenberg': No such file or directory
hduser@ubuntu:/usr/local/hadoop$
I have tried reformatting the namenode (answering 'Y'), but I get the same result every time, for any folder name.
Any ideas?
Solved it by using this command:
hduser@ubuntu:/usr/local/hadoop$ hdfs dfs -mkdir -p /user/hduser
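Once the home directory exists, the original copy should go through; retrying with the same paths as in the question:
hdfs dfs -copyFromLocal /tmp/gutenberg /user/hduser/gutenberg
hdfs dfs -ls /user/hduser/gutenberg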
Hadoop 2.4.1
hadoop fs -mkdir -p /user/hduser/sample
For Hadoop 2.6.0:
Usage: hadoop fs -mkdir [-p] <paths>
Example:
bin/hadoop fs -mkdir /user/hduser
Ref: http://hortonworks.com/hadoop-tutorial/using-commandline-manage-files-hdfs/
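Note that -mkdir accepts several paths in one call, so related directories can be created together (the names below are only examples):
hadoop fs -mkdir -p /user/hduser/input /user/hduser/output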
