Hadoop 2.1.0-beta wordcount example error - hadoop

I'm new to Hadoop and a bit confused. My version is 2.1.0-beta and I followed the cluster setup guide (http://hadoop.apache.org/docs/stable/cluster_setup.html).
I'm trying to run the wordcount example as in http://wiki.apache.org/hadoop/WordCount.
The command
./hadoop dfs -copyFromLocal /home/user/input/inputfile /opt/hdfsdata/
gives me :
DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.
13/09/22 20:41:06 WARN conf.Configuration: bad conf file: element not
13/09/22 20:41:06 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
13/09/22 20:41:06 WARN conf.Configuration: bad conf file: element not
13/09/22 20:41:06 WARN conf.Configuration: bad conf file: element not
copyFromLocal: `/opt/hdfsdata/': No such file or directory
/opt/hdfsdata does exist.
Thank you for any hint!

/opt/hdfsdata probably represents a path on your local FS, while the command copyFromLocal expects an HDFS path. Make sure this path exists in HDFS, or that you have permission to create it there.
If you want to use it with the local FS, use the complete path with the proper scheme:
file:///opt/hdfsdata. But why would you use an HDFS command for this? What's the problem with a normal cp?
In response to your comment:
You have copied the file into your local FS, which is file:///opt/hdfsdata/, but your job is looking for this path inside HDFS. This is why you are getting the error, and this is why dfs -ls is not showing anything. Either copy the file into HDFS or use the local path in your job.
Try this :
bin/hadoop fs -mkdir /opt/hdfsdata/
bin/hadoop fs -copyFromLocal /home/user/input/inputfile /opt/hdfsdata/
Now run your job.
Also, there is no need to use the hdfs:// scheme while running HDFS shell commands.
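For illustration, a minimal sketch contrasting the two schemes (assuming /opt/hdfsdata exists in HDFS for the first command and on the local disk for the second):
bin/hadoop fs -copyFromLocal /home/user/input/inputfile /opt/hdfsdata/            # default scheme: the target is resolved inside HDFS
bin/hadoop fs -copyFromLocal /home/user/input/inputfile file:///opt/hdfsdata/     # explicit file:// scheme: the target is the local FS (same effect as a plain cp)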

Go through the link below; it gives you the solution for your wordcount program.
http://cs.smith.edu/dftwiki/index.php/Hadoop_Tutorial_1_--_Running_WordCount#Basic_Hadoop_Admin_Commands
OR
run the command below:
hadoop dfs -ls /opt/hdfsdata
/** If this command shows that /opt/hdfsdata is a directory, then you can
easily write your file into that directory. */
If this command returns "no such file or directory", run the command below:
hadoop dfs -mkdir /opt/hdfsdata
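Putting the steps together, a minimal sketch using the paths from the question (the deprecated hadoop dfs form is kept to match the commands above):
hadoop dfs -mkdir /opt/hdfsdata                                   # create the target directory in HDFS if it does not exist
hadoop dfs -copyFromLocal /home/user/input/inputfile /opt/hdfsdata/
hadoop dfs -ls /opt/hdfsdata                                      # verify that the file is now in HDFS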

Related

Unable to create a directory on hdfs on mac os

I am getting the error message below when I try to create a directory on HDFS.
I installed all the required software (ssh, Java) and set all the environment variables.
Not really sure where I am going wrong.
Could anyone share your thoughts on this? Thanks.
Command used:
bin/hdfs dfs -mkdir /Users/ravitejavutukuri/input
Error:
18/06/30 22:56:11 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
mkdir: `/Users/ravitejavutukuri/input': No such file or directory
Currently I installed Hadoop 2.9.1 and I'm trying to experiment with pseudo-distributed-mode.
Try this command. It will create all the directories in the path.
bin/hdfs dfs -mkdir -p /Users/ravitejavutukuri/input
HDFS has no /Users directory (it does not mirror the macOS directory structure).
Did you mean /user?
The correct way to make a user directory for yourself would be
hdfs dfs -mkdir -p /user/$(whoami)/
hdfs dfs -chmod -R 750 /user/$(whoami)/
Then, to make an input directory, note that not giving an absolute path automatically uses your HDFS user folder:
hdfs dfs -mkdir input/
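As a quick sketch of how that relative path resolves (assuming /user/$(whoami) was created as above), both commands below refer to the same directory:
hdfs dfs -mkdir input/                      # relative: created under your HDFS home directory
hdfs dfs -ls /user/$(whoami)/input          # absolute: the same directory spelled out in full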

Unable to run hadoop commands

I have installed Hadoop on a single node. While executing hadoop fs -ls I'm getting the warning below:
WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
As @franklinsijo said, it is just a warning; it won't affect your activities.
Coming to your hadoop fs -ls, it says '.' is not a directory.
If you execute hadoop fs -ls, it means hadoop fs -ls /user/your_user_id. It shows files from your home directory, so it will only list something once you have put files in your home.
So try giving a path, like hadoop fs -ls / or hadoop fs -ls /user/
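A small sketch of the point above, assuming a hypothetical local file /tmp/sample.txt and an existing HDFS home directory for your user:
hadoop fs -put /tmp/sample.txt .            # '.' resolves to /user/<your_user_id>
hadoop fs -ls                               # the bare -ls now lists sample.txt from your HDFS home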
Can you try to list the directory? You can try with
hadoop fs -ls /
Or try creating one: hadoop fs -mkdir /test

hadoop copy a local file to Hadoop FS error

I want to copy a local file into the Hadoop FS. I run this command:
sara@ubuntu:/usr/lib/hadoop/hadoop-2.3.0/bin$ hadoop fs -copyFromLocal /home/sara/Downloads/CA-GrQc.txt /usr/lib/hadoop/hadoop-2.3.0/${HADOOP_HOME}/hdfs/namenode
and
sara@ubuntu:/usr/lib/hadoop/hadoop-2.3.0/bin$ hdfs dfs -copyFromLocal /home/sara/Downloads/CA-GrQc.txt /usr/lib/hadoop/hadoop-2.3.0/${HADOOP_HOME}/hdfs/namenode
and even if I run: hdfs dfs -ls
i get this error:
> WARN util.NativeCodeLoader: Unable to load native-hadoop library for
> your platform... using builtin-java classes where applicable
> copyFromLocal: `.': No such file or directory
I don't know why I get this error. Any ideas, please?
According to your input, your Hadoop installation seems to be working fine. What is wrong is that hadoop fs -copyFromLocal expects an HDFS directory as the target directory, not the local directory where Hadoop stores its blocks.
So in your case the command should look like (for example):
sara@ubuntu:/usr/lib/hadoop/hadoop-2.3.0/bin$ hdfs dfs -copyFromLocal /home/sara/Downloads/CA-GrQc.txt /sampleDir/
where sampleDir is a directory you created with the hadoop fs -mkdir command.
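For completeness, a minimal sketch of the full sequence (the directory name /sampleDir is just an example):
hdfs dfs -mkdir /sampleDir
hdfs dfs -copyFromLocal /home/sara/Downloads/CA-GrQc.txt /sampleDir/
hdfs dfs -ls /sampleDir                     # confirm the file arrived in HDFS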

hadoop fs -ls results in "no such file or directory"

I have installed and configured Hadoop 2.5.2 on a 10-node cluster. One node is acting as the master node and the other nodes as slave nodes.
I have a problem executing hadoop fs commands. The hadoop fs -ls command works fine with an HDFS URI, but gives the message "ls: `.': No such file or directory" when used without one:
ubuntu@101-master:~$ hadoop fs -ls
15/01/30 17:03:49 WARN util.NativeCodeLoader: Unable to load native-hadoop
library for your platform... using builtin-java classes where applicable
ls: `.': No such file or directory
ubuntu@101-master:~$
Whereas, executing the same command with an HDFS URI works:
ubuntu@101-master:~$ hadoop fs -ls hdfs://101-master:50000/
15/01/30 17:14:31 WARN util.NativeCodeLoader: Unable to load native-hadoop
library for your platform... using builtin-java classes where applicable
Found 3 items
drwxr-xr-x - ubuntu supergroup 0 2015-01-28 12:07 hdfs://101-master:50000/hvision-data
-rw-r--r-- 2 ubuntu supergroup 15512587 2015-01-28 11:50 hdfs://101-master:50000/testimage.seq
drwxr-xr-x - ubuntu supergroup 0 2015-01-30 17:03 hdfs://101-master:50000/wrodcount-in
ubuntu@101-master:~$
I am getting exception in MapReduce program due to this behavior. jarlib is referring to the HDFS file location, whereas, I want jarlib to refer to the jar files stored at the local file system on the Hadoop nodes.
The behaviour that you are seeing is expected. Let me explain what's going on when you are working with hadoop fs commands.
The command's syntax is this: hadoop fs -ls [path]
By default, when you don't specify [path] for the above command, hadoop expands the path to /user/[username] in HDFS, where [username] gets replaced with the Linux username of whoever is executing the command.
So, when you execute this command:
ubuntu@101-master:~$ hadoop fs -ls
the reason you are seeing the error ls: '.': No such file or directory is that hadoop is looking for the path /user/ubuntu, and it seems this path doesn't exist in HDFS.
The reason why this command:
ubuntu@101-master:~$ hadoop fs -ls hdfs://101-master:50000/
is working is that you have explicitly specified [path], and it is the root of HDFS. You can also do the same using this:
ubuntu@101-master:~$ hadoop fs -ls /
which automatically gets evaluated to the root of HDFS.
Hope this clears up the behaviour you are seeing while executing the hadoop fs -ls command.
Hence, if you want to specify a local file system path, use the file:/// URL scheme.
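To make the resolution rules concrete, a short sketch (the local path /home/ubuntu is only an example):
hadoop fs -ls                               # expands to hadoop fs -ls /user/<your username> in HDFS
hadoop fs -ls /                             # root of HDFS, same listing as the full hdfs:// URI above
hadoop fs -ls file:///home/ubuntu           # file:/// scheme bypasses HDFS and lists the local file system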
This has to do with the missing home directory for the user. Once I created the home directory under HDFS for the logged-in user, it worked like a charm.
hdfs dfs -mkdir /user
hdfs dfs -mkdir /user/{loggedin user}
hdfs dfs -ls
This method fixed my problem.
The user directory in Hadoop (in HDFS) is
/user/<your operating system user>
If you get this error message, it may be because you have not yet created your user directory within HDFS.
Use
hadoop fs -mkdir -p /user/<current operating system user>
To see what your current operating system user is, use:
id -un
hadoop fs -ls should then start working...
There are a couple of things at work here; based on "jarlib is referring to the HDFS file location", it sounds like you indeed have an HDFS path set as your fs.default.name, which is indeed the typical setup. So, when you type hadoop fs -ls, this is indeed trying to look inside HDFS, except it's looking in your current working directory, which should be something like hdfs://101-master:50000/user/ubuntu. The error message is unfortunately somewhat confusing since it doesn't tell you that . was interpreted to be that full path. If you run hadoop fs -mkdir /user/ubuntu, then hadoop fs -ls should start working.
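A minimal sketch of that fix (the -p flag is added here in case /user itself does not exist yet):
hadoop fs -mkdir -p /user/ubuntu            # create the missing HDFS home directory for the 'ubuntu' user
hadoop fs -ls                               # '.' now resolves to /user/ubuntu and the listing works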
This problem is unrelated to your "jarlib" problem; whenever you want to refer to files explicitly stored in the local filesystem, but where the path goes through Hadoop's Path resolution, you simply need to add file:/// to force Hadoop to refer to the local filesystem. For example:
hadoop fs -ls file:///tmp
Try passing your jar file paths as file:///path/to/your/jarfile and it should work.
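As a hedged sketch of what that might look like for a MapReduce job (myjob.jar, MyDriver, and the jar path are placeholders, and the driver is assumed to use ToolRunner so that the -libjars generic option is parsed):
hadoop jar myjob.jar MyDriver -libjars file:///opt/jars/extra-lib.jar /input /output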
WARN util.NativeCodeLoader: Unable to load native-hadoop library for your
platform... using builtin-java classes where applicable
This warning can be removed by adding this line to your .bashrc file:
export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=/usr/local/hadoop/lib/native"
(Here /usr/local/hadoop is the location where Hadoop is installed.)
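After adding the export, a quick way to check it took effect (this assumes the native libraries under /usr/local/hadoop/lib/native were actually built for your platform):
source ~/.bashrc                            # reload the shell configuration
hadoop fs -ls /                             # the NativeCodeLoader warning should no longer appear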

unable to load sample file into hadoop 2.2.0

I tried to install 2.2.0 in pseudo-distributed mode. While trying to run copyFromLocal to copy some sample data,
I used /input as the destination path, like: bin/hadoop fs -copyFromLocal /home/prassanna/Desktop/input /input
I think it worked now, and I verified the file using the command below:
bin/hadoop fs -ls /input
14/03/12 09:31:57 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Found 1 items
-rw-r--r-- 1 root supergroup 64
I also checked in the UI of the datanode,
but it shows the used % as '0' only, whereas it should show some KBs (64) for the file, right? Please tell me whether the input file was copied to HDFS properly,
and tell me where exactly the file is physically stored on the local machine. Please help to solve this confusion. Thanks in advance.
If your source path is missing, then you have to check for the existence of the file on your local machine.
But if the destination folder is missing, then first check for the existence of that folder on HDFS.
For that you can open the web UI of Hadoop HDFS (the NameNode UI on port 50070) and browse the file system.
Alternatively, you can check files through the command
hadoop fs -ls /<path of HDFS directory >
If this works, then put the file with the following command:
hadoop fs -put <local file path> <path of HDFS directory>
If any of these doesn't work, then your Hadoop is missing some important configuration.
If your web UI is opening but the command is not running, then try like this:
hadoop fs -ls hdfs://<hadoop Master ip>/<path of HDFS directory >
If this works, run the put command as below:
hadoop fs -put <local file path> hdfs://<hadoop Master ip>/<path of HDFS directory >
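Filling in the templates with illustrative values only (192.168.1.10 stands in for the master IP and /input for the HDFS directory):
hadoop fs -ls hdfs://192.168.1.10/input
hadoop fs -put /home/prassanna/Desktop/input hdfs://192.168.1.10/input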
