Copying files into HDFS Hadoop - bash

I am currently working on a project for one of my university lectures. The task is to download a book from https://www.gutenberg.org/ and copy it into HDFS. I've tried using put <localSrc> <dest>, but it didn't work at all.
This is how my code looks in Terminal at the moment:
[cloudera@quickstart ~]$ put <pg16328.txt> <documents>
bash: syntax error near unexpected token `<'
Any help is appreciated. Thanks in advance.
UPDATE 30.05.2017: I used the following link https://www.cloudera.com/downloads/quickstart_vms/5-10.html to install Hadoop and did not configure anything at all. The only thing I did was complete the Getting Started tutorial.

It should just be:
hdfs dfs -copyFromLocal pg16328.txt /HDFS/path
put is not a standalone shell command; it has to be run as a subcommand of hdfs dfs (or hadoop fs). Also drop the <>s: they are just placeholders in the documentation, and bash treats a literal < as input redirection, which is what produces the syntax error.
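For completeness, a minimal sketch of the whole sequence on the Cloudera quickstart VM, assuming the book is already downloaded as pg16328.txt in the cloudera user's home directory and that documents is the intended HDFS target directory (both names taken from the question; adjust as needed):
# create the target directory in HDFS (-p also creates missing parent directories)
hdfs dfs -mkdir -p /user/cloudera/documents
# copy the local file into HDFS
hdfs dfs -put pg16328.txt /user/cloudera/documents/
# check that the file arrived
hdfs dfs -ls /user/cloudera/documents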

If you have successfully extracted and configured Hadoop, then you should be in the Hadoop home directory (the location where you extracted and configured Hadoop).
Then run one of the following commands:
bin/hadoop dfs -put <local file location> <hdfs file location>
or
bin/hdfs dfs -put <local file location> <hdfs file location>
You can do the same with the -copyFromLocal command too; just replace -put with -copyFromLocal in the commands above.
For example:
Let's say you have pg16328.txt in your Desktop directory; then the above command would be
bin/hadoop dfs -put /home/cloudera/Desktop/pg16328.txt /user/hadoop/
where /user/hadoop is a directory in HDFS.
If the /user/hadoop directory doesn't exist, you can create it with
bin/hadoop dfs -mkdir -p /user/hadoop
You can look at the uploaded file using the web UI (namenodeIP:50070) or from the command line:
bin/hadoop dfs -ls /user/hadoop/
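Note that the hadoop dfs form is deprecated in current releases in favour of hdfs dfs (it prints a deprecation warning but still works). A sketch of the same upload with the newer syntax, reusing the example paths from above:
hdfs dfs -mkdir -p /user/hadoop
hdfs dfs -put /home/cloudera/Desktop/pg16328.txt /user/hadoop/
hdfs dfs -ls /user/hadoop/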

Related

hadoop 2.7.2 HDFS: no such file or directory

I have this:
I had also tried to edit this:
export HADOOP_OPTS="-Djava.library.path=$HADOOP_INSTALL/lib
as
export HADOOP_OPTS="$HADOOP_OPTS-Djava.library.path=$HADOOP_INSTALL/lib
in ~/.bashrc
But still I am getting a warning message and I'm not able to solve the problem.
Unable to create the directory
I'm using this code to create the directory for twitter analysis:
hadoop fs -mkdir hdfs://localhost:54310/home/vipal/hadoop_store/hdfs/namenode/twitter_data
Notice how hadoop fs -ls says .: No such file or directory?
First, you must create your home directory, which lives under /user in HDFS.
hdfs dfs -mkdir -p /user/$(whoami)
(You should also chown and chmod that directory)
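A sketch of that step, assuming an installation where the HDFS superuser is called hdfs and the login user is vipal (the name suggested by the paths in the question); adjust both to your setup:
# run as the HDFS superuser so ownership can be changed
sudo -u hdfs hdfs dfs -mkdir -p /user/vipal
sudo -u hdfs hdfs dfs -chown vipal:vipal /user/vipal
sudo -u hdfs hdfs dfs -chmod 755 /user/vipal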
Then, you can place files into a twitter_data directory.
hdfs dfs -mkdir twitter_data
hdfs dfs -put <local_files> twitter_data
(I removed hadoop_store/hdfs/namenode because that is the NameNode's local storage path on disk, not a path inside HDFS, so it doesn't make sense here.)

No such file or directory in copying file to hadoop

I'm a beginner in Hadoop. When I use
Hadoop fs -ls /
And
Hadoop fs - mkdir /pathname
everything is OK. But I want to use my CSV file in Hadoop; the file is on my C drive, and I used the -put, wget, and copyFromLocal commands like these:
Hadoop fs -put c:/ path / myhadoopdir
Hadoop fs copyFromLoacl c:/...
Wget ftp://c:/...
The first two error with no such file or directory /myfilepathinc:
and the third gives
Unable to resolve host address "c"
Thanks for your help
Looking at your commands, it seems that there could be a couple of reasons for this issue.
Hadoop fs -put c:/ path / myhadoopdir
Hadoop fs copyFromLoacl c:/...
Spell hadoop fs -copyFromLocal correctly (the command above has copyFromLoacl).
Check your local file permissions; the file must at least be readable by the user running the command.
Give the absolute path of the location, both for the local file and for the HDFS destination.
Hope it will work for you.
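For example, a sketch under those assumptions; the local path, file name, and HDFS user below are made up for illustration, and forward slashes are usually safer than backslashes when running the Hadoop shell on Windows:
hadoop fs -mkdir -p /user/myuser/myhadoopdir
hadoop fs -copyFromLocal "C:/data/sample.csv" /user/myuser/myhadoopdir/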
salmanbw's answer is correct. To be more clear:
suppose your file is "c:\testfile.txt"; then use the command below.
Also make sure you have write permission to your target directory in HDFS.
hadoop fs -copyFromLocal c:\testfile.txt /HDFSdir/testfile.txt

hadoop fs -get not working in ubuntu

I have created a single-node Hadoop cluster in Ubuntu.
I was trying to copy a file from HDFS to the local FS, but when I issued the command
hduser@ubuntu:/usr/local/hadoop/bin$ hadoop fs -get /user/hduser/Employee /home/output/
I got a message
get: No such file or directory
How to fix this?
The general format of the hadoop shell get command is shown below:
hadoop fs -get <HDFS File> <local File Directory>
You have used hduser@ubuntu:/usr/local/hadoop/bin$ hadoop fs -get /user/hduser/Employee /home/output/; here /user/hduser/Employee is a directory, not a file.
You should instead do
hduser@ubuntu:/usr/local/hadoop/bin$ hadoop fs -get /user/hduser/Employee[/FILENAME] /home/output/
Or, if you want to copy a directory (i.e. a folder), you can use -copyToLocal:
hduser@ubuntu:/usr/local/hadoop/bin$ hadoop dfs -copyToLocal /user/hduser/Employee /home/output/
You can find Hadoop Shell Commands here.
You need to make sure that /user/hduser is a directory and not a file. I once had this problem, and hadoop fs -ls showed
-rwxr-xr-x
A directory would be drwxr-xr-x.
If this is the problem, you need to remove it with hadoop fs -rm -r /user/hduser (-rmr in older versions) and create it again with -mkdir.
Other options: try -copyToLocal, or try downloading the file from the HDFS web portal, i.e. namenode_IP:50070.
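To illustrate the difference, a sketch of what the listing might look like (the entries below are made up; only the first character of the permission string matters here):
hadoop fs -ls /user/hduser
# drwxr-xr-x   - hduser supergroup    0 2016-01-01 12:00 /user/hduser/Employee   <- leading 'd': a directory, use -copyToLocal or -get with a file name appended
# -rw-r--r--   1 hduser supergroup 1234 2016-01-01 12:00 /user/hduser/Employee   <- leading '-': a plain file, -get works directly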

Copying local files to Hadoop but couldn't succeed

I tried to copy a local file from the hard disk directly to a Hadoop directory and got the following results. None of them works. Can anyone help me with the right syntax?
$ hadoop fs -copyFromLocal C:\\\temp\\\sample_file.txt /user/user_name/sample_file.txt
copyFromLocal: unexpected URISyntaxException
$ hadoop fs -copyFromLocal C://temp//sample_file.txt /user/user_name /sample_file.txt
copyFromLocal: `//sample_file.txt': No such file or directory
$ hadoop fs -copyFromLocal C:\temp\sample_file.txt /user/user_name/sample_file.txt
-copyFromLocal: Can not create a Path from a null string Usage: hadoop fs [generic options] -copyFromLocal [-f] [-p] ...
$ hadoop fs -copyFromLocal C:/temp/sample_file.txt /user/user_name/sample_file.txt
copyFromLocal: `/temp/sample_file.txt': No such file or directory
One problem I've encountered when doing a fs -copyFromLocal /localpath/localfile /hdfspath/hdfsfile is when the target /hdfspath doesn't exist.
I usually create the entire target hdfspath first: fs -mkdir /hdfspath, then issue the -copyFromLocal command.
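In this thread that would look roughly like the sketch below, reusing the paths from the question and assuming /user/user_name is the intended HDFS home directory:
hadoop fs -mkdir -p /user/user_name
hadoop fs -copyFromLocal C:/temp/sample_file.txt /user/user_name/sample_file.txt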
Beyond that, take a look at this bug:
https://issues.apache.org/jira/browse/HADOOP-10030 which seems to explain your first error.
Perhaps your version of hadoop/hdfs doesn't have this fix.
Also according to this blog (http://www.hindoogle.com/blog/2009/12/cygwin-vs-hadoop-dfs-copy/) your first syntax seems to work for them:
$ bin/hadoop dfs -copyFromLocal C:\\cygwin\\home\\Rajat\\java\\mrh\\input input

Hadoop 2.2 Installation `.' no such file or directory

I have installed Hadoop and HDFS using this tutorial
http://codesfusion.blogspot.com/2013/10/setup-hadoop-2x-220-on-ubuntu.html
Everything is fine.
I am also able to create directories and use them using
hadoop fs -mkdir /tmp
hadoop fs -mkdir /small
I can also say
hadoop fs -ls /
However I am following a tutorial in which the trainer does
hadoop fs -mkdir temp
hadoop fs -ls
Now, on my machine, when I issue the above command it says
ls: `.': No such file or directory
In my training video the command hadoop fs -ls works perfectly. Why should I specify the "/"?
Also, I am getting this warning with all my commands:
13/12/28 20:23:52 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
But in my trainers video there is no such warning.
My configuration files are exactly as in the tutorial above, and I can also see all the management UIs at
http://abhishek-pc:8042/
http://abhishek-pc:50070/
http://abhishek-pc:8088/
So my question is: what is wrong with my configuration, and why is my system behaving differently from the training video?
Well, your problem regarding ls: `.': No such file or directory is that there is no home directory on HDFS for your current user. Try
hadoop fs -mkdir -p /user/[current login user]
Then you will be able to run hadoop fs -ls.
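To illustrate why the leading / mattered: relative HDFS paths are resolved against /user/<current user>, which does not exist yet on a fresh install. A sketch, assuming the login user is abhishek (guessed from the host name in the question):
hadoop fs -mkdir -p /user/abhishek
hadoop fs -mkdir temp     # now creates /user/abhishek/temp
hadoop fs -ls             # now lists /user/abhishek instead of failing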
As for the warning WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable, please see my answer to this question.
First:
hdfs dfs -mkdir /user
then perform
hdfs dfs -mkdir /user/hduser
Solved this. Run hadoop fs -ls as the hdfs user (not as the root user): su - hdfs.
I faced a similar problem while following the Hadoop tutorial at
http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/SingleCluster.html
When I tried the command bin/hdfs dfs -put etc/hadoop input, it said
mkdir: `input': No such file or directory
The problem was solved by adding a leading / to input; the command should be
bin/hdfs dfs -put etc/hadoop /input
This could also happen due to bad carriage-return characters. Run dos2unix on the hdfs executable (a shell script) and, if required, on all other related shell scripts as well.
First of all, when you want to put something into your HDFS for the first time, you should do these steps:
hdfs dfs -mkdir -p /user/nameuser (the name of the user)
hdfs dfs -put ~/file
After hdfs dfs -mkdir /user/[user name],
do:
hadoop fs -ls /
It works for me!
