I am executing the following command:
hadoop fs -copyFromLocal /tmp/temp/pattern_BS.conf hdfs://wihadoopn301p.prod.ch3.s.com:/user/hdfs/hadoop/qa2/BS/
With this command I am trying to copy pattern_BS.conf from the local /tmp/temp folder into the hdfs://wihadoopn301p.prod.ch3.s.com:/user/hdfs/hadoop/qa2/BS/ location.
But it gives the following error:
copyFromLocal: For input string: ""
Usage: java FsShell [-copyFromLocal <localsrc> ... <dst>]
Please help me solve this problem.
I think you should use the command given below, because copyFromLocal expects the first argument to be a local file and the second to be a location in HDFS:
hadoop fs -copyFromLocal /tmp/temp/pattern_BS.conf /user/hdfs/hadoop/qa2/BS/
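The "For input string: """ part of that error most likely comes from the empty port after the colon in the hdfs:// URI (the host is followed by ":" and then directly the path). If you do want to keep the fully qualified URI, include the port explicitly; a sketch, assuming the common NameNode RPC port 8020 (check fs.defaultFS in core-site.xml for your cluster's actual value):
# fully qualified destination with an explicit port (8020 is an assumption)
hadoop fs -copyFromLocal /tmp/temp/pattern_BS.conf hdfs://wihadoopn301p.prod.ch3.s.com:8020/user/hdfs/hadoop/qa2/BS/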
I am currently working on a project for one of my lectures at university. The task is to download a book from https://www.gutenberg.org/ and copy it into HDFS. I've tried using put <localSrc> <dest>, but it didn't work at all.
This is how my command looks in the terminal at the moment:
[cloudera@quickstart ~]$ put <pg16328.txt> <documents>
bash: syntax error near unexpected token `<'
Any help is appreciated. Thanks in advance.
UPDATE 30.05.2017: I used the following link https://www.cloudera.com/downloads/quickstart_vms/5-10.html to install Hadoop and did not configure anything at all. The only thing I did was complete the Getting Started tutorial.
It should just be:
hdfs dfs -copyFromLocal pg16328.txt /HDFS/path
I'm not familiar with the put command, but have you tried it without the <>s?
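For the full task (download a book from gutenberg.org, then copy it into HDFS), a minimal sketch, assuming the usual Project Gutenberg download URL layout and a /user/cloudera home directory (both are assumptions; adjust to your book and paths):
# fetch the book onto the local disk (URL layout is an assumed example)
wget https://www.gutenberg.org/cache/epub/16328/pg16328.txt
# create a target directory in HDFS and copy the file into it
hdfs dfs -mkdir -p /user/cloudera/documents
hdfs dfs -put pg16328.txt /user/cloudera/documents/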
If you have successfully extracted and configured Hadoop, then you should be in the hadoop-home directory (the location where you extracted and configured Hadoop). Then apply one of the following commands:
bin/hadoop dfs -put <local file location> <hdfs file location>
or
bin/hdfs dfs -put <local file location> <hdfs file location>
You can do the same with the -copyFromLocal command too; just replace -put with -copyFromLocal in the commands above.
For example:
Let's say you have pg16328.txt in your Desktop directory; then the above command would be
bin/hadoop dfs -put /home/cloudera/Desktop/pg16328.txt /user/hadoop/
where /user/hadoop is a directory in HDFS.
If the /user/hadoop directory doesn't exist, you can create it with
bin/hadoop dfs -mkdir -p /user/hadoop
You can look at the uploaded file using the NameNode web UI (namenodeIP:50070) or from the command line:
bin/hadoop dfs -ls /user/hadoop/
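You can also sanity-check the contents from the command line, for example:
# print the beginning of the uploaded file to confirm the copy worked
bin/hadoop dfs -cat /user/hadoop/pg16328.txt | head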
I'm a beginner in Hadoop. When I use
hadoop fs -ls /
and
hadoop fs -mkdir /pathname
everything is OK. But I want to use my CSV file in Hadoop; my file is on the C: drive. I used the -put, wget, and copyFromLocal commands like these:
hadoop fs -put c:/ path / myhadoopdir
hadoop fs copyFromLoacl c:/...
wget ftp://c:/...
The first two of the above error with no such file or directory /myfilepathinc:
and the third with:
Unable to resolve host address "c"
Thanks for your help
Looking at your commands, it seems there could be a couple of reasons for this issue.
hadoop fs -put c:/ path / myhadoopdir
hadoop fs copyFromLoacl c:/...
Use hadoop fs -copyFromLocal correctly (note the spelling and the leading dash).
Check your local file permissions; you need read access to that file.
Give absolute paths for both the local and the HDFS locations, as sketched below.
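A minimal sketch putting those points together, assuming a Cygwin-style shell on Windows (all paths here are placeholders; adjust them to your machine and cluster):
# check that the local file exists and is readable
ls -l C:/full/path/to/myfile.csv
# absolute local path, absolute HDFS path, correctly spelled -copyFromLocal
hadoop fs -copyFromLocal C:/full/path/to/myfile.csv /user/myuser/myfile.csv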
Hope it will work for you.
salmanbw's answer is correct. To make it more concrete:
Suppose your file is "c:\testfile.txt"; then use the command below.
Also make sure you have write permission on your target directory in HDFS.
hadoop fs -copyFromLocal c:\testfile.txt /HDFSdir/testfile.txt
I run Wordcount in Eclipse and my text file exists in HDFS.
Eclipse shows me this error:
Input path does not exist: file:/home/hduser/workspace/sample1user/hduser/test1
Your error shows that the wordcount job is searching for the file in the local filesystem and not in HDFS. Try copying the input file to the local file system.
Post the results of the following commands in your question:
hdfs dfs -ls /home/hduser/workspace/sample1user/hduser/test1
hdfs dfs -ls /home/hduser/workspace/sample1user/hduser
ls -l /home/hduser/workspace/sample1user/hduser/test1
ls -l /home/hduser/workspace/sample1user/hduser
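If those listings show the file only in HDFS, you can pull it down to the local path the job is actually reading from; a sketch, assuming the file sits at /user/hduser/test1 in HDFS (adjust to wherever your -ls output finds it):
# copy the input out of HDFS into the local path the job expects
hdfs dfs -get /user/hduser/test1 /home/hduser/workspace/sample1user/hduser/test1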
I too ran into a similar issue (I am a beginner too). I gave the full HDFS paths via Arguments for the Wordcount program as below, and it worked (I was running in pseudo-distributed mode):
hdfs://krish@localhost:9000/user/Perumal/Input hdfs://krish@localhost:9000/user/Perumal/Output
hdfs://krish@localhost:9000 is my HDFS location, and my Hadoop daemons were running during the testing.
Note: this may not be best practice, but it helped me get started!
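For reference, the command-line equivalent of passing those arguments would look roughly like this; the jar and driver class names are placeholders, not taken from the original setup:
# run the job with explicit hdfs:// URIs for input and output
hadoop jar wordcount.jar WordCount hdfs://localhost:9000/user/Perumal/Input hdfs://localhost:9000/user/Perumal/Output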
I tried to copy a local file from the hard disk directly to a Hadoop directory and got the following results. None of them works. Can anyone help me with the right syntax?
$ hadoop fs -copyFromLocal C:\\\temp\\\sample_file.txt /user/user_name/sample_file.txt
copyFromLocal: unexpected URISyntaxException
$ hadoop fs -copyFromLocal C://temp//sample_file.txt /user/user_name /sample_file.txt
copyFromLocal: `//sample_file.txt': No such file or directory
$ hadoop fs -copyFromLocal C:\temp\sample_file.txt /user/user_name/sample_file.txt
-copyFromLocal: Can not create a Path from a null string Usage: hadoop fs [generic options] -copyFromLocal [-f] [-p] ...
$ hadoop fs -copyFromLocal C:/temp/sample_file.txt /user/user_name/sample_file.txt
copyFromLocal: `/temp/sample_file.txt': No such file or directory
One problem I've encountered when doing fs -copyFromLocal /localpath/localfile /hdfspath/hdfsfile is that the target /hdfspath doesn't exist.
I usually create the entire target hdfspath first (fs -mkdir /hdfspath), then issue the -copyFromLocal command, like this:
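A sketch with the placeholder paths from above (on recent releases -mkdir takes -p to create parent directories; drop it on very old ones):
# create the destination directory first, then copy
hadoop fs -mkdir -p /hdfspath
hadoop fs -copyFromLocal /localpath/localfile /hdfspath/hdfsfile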
Beyond that, take a look at this bug:
https://issues.apache.org/jira/browse/HADOOP-10030, which seems to explain your first error.
Perhaps your version of Hadoop/HDFS doesn't have this fix.
Also, according to this blog (http://www.hindoogle.com/blog/2009/12/cygwin-vs-hadoop-dfs-copy/), your first syntax seems to work for them:
$ bin/hadoop dfs -copyFromLocal C:\\cygwin\\home\\Rajat\\java\\mrh\\input input
I am trying to copy a set of CSV files from my hard drive into the Hadoop filesystem, but I am receiving a syntax error when I execute the following command:
# hadoop fs -put 'C:\myfolder\myfile.csv' /user/root/
put: unexpected URISyntaxException
Is this not the correct syntax?
Try:
hadoop fs -copyFromLocal C:\\myfolder\\myfile.csv /user/root/
or
hadoop fs -copyFromLocal C:/myfolder/myfile.csv /user/root/
Note: only a full (absolute) path will work.
I never used put for that...
Try:
hadoop fs -copyFromLocal C:\myfolder\myfile.csv myfile.csv
In a Linux environment, both syntaxes should work:
bin/hadoop fs -put /home/username/Documents/test.csv /usr/test1.csv
bin/hadoop fs -copyFromLocal /home/username/Documents/test.csv /usr/test1.csv
The difference between put and copyFromLocal is that with copyFromLocal the source is restricted to a local file reference.
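One practical consequence, as noted in the Hadoop shell documentation, is that put can also read from stdin, which copyFromLocal cannot; for example (the destination path is a placeholder):
# "-" tells put to read standard input and write it to the HDFS file
echo "hello hdfs" | hadoop fs -put - /user/test/stdin_example.txt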
Refer to this URL for the shell command documentation:
http://hadoop.apache.org/docs/r0.18.3/hdfs_shell.html#copyFromLocal