While trying to copy a file from my local disk to HDFS, I get an error even though the syntax is correct. It says "no such file or directory" even though the file physically exists.
What should I do? I have tried all three commands to transfer/copy the file.
hadoop fs -put /Users/Sneha/Desktop/bank.xlsx /user/gulafsha_parveen_simplilearn/myproject
Shows Error:
no such file for /Users/Sneha/Desktop/bank.xlsx
I think one good way to troubleshoot would be to run an ls as the same user. Something like below:
$ ls /Users/Sneha/Desktop/bank.xlsx
Hope the output will make things clear.
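If the ls succeeds, the local path is fine, and the next thing to check is that the destination directory exists on HDFS. A minimal sketch of the full sequence (the -mkdir -p step is an assumption; the target directory may already exist):
$ ls -l /Users/Sneha/Desktop/bank.xlsx
$ hadoop fs -mkdir -p /user/gulafsha_parveen_simplilearn/myproject
$ hadoop fs -put /Users/Sneha/Desktop/bank.xlsx /user/gulafsha_parveen_simplilearn/myproject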
Does the Hadoop filesystem shell support moving an empty directory?
Assume that I have a directory, /user/abc, which is empty.
hadoop fs -mv /user/abc/* /user/xyz/*
When I execute the above command, it gives me the error
'/user/abc/*' does not exist.
However, if I put some data inside /user/abc, the command executes successfully.
Does anyone know how to handle the empty-directory case?
Is there any alternative that executes the above command without giving an error?
hadoop fs -mv /user/abc/* /user/xyz
The destination path doesn't need the trailing /*.
I think you want to rename the directory. You can also use this:
hadoop fs -mv /user/abc /user/xyz
Because your xyz directory is empty, you don't get an error,
but if your xyz directory contains many files, you will get an error.
This answer should be correct, I believe.
hadoop fs -mv /user/abc /user/xyz
'*' is a wildcard, so it looks for any file inside the folder; when nothing is found, it returns the error.
As per the command's documentation: when you move a file, all links to other files remain intact, except when you move it to a different file system.
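A quick way to reproduce what the answers above describe, sketched with illustrative paths:
$ hadoop fs -mkdir /user/abc            # an empty directory
$ hadoop fs -mv /user/abc/* /user/xyz   # fails: the '*' glob matches nothing
$ hadoop fs -mv /user/abc /user/xyz     # works: moves the directory itself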
I use Windows 8 with the cloudera-quickstart-vm-5.4.2-0 VirtualBox image.
I downloaded a text file as words.txt into the Downloads folder.
I changed directory to Downloads and used hadoop fs -copyFromLocal words.txt
I get the no such file or directory error.
Can anyone explain why this is happening and how to solve this issue?
Someone told me this error occurs when Hadoop is in safe mode, but I have made sure that the safe mode is OFF.
It's happening because hdfs:///user/cloudera doesn't exist.
Running hdfs dfs -ls probably gives you a similar error.
Without a specified destination folder, it looks for ., the current HDFS directory for the UNIX account running the command.
You must run hdfs dfs -mkdir "/user/$(whoami)" before your current UNIX account can use HDFS, or you can specify an otherwise existing HDFS location to copy to.
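On the quickstart VM this usually means creating the home directory first; a sketch, assuming the HDFS superuser account is named hdfs (the Cloudera default):
$ sudo -u hdfs hdfs dfs -mkdir -p /user/cloudera
$ sudo -u hdfs hdfs dfs -chown cloudera:cloudera /user/cloudera
$ hdfs dfs -copyFromLocal words.txt     # '.' now resolves to /user/cloudera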
I tried to install Hadoop using the link below:
http://www.bogotobogo.com/Hadoop/BigData_hadoop_Install_on_ubuntu_single_node_cluster.php
I was moving the files to /usr/local/hadoop, but I got the following error.
hduser@vijaicricket-Lenovo-G50-70:~$ ~/hadoop-2.6.0$ sudo mv * /usr/local/hadoop
-bash: /home/hduser/hadoop-2.6.0$: No such file or directory
Where did you extract the Hadoop tar file? Looking at the error from the shell, it seems the /home/hduser/hadoop-2.6.0 directory doesn't exist. Also make sure the user has valid permissions.
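For what it's worth, the error message suggests the shell prompt (~/hadoop-2.6.0$) was pasted as part of the command itself. A sketch of the likely intended sequence, assuming the tarball was extracted to ~/hadoop-2.6.0 and /usr/local/hadoop already exists (the hadoop group name follows the tutorial's convention and is an assumption):
$ cd ~/hadoop-2.6.0
$ sudo mv * /usr/local/hadoop
$ sudo chown -R hduser:hadoop /usr/local/hadoop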
I'm currently working on a project that makes use of Hadoop (2.7.0). I have a two-node cluster configured and working (for the most part). I can run mapper/reducer jobs manually without any problems, but when I try to start a job with hadoopy I get an error. After debugging, I see it originates from the following command executed by hadoopy:
hadoop fs -mkdir _hadoopy_tmp
This yields the error:
mkdir: '_hadoopy_tmp': No such file or directory
When doing it manually, mkdir works fine if I start my directory name with a '/' in front of it. If I don't start with the '/', I get the same error as above. The same goes for the ls command (ls / gives me a result; ls . gives me an error that there is no such file or directory). I'm guessing that I screwed up the configuration of Hadoop somewhere; I just can't figure out where.
EDIT: to clarify: I'm aware that you should use the mkdir command with an absolute path (i.e. a / in front of it). When interacting with Hadoop through the terminal, I do this. However, the hadoopy framework seems not to (it throws the error shown above). My question is: is there a fix/workaround for this in hadoopy, or do I have to rewrite their source code?
I don't understand what 'manually' means for you, but the errors you are seeing make perfect sense to me: if you want to create a directory in the Hadoop FS, you should give the exact path. There isn't a problem there, and you didn't screw anything up. I recommend you do it this way:
$HADOOP_HOME/bin/hdfs dfs -mkdir /name_of_new_folder/
PS: I don't know anything about hadoopy; I'm just speaking from my experience with Hadoop (and some things should be handled the same way in both, which is why I'm answering here; please correct me if I'm wrong).
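If rewriting hadoopy is off the table, one possible workaround, assuming the failure is only the missing HDFS home directory that relative paths resolve against, is to create that directory once (you may need to run the mkdir as the HDFS superuser):
$ hdfs dfs -mkdir -p "/user/$(whoami)"   # the directory '.' resolves to
$ hdfs dfs -ls .                         # should now succeed
$ hadoop fs -mkdir _hadoopy_tmp          # now resolves to /user/<user>/_hadoopy_tmp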
I created a folder [LOAN_DATA] with the below command:
hadoop fs -mkdir hdfs://masterNode:8020/tmp/hadoop-hadoop/dfs/LOAN_DATA
Now, using the web UI, when I list the contents of directory /tmp/hadoop-hadoop/dfs, it shows LOAN_DATA.
But when I try to store some data from a TXT file into the LOAN_DATA directory using put or copyFromLocal, I get
put: Unknown command
Command used:
hadoop fs –put '/home/hadoop/my_work/Acquisition_2012Q1.txt' hdfs://masterNode:8020/tmp/hadoop-hadoop/dfs/LOAN_DATA
How to resolve this issue?
This issue may occur when you copy and paste a command rather than typing it. It is caused by a change in the font (or character set) of the document from which it was copied.
For example:
If you copy/paste and execute the command:
hdfs dfs -put workflow.xml /testfile/workflow.xml
You may get:
–put: Unknown command
OR
–p-t: Unknown command
This happens because the copy was done from a UTF-8 or rich-text document, and the - or u (or any other character) copied may belong to a different character set.
So just type the command on the terminal (don't copy/paste) and you should be fine.
Alternatively, if you are running a shell script that was copied from some other editor, run dos2unix on the script before running it on the Linux terminal.
Eg: dos2unix <shell_script.sh>
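One quick way to confirm a stray dash is to pipe the suspect text through od; in UTF-8 an en-dash shows up as the octal byte sequence 342 200 223. Roughly:
$ echo 'hdfs dfs –put' | od -c
0000000   h   d   f   s       d   f   s       342 200 223   p   u   t  \n
0000020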
I tried your command and, it appears, there is a typo in the command above, 'hadoop fs –put ...'.
Instead of '–put', use '-put' or '-copyFromLocal'. The problem is with '–'; the correct character is '-'. As such, the error is obvious :-)
Here is my example (using a get command instead of put):
$ hadoop fs –get /tmp/hadoop-data/output/* /tmp/hadoop-data/output/
–get: Unknown command
$ hadoop fs -get /tmp/hadoop-data/output/* /tmp/hadoop-data/output/
get: `/tmp/hadoop-data/output/part-r-00000': File exists
Anand's answer is, of course, correct. But it might not have been a typo so much as a subtle trap. When people are learning a new technology, they often copy and paste commands from websites and blogs, and what was originally entered as a hyphen can end up copied as a dash. Dashes differ from hyphens only in that they are a tad longer, so the mistake is hard to spot; but since a dash is a completely different character, the command is wrong, that is, "not found".
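In the same spirit, a quick sketch for catching the trap in a saved script (myscript.sh is a hypothetical name; the -P flag requires GNU grep) is to search for any non-ASCII byte:
$ grep -nP '[^\x00-\x7F]' myscript.sh   # flags lines containing en-dashes or other non-ASCII characters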