I am unable to put a file into HDFS. Whenever I try to execute the put command, I receive a permission denied error. I have tried giving the input file full read/write/execute permissions, but the problem persists.
This is the command I executed. I am currently logged in as hduser, the user under which Hadoop is installed:
hadoop dfs -put /home/hduser/input /
The error I receive is the following:
WARNING: Use of this script to execute dfs is deprecated.
WARNING: Attempting to execute replacement "hdfs dfs" instead.
put: /input._COPYING_ (Permission denied)
According to the documentation of the put command, you should use it as follows:
hadoop fs -put /path/to/localfile /home/hduser/input
where:
/path/to/localfile is the path on the local filesystem of the file you want to put into HDFS
/home/hduser/input is the HDFS destination folder path
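In the command from the question the destination is the HDFS root /, which hduser normally cannot write to, hence the permission denied on /input._COPYING_. A minimal sketch of one way around it, assuming an HDFS home directory /user/hduser that the HDFS superuser (often the hdfs account) may first have to create:
# Show who owns the HDFS root and what its permissions are
hadoop fs -ls -d /
# Create a home directory hduser owns (run as the HDFS superuser)
sudo -u hdfs hadoop fs -mkdir -p /user/hduser
sudo -u hdfs hadoop fs -chown hduser:hduser /user/hduser
# Then put the local file there instead of into /
hadoop fs -put /home/hduser/input /user/hduser/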
Related
I'm new to Hadoop, and am trying to check what data is available in HDFS. However, the dfs command returns a response that indicates the class is deprecated, and that hdfs should be used:
-bash-4.2$ hadoop dfs -ls
DEPRECATED: Use of this script to execute hdfs command is deprecated.
Instead use the hdfs command for it.
ls: `.': No such file or directory
When I try the hdfs command, though, I get what appears to be a Java class lookup error:
-bash-4.2$ hadoop hdfs -ls
Error: Could not find or load main class hdfs
Is there something wrong with my Hadoop setup, or have others encountered this catch-22?
It is hadoop fs or hdfs dfs, followed by -ls; there is no hadoop hdfs command, which is why you get the "Could not find or load main class hdfs" error.
You can run hdfs dfs -ls / to list the root of HDFS, but a plain -ls fails with `.': No such file or directory` because your HDFS home directory, hdfs:///user/$(whoami), does not exist yet; create it with hadoop fs -mkdir -p hdfs:///user/$(whoami).
That command must be repeated for every user account that needs to access its own HDFS user directory.
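As a sketch, an administrator could do that in one pass for several accounts (the user names besides hduser are hypothetical placeholders; the HDFS superuser is assumed to be hdfs):
# Create and hand over an HDFS home directory for each account
for u in hduser alice bob; do
  sudo -u hdfs hadoop fs -mkdir -p "/user/$u"
  sudo -u hdfs hadoop fs -chown "$u:$u" "/user/$u"
done
# After this, a plain "hdfs dfs -ls" run as one of those users resolves to /user/<name>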
I also checked the web UI, which shows the datanodes in an unhealthy state. I do not know why this happens.
This is caused either by your configuration or by an abnormal termination of the datanode (while some action was running on that node).
There is no internal problem with hdfs dfs -put; just verify what is inside your directory, or use the command
hdfs dfs -ls /
Please specify your problem; an error message alone cannot be a problem statement when we do not know what you are trying to do.
File permission issue.
Check file permissions of dfs directory:
find /path/to/dfs -group root
In general, the owning user and group for these directories should be hdfs.
Since I started the HDFS service as the root user, some dfs block files were created with root permissions.
I solved the problem after changing them back to the right ownership:
sudo chown -R hdfs:hdfs /path/to/dfs
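To confirm nothing owned by root is left behind after the chown (reusing the find check from above; /path/to/dfs stands for whatever your dfs name and data directories are):
# Both commands should print nothing once ownership is correct
find /path/to/dfs -user root
find /path/to/dfs -group root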
I have a local VM with Hortonworks Hadoop and HDFS installed on it. I SSH'ed into the VM from my machine and am now trying to copy a file from my local filesystem into HDFS with the following set of commands:
[root@sandbox ~]# sudo -u hdfs hadoop fs -mkdir /folder1/
[root@sandbox ~]# sudo -u hdfs hadoop fs -copyFromLocal /root/folder1/file1.txt /hdfs_folder1/
When I execute it I get the following error: copyFromLocal: `/root/folder1/file1.txt': No such file or directory
I can see that file right there in the /root/folder1/ directory, but the hdfs command throws the above error. I also tried to cd into /root/folder1/ and then execute the command, but the same error occurs. Why is the file not found when it is right there?
By running sudo -u hdfs hadoop fs ..., the command tries to read the local file /root/folder1/file1.txt as the hdfs user, which has no access to /root.
You can work around this as follows.
Run chmod -R 755 /root. It will change the permissions on the directory and its files recursively, but opening up permissions on root's home directory is not recommended.
Then you can run copyFromLocal with sudo -u hdfs to copy the file from the local filesystem to HDFS.
A better practice is to create an HDFS user directory for root and copy files directly as root.
sudo -u hdfs hadoop fs -mkdir /user/root
sudo -u hdfs hadoop fs -chown root:root /user/root
hadoop fs -copyFromLocal
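Put together, the sequence might look like this (a sketch based on the file from the question; copying into /user/root rather than /hdfs_folder1/ is an assumption):
# One-time setup of an HDFS home for root, run as the HDFS superuser
sudo -u hdfs hadoop fs -mkdir /user/root
sudo -u hdfs hadoop fs -chown root:root /user/root
# Then, as root, copy straight from a path only root can read
hadoop fs -copyFromLocal /root/folder1/file1.txt /user/root/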
I had the same problem running a Hortonworks 4-node cluster. As mentioned, the user "hdfs" doesn't have permission to the root home directory. The solution is to copy the files from the root folder to somewhere the "hdfs" user can access; in a standard Hortonworks installation this is /home/hdfs.
As root, run the following:
mkdir /home/hdfs/folder1
cp /root/folder1/file1.txt /home/hdfs/folder1
Now switch to the hdfs user and work from a directory that user can access:
su hdfs
cd /home/hdfs/folder1
Now you can put the file into HDFS as the hdfs user:
hdfs dfs -put file1.txt /hdfs_folder1
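To verify the upload afterwards, still as the hdfs user:
# file1.txt should now be listed in the destination from the question
hdfs dfs -ls /hdfs_folder1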
I am getting an error while copying files from the local filesystem to HDFS.
Could you please help me with this?
I am using this command:
hadoopd fs -put text.txt file
The put and copyFromLocal commands copy data from your local system to HDFS, provided you have permission to do so.
hadoop fs -put /path/to/textfile /path/to/hdfs
OR
hadoop dfs -put /path/to/textfile /path/to/hdfs
Coming to your error:
You typed the above command as
hadoopd fs
use
hadoop dfs -put /text.txt /file
hadoop dfs -put /path/to/local/file /path/to/hdfs/file
You can use following command
hadoop fs -copyFromLocal text.txt <path_to_hdfs_directory_where_you_want_to_keep_text.txt>
Without knowing the specific error you are getting, it's difficult to answer. The other responders posted the proper syntax. However, it is not uncommon to see permission issues when attempting to copy files to HDFS.
By default the user and group are typically "hdfs" and "supergroup". Your user account likely doesn't belong to "supergroup" and will get permission denied errors. Try running the command as:
sudo -u hdfs hadoop fs -put /path/to/local/file /path/to/hdfs/file
or
sudo -u hdfs hadoop dfs -put /path/to/local/file /path/to/hdfs/file
You can get around having to do this by changing the ownership and permission of the destination directory on HDFS to be more permissive.
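For example, to stop needing sudo for every put, the destination directory can be handed over to your own account (a sketch; /path/to/hdfs and youruser:yourgroup are placeholders for your own values):
# Run once as the HDFS superuser
sudo -u hdfs hadoop fs -mkdir -p /path/to/hdfs
sudo -u hdfs hadoop fs -chown youruser:yourgroup /path/to/hdfs
# or, more loosely, make it writable by everyone
sudo -u hdfs hadoop fs -chmod 777 /path/to/hdfs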
"DataStreamer Exception: org.apache.hadoop.ipc.RemoteException: java.io.IOException: File /user/hduser/myfile could only be replicated to 0 nodes, instead of 1 at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getAdditionalBlock". From this I thinrk your data node is not running/properly. Check that in cluster UI.Then try
hadoop dfs -put /path/file /hdfs/file (hadoop YARN)
hadoop fs -copyFromLocal /path/file /hdfs/file (hadoop1.x)
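One way to check the datanodes from the command line, in addition to the cluster UI (run with HDFS admin rights, e.g. via sudo -u hdfs):
# Prints cluster capacity plus the list of live and dead datanodes
sudo -u hdfs hdfs dfsadmin -report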
sudo -u hdfs hadoop fs -copyFromLocal input.csv input.csv
copyFromLocal: `input.csv': No such file or directory
Can anyone tell me the exact reason why I am getting this kind of error? I gave all permissions to the input.csv file and I even changed its owner to hdfs. I am new to Hadoop and HBase.
In this case you are trying to read the file as the hdfs user, which may not have permission to view this file. To test, do this:
sudo -u hdfs cat input.csv
If you get permission denied, you either need to change the permissions of this file so the hdfs user can read it (or if it already has read permissions, move the file to a directory that the hdfs user can read), or use a different user that has permission to access the local and the remote directories/files.
You need to make sure that user hdfs has read permission to all the parent directories of input.csv along the path.
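A quick way to walk the path and check that (a sketch; the local path to input.csv is a placeholder, substitute your own):
# The hdfs user needs read (r) on the file and execute (x) on every parent directory
p=/home/myuser/data/input.csv
ls -l "$p"
while [ "$p" != "/" ]; do
  p=$(dirname "$p")
  ls -ld "$p"
done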
Syntax: hadoop dfs -copyFromLocal <complete-local-filesystem-path> <hdfs-path>
Example: let input.csv be in the local path /usr/examples, and the HDFS path where it needs to be copied is /usr/input; the command will be
hadoop dfs -copyFromLocal /usr/examples/input.csv /usr/input/
To copy the file you can also stream it through a pipe:
cat input.csv | sudo -u hdfs hadoop fs -put - input.csv
Here -put - reads from standard input, so the local file is read by your own account and only the HDFS write happens as the hdfs user.