I've started the NameNode and DataNode, but when I try to use an HDFS command to make a directory (in any location), it doesn't work.
Here is my command:
./hdfs dfs -mkdir -p /usr/master/datas
I also tried changing the format of my path:
./hdfs dfs -mkdir -p "/usr/master/datas"
but I get the same result.
I'm just starting to learn big data. Can anyone tell me how to fix this issue and how to debug it?
/usr doesn't exist on HDFS. That's a Unix directory.
The user directory in HDFS is /user.
Also, you need to be an HDFS superuser to create folders directly under the root path, or under any folder not owned by the current user.
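A minimal sketch of the usual fix (assuming the HDFS superuser account is named hdfs, as on most distributions, and that your UNIX user is master, to match the path in the question):

sudo -u hdfs hdfs dfs -mkdir -p /user/master/datas
# hand ownership of the new home directory to your own account
sudo -u hdfs hdfs dfs -chown -R master /user/master

After that, your own account can create further subdirectories under /user/master without superuser rights.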
Related
I am trying to upload a file to HDFS with:
sudo -u hdfs hdfs dfs -put /home/hive/warehouse/sample.csv hdfs://[ip_redacted]:9000/data
I can confirm that HDFS works, as I managed to create the /data directory just fine.
Even giving the full path to the .csv file gives the same error:
put: `/home/hive/warehouse/sample.csv': No such file or directory
Why is it giving this error?
I encountered this problem, too.
The user hdfs has no permission to access one of the file's ancestor directories, which is why it gave the error No such file or directory.
As crystyxn commented, using the environment variable HADOOP_USER_NAME instead of sudo -u hdfs worked.
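For example, a sketch (HADOOP_USER_NAME only takes effect when Kerberos security is not enabled, which is typical for a test setup like this):

# read the local file as your own account, but write to HDFS as the hdfs user
HADOOP_USER_NAME=hdfs hdfs dfs -put /home/hive/warehouse/sample.csv /data

Because no sudo is involved, the local file is read with your own permissions, so the ancestor-directory problem disappears.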
Is the CSV file on your local system or in HDFS? You can use the -put command (or the -copyFromLocal command) ONLY to move a LOCAL file into the distributed file system.
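Using the paths from the question, a quick way to verify and copy (assuming /data already exists in HDFS, as stated above):

# confirm the file exists on the LOCAL filesystem first
ls -l /home/hive/warehouse/sample.csv
# then copy it into HDFS
hdfs dfs -put /home/hive/warehouse/sample.csv /data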
I have created a directory in dfs called /foodir to test, as below:
hadoop dfs -mkdir /foodir
Can someone tell me where this /foodir is saved? How can I check the path? I need to make sure it is not saved under the local file system's /tmp, because every time the server is rebooted, /tmp is deleted.
Any ideas how to check the /foodir path in the server file system?
This depends on how you set up your core-site.xml and hdfs-site.xml files.
If fs.defaultFS is not set to a file:// path (file:// is the default), then your local /tmp is not touched by the mkdir.
If your datanode and namenode data directories are not set to your local /tmp (which is where they go by default), then no block data is stored there either.
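As a sketch, the relevant properties look like this (the host, port, and /var/lib paths are illustrative values, not defaults):

In core-site.xml:

<property>
  <name>fs.defaultFS</name>
  <value>hdfs://namenode.fqdn:9000</value>
</property>

In hdfs-site.xml:

<property>
  <name>dfs.namenode.name.dir</name>
  <value>file:///var/lib/hadoop/dfs/name</value>
</property>
<property>
  <name>dfs.datanode.data.dir</name>
  <value>file:///var/lib/hadoop/dfs/data</value>
</property>

Pointing the name and data directories somewhere outside /tmp is what makes the data survive a reboot.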
You can explicitly create an HDFS path via
hdfs dfs -mkdir hdfs://namenode.fqdn:port/foodir
Otherwise, just run ls /tmp and check whether there are files there that you made.
I am able to create a directory, but not a subdirectory under an already created directory. May I know what the reason could be? I have set up HDFS on my Mac in pseudo-distributed mode and am trying to create these directories. Any help would be appreciated.
hadoop fs -mkdir /test/subdir
The above command doesn't create any subdirectory; however, the command below does create a directory.
hadoop fs -mkdir test
To recursively create subdirectories inside a parent directory, you have to provide the -p option; otherwise, you can only create one directory at a time.
hdfs dfs -mkdir -p /test/subdir
will work in your case.
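If you'd rather not use -p, the equivalent one-directory-at-a-time sequence would be:

hadoop fs -mkdir /test
hadoop fs -mkdir /test/subdir

The original command failed because the parent /test did not exist yet.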
Try giving it the parent creation flag.
hadoop fs -mkdir -p /test/subdir
I use Windows 8 with a cloudera-quickstart-vm-5.4.2-0 VirtualBox image.
I downloaded a text file as words.txt into the Downloads folder.
I changed directory to Downloads and used hadoop fs -copyFromLocal words.txt
I get the no such file or directory error.
Can anyone explain why this is happening and how to solve this issue?
Someone told me this error occurs when Hadoop is in safe mode, but I have made sure that safe mode is OFF.
It's happening because hdfs:///user/cloudera doesn't exist.
Running hdfs dfs -ls probably gives you a similar error.
Without a specified destination folder, it defaults to ., the current HDFS home directory for the UNIX account running the command.
You must run hdfs dfs -mkdir "/user/$(whoami)" before your current UNIX account can use HDFS, or you can specify an existing HDFS location to copy to.
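On the quickstart VM, the full sequence might look like this (assuming the HDFS superuser is hdfs and your login is cloudera, per the path above):

sudo -u hdfs hdfs dfs -mkdir -p /user/cloudera
sudo -u hdfs hdfs dfs -chown cloudera /user/cloudera
# the relative-path copy now resolves to /user/cloudera
hadoop fs -copyFromLocal words.txt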
I have just installed a standalone cluster on my laptop. On running the hdfs dfs -ls command in a terminal, I see a list of folders. However, when searching the local file system through the File Explorer window, I couldn't locate those files.
rishirich#localhost:/$ hdfs dfs -ls
Found 1 items
drwxr-xr-x - rishirich supergroup 0 2017-11-09 03:32 user
This folder named 'user' was nowhere to be seen on the local filesystem. Is the folder hidden?
If so, what terminal command should I use to find it?
If not, how do I locate it?
You can't see the HDFS directory structure in a graphical view; to browse it, you have to use the terminal.
hdfs dfs -ls /
and to see the local directory structure in the terminal, you should try
ls <path>
cd <path>
cd is used to change the current directory in the terminal.
When you installed Hadoop, you set up a core-site.xml file to establish the fs.defaultFS property. If you did not set this to a file:// path, it will not be the local filesystem.
If you set it to hdfs://, then the default locations for the namenode and datanode directories are in your local /tmp folder.
Note: those are HDFS blocks, not whole, readable files.
If you want to list your local filesystem, you're welcome to use hadoop fs -ls file:///
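And if you want to find where the block files actually live on disk, you can ask Hadoop for the configured value directly; hdfs getconf is part of the standard CLI:

hdfs getconf -confKey dfs.datanode.data.dir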