hadoop 2.7.2 HDFS: no such file or directory - hadoop

I have this:
I had also tried to edit this:
export HADOOP_OPTS="-Djava.library.path=$HADOOP_INSTALL/lib"
as
export HADOOP_OPTS="$HADOOP_OPTS -Djava.library.path=$HADOOP_INSTALL/lib"
in ~/.bashrc.
But I still get a warning message and cannot solve the problem.
Unable to create the directory
I'm using this command to create the directory for twitter analysis:
hadoop fs -mkdir hdfs://localhost:54310/home/vipal/hadoop_store/hdfs/namenode/twitter_data

Notice how hadoop fs -ls says .: No such file or directory?
First, you must create your home directory, which lives under /user in HDFS.
hdfs dfs -mkdir -p /user/$(whoami)
(You should also chown and chmod that directory)
Then, you can place files into a twitter_data directory.
hdfs dfs -mkdir twitter_data
hdfs dfs -put <local_files> twitter_data
(I removed hadoop_store/hdfs/namenode because that doesn't make sense here: it is the namenode's local storage directory on disk, not a path inside HDFS.)
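Putting the steps above together, the whole sequence might look like the sketch below. The username vipal comes from the question; the chown/chmod values are one reasonable choice, not the only one, and the first two commands typically need to be run as the HDFS superuser.

```shell
# As the HDFS superuser (often 'hdfs'), create and hand over the home directory:
hdfs dfs -mkdir -p /user/vipal
hdfs dfs -chown vipal:vipal /user/vipal
hdfs dfs -chmod 755 /user/vipal

# Now, as vipal, relative paths resolve under /user/vipal:
hdfs dfs -mkdir twitter_data            # creates /user/vipal/twitter_data
hdfs dfs -put localfile.txt twitter_data
hdfs dfs -ls twitter_data
```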

Related

Copying files into HDFS Hadoop

I am currently working on a project for one of my lectures at the university. The task is to download a book from https://www.gutenberg.org/ and copy it into HDFS. I've tried using put <localSrc> <dest> but it didn't work at all.
This is how my code looks in Terminal at the moment:
[cloudera@quickstart ~]$ put <pg16328.txt> <documents>
bash: syntax error near unexpected token `<'
Any help is appreciated. Thanks in advance.
UPDATE 30.05.2017: I have used the following link https://www.cloudera.com/downloads/quickstart_vms/5-10.html to install Hadoop and did not configure anything at all. The only thing I did was complete the Getting Started tutorial.
It should just be:
hdfs dfs -copyFromLocal pg16328.txt /HDFS/path
I'm not familiar with a bare put command; the <>s in the documentation are placeholders, so have you tried the command without them?
If you have successfully extracted and configured Hadoop, change into the Hadoop home directory (the location where you extracted and configured Hadoop), then apply the following command:
bin/hadoop dfs -put <local file location> <hdfs file location>
or
bin/hdfs dfs -put <local file location> <hdfs file location>
You can do the same with -copyFromLocal command too. Just replace -put with -copyFromLocal in above commands.
For example, let's say you have pg16328.txt in your Desktop directory; then the above command would be
bin/hadoop dfs -put /home/cloudera/Desktop/pg16328.txt /user/hadoop/
where /user/hadoop is a directory in hdfs
If the /user/hadoop directory doesn't exist, you can create it with
bin/hadoop dfs -mkdir -p /user/hadoop
You can look at the uploaded file using the web UI (namenodeIP:50070) or from the command line:
bin/hadoop dfs -ls /user/hadoop/
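Putting the answer's steps together, a full upload-and-verify flow might look like this sketch. $HADOOP_HOME standing for the extraction directory is an assumption, as is the ~/Desktop location of the file.

```shell
cd "$HADOOP_HOME"                                   # where Hadoop was extracted (assumed)
bin/hdfs dfs -mkdir -p /user/hadoop                 # -p also creates missing parents
bin/hdfs dfs -copyFromLocal ~/Desktop/pg16328.txt /user/hadoop/
bin/hdfs dfs -ls /user/hadoop/                      # confirm the upload
bin/hdfs dfs -cat /user/hadoop/pg16328.txt | head   # peek at the first lines
```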

Bash unable to create directory

In docker, I want to copy a file README.md from an existing directory /opt/ibm/labfiles to a new one /input/tmp. I try this
hdfs dfs -put /opt/ibm/labfiles/README.md input/tmp
to no effect, because there seems to be no /input folder in the root. So I try to create it:
hdfs dfs -mkdir /input
mkdir: `/input': File exists
However, when I ls, there is no input file or directory
How can I create a folder and copy the file? Thank you!!
Try hdfs dfs -ls / if you want to see whether an input folder exists at the root of HDFS.
You cannot cd into an HDFS directory, so a plain ls only shows your local files.
It's also worth mentioning that the leading slash is important. In other words,
This will try to put the file in HDFS at /user/<name>/input/tmp
hdfs dfs -put /opt/ibm/labfiles/README.md input/tmp
While this puts the file at the root of HDFS
hdfs dfs -put /opt/ibm/labfiles/README.md /input/tmp
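The two cases can be sketched side by side; the trailing slash on the destination and the -p flag are conventions here, not requirements from the question.

```shell
# Absolute destination: lands under the HDFS root
hdfs dfs -mkdir -p /input/tmp
hdfs dfs -put /opt/ibm/labfiles/README.md /input/tmp/
hdfs dfs -ls /input/tmp                  # should list README.md

# Relative destination: resolves under the user's HDFS home
hdfs dfs -put /opt/ibm/labfiles/README.md input/tmp
# -> lands in /user/<current user>/input/tmp (if that directory exists)
```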

How to create folder in hadoop in year/date/time

I have a problem: how do I create a folder in Hadoop whose path is built from the year, date, and time?
For example, I want the folder path:
/user/hdfs/2015/10/10/0000
I tried:
hadoop fs -mkdir /user/hdfs/2015/10/10/0000
but I get the error
No such file or directory.
How do I create a path like /user/hdfs/2015/10/10/0000 with hadoop fs -mkdir?
Thanks.
Maybe run:
hadoop fs -mkdir -p /user/hdfs/2015/10/10/0000
The -p option will create all the directories in the path as needed. See https://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/FileSystemShell.html#mkdir for more information.
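To build the year/month/day/time part automatically, the standard date command can generate the path. The /user/hdfs prefix comes from the question; the %H%M time format is an assumption about what the "0000" component means.

```shell
# Build a /YYYY/MM/DD/HHMM suffix from the current time (hypothetical format)
DIR="/user/hdfs/$(date +%Y/%m/%d/%H%M)"
echo "$DIR"    # e.g. /user/hdfs/2015/10/10/0000

# Then create it in HDFS (requires a running cluster):
# hadoop fs -mkdir -p "$DIR"
```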

hadoop fs -get not working in ubuntu

I have created a single-node Hadoop cluster in Ubuntu.
I was trying to copy a file from HDFS to the local filesystem, but when I issued the command
hduser@ubuntu:/usr/local/hadoop/bin$ hadoop fs -get /user/hduser/Employee /home/output/
I got a message
get: No such file or directory
How to fix this?
The general format of the hadoop shell get command is shown below:
hadoop fs -get <HDFS File> <local File Directory>
You have used hduser@ubuntu:/usr/local/hadoop/bin$ hadoop fs -get /user/hduser/Employee /home/output/, but here /user/hduser/Employee is a directory, not a file.
You should run:
hduser@ubuntu:/usr/local/hadoop/bin$ hadoop fs -get /user/hduser/Employee[/FILENAME] /home/output/
If instead you want to copy a directory (i.e. a folder), you can use -copyToLocal:
hduser@ubuntu:/usr/local/hadoop/bin$ hadoop dfs -copyToLocal /user/hduser/Employee /home/output/
You can find Hadoop Shell Commands here.
You need to make sure that /user/hduser is a directory and not a file. I once had this problem; hadoop fs -ls showed
-rwxr-xr-x
whereas a directory would show
drwxr-xr-x
If this is the problem, remove it using -rm -r /user/hduser and make it again with -mkdir.
Other options: try -copyToLocal, or download the file from the HDFS web portal, i.e. namenode_IP:50070.
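The file-versus-directory check can also be scripted with the FS shell's -test option, which sets the exit code instead of printing anything. This is a sketch, reusing the paths from the question:

```shell
# -test -d returns 0 if the path exists and is a directory
if hadoop fs -test -d /user/hduser/Employee; then
    # a directory: copy it recursively
    hadoop fs -copyToLocal /user/hduser/Employee /home/output/
else
    # a plain file: -get works directly
    hadoop fs -get /user/hduser/Employee /home/output/
fi
```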

hadoop dfs -ls complains

Can anyone let me know what seems to be wrong here? The hadoop dfs command itself seems to be OK, but any options that follow are not recognized.
[hadoop-0.20]$bin/hadoop dfs -ls ~/wordcount/input/
ls: Cannot access /home/cloudera/wordcount/input/ : No such file or directory
hadoop fs -ls /some/path/here will list an HDFS location, not your local Linux location. Here ~/wordcount/input/ expands to a local path (/home/cloudera/wordcount/input/) that does not exist in HDFS, hence the error.
try first this command
hadoop fs -ls /
then investigate step by step other folders.
If you want to copy some files from a local directory to the users directory on HDFS, then just use this:
hadoop fs -mkdir /users
hadoop fs -put /some/local/file /users
for more hdfs commands see this: http://hadoop.apache.org/common/docs/r0.20.0/hdfs_shell.html
fs relates to a generic filesystem client that can point to any supported filesystem (local, HDFS, S3, etc.), whereas dfs is specific to HDFS. So hadoop fs can perform operations from/to the local filesystem or the Hadoop distributed filesystem, while specifying dfs always relates to HDFS.
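The distinction shows up when you pass an explicit URI scheme; the port 9000 below is a common fs.defaultFS setting, not a universal default:

```shell
hadoop fs -ls file:///tmp             # local filesystem, via the generic fs shell
hadoop fs -ls hdfs://localhost:9000/  # HDFS through the same shell, explicit scheme
hdfs dfs -ls /                        # HDFS only; uses the default fs from core-site.xml
```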
