Hadoop file system commands not found - hadoop

I have installed Hadoop 2.6.0 on my laptop, which runs Ubuntu 14.04 LTS.
Below is the link I followed for Hadoop installation: https://github.com/ev2900/YouTube_Vedio/blob/master/Install%20Hadoop%202.6%20on%20Ubuntu%2014.04%20LTS/Install%20Hadoop%202.6%20--%20Virtual%20Box%20Ubuntu%2014.04%20LTS.txt
After installation, I ran two commands:
hadoop namenode -format - It works fine
hadoop fs -ls - It is giving the following error
15/11/15 16:15:28 WARN
util.NativeCodeLoader: Unable to load native-hadoop library for your
platform... using builtin-java classes where applicable
ls: `.': No such file or directory
Please help me solve this error.

15/11/15 16:15:28 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable is a perpetual annoyance and not an error, so don't worry about that.
The ls: '.': No such file or directory error means that you haven't made your home directory yet, so you're trying to ls on a folder that doesn't exist. Do the following (as HDFS root user) to create your home folder. Ensure it has the correct permissions (which I guess depends on what specifically you want to do re: groups etc):
hdfs dfs -mkdir -p /user/<your-username>
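For example, on a pseudo-distributed setup, the full sequence might look like this (the username hduser below is a placeholder; substitute your own login, and run the superuser steps as whichever account started the namenode):

```shell
# Run as the HDFS superuser: create the home directory and hand it over.
hdfs dfs -mkdir -p /user/hduser              # create the home directory
hdfs dfs -chown hduser:hduser /user/hduser   # give your user ownership
hdfs dfs -chmod 755 /user/hduser             # adjust permissions to taste

# Now, as hduser, a bare listing resolves '.' to /user/hduser:
hadoop fs -ls
```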

Related

How can I solve a Hadoop error while installing on the Mac terminal?

I am getting this error in Hadoop while trying to create a directory.
2023-02-07 23:43:38,731 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
mkdir: `/BigDataFirstName': Input/output error
Please explain step by step what to do, if possible.
I tried some things from the internet, like the following, but they didn't work:
$ hadoop-daemon.sh stop namenode
$ hadoop-daemon.sh stop datanode
and then start it again.
The first line, you can ignore
The second line simply says you're unable to create that directory. You'll need to first format the namenode using hdfs namenode -format, if you haven't, and only then should you start the namenode, then datanode.
If you stop your machine without stopping the namenode cleanly, it may become corrupt and not start correctly, causing other issues when trying to use HDFS commands.
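A minimal recovery sequence, assuming a single-node setup with the Hadoop scripts on your PATH (note that reformatting erases all existing HDFS data, so only do this on a fresh or throwaway install):

```shell
# Stop anything that's running
hadoop-daemon.sh stop datanode
hadoop-daemon.sh stop namenode

# Format the namenode metadata (DESTROYS existing HDFS data)
hdfs namenode -format

# Start the daemons in order: namenode first, then datanode
hadoop-daemon.sh start namenode
hadoop-daemon.sh start datanode

# Verify both JVMs are up; NameNode and DataNode should be listed
jps
```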

I have an error with an HDFS Hadoop file and I don't know what the problem is

When I run a put:
hdfs dfs -put /Users/mariajesuscanoles/Desktop/test/word.txt /Users/mariajesuscanoles
I get this error, and I don't know how to fix it:
2022-07-03 14:39:16,022 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
put: `/Users/mariajesuscanoles': No such file or directory: `hdfs://localhost:8020/Users/mariajesuscanoles'
And I'm using a Mac.
HDFS doesn't have a default /Users folder like a Mac does.
It will only have the directories that you've created with hdfs dfs -mkdir.
Hadoop prefers you use /user/ (singular, lowercase), anyway.
/Users/mariajesuscanoles': No such file or directory:
It is just telling you that the /Users directory doesn't exist.
I suppose your assumption is that a user home directory named '/Users/mariajesuscanoles' already exists in HDFS, which isn't true.
Do a hdfs dfs -mkdir -p /Users/mariajesuscanoles and it should create this directory, and then you can try your put command.
BTW, the default prefix for user home directories in HDFS is /user, not /Users. If you want to change it, you can do so by changing the value of the config dfs.user.home.dir.prefix.
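Putting those points together, a sketch of the fix using the conventional /user prefix (filename and username taken from the question):

```shell
# Confirm which home-directory prefix your cluster uses (default: /user)
hdfs getconf -confKey dfs.user.home.dir.prefix

# Create the home directory under that prefix
hdfs dfs -mkdir -p /user/mariajesuscanoles

# Upload the local file into it
hdfs dfs -put /Users/mariajesuscanoles/Desktop/test/word.txt /user/mariajesuscanoles/

# Confirm it landed
hdfs dfs -ls /user/mariajesuscanoles
```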

why hdfs dfs commands are stuck?

I have recently installed Hadoop 3.2.0 on Linux and am trying to issue commands like hdfs dfs -ls /. But I don't see any output at all; it seems to be stuck somewhere.
I also get the warning message:
'Unable to load native-hadoop library for your platform... using builtin-java classes where applicable'
But that is all. Please let me know how to resolve this.
$ hadoop fs -ls /
no output, command stuck
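When an hdfs dfs command hangs like this, it is usually waiting on a namenode that never answers. A few diagnostic steps to try, assuming a single-node install (a sketch, not a definitive fix):

```shell
# Is a NameNode JVM actually running?
jps

# Which address is the client trying to reach?
hdfs getconf -confKey fs.defaultFS

# Can the client reach the namenode? (this hangs too if not)
hdfs dfsadmin -report

# If fs.defaultFS points at the wrong host/port, fix core-site.xml;
# if no NameNode appears in jps, check its log under $HADOOP_HOME/logs.
```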

Unable to create a directory on hdfs on mac os

I am getting the below error message when I try to create a directory on HDFS.
I installed all the required software (SSH, Java) and set all the environment variables.
Not really sure where I am going wrong.
Could anyone share your thoughts on this? Thanks.
Command used:
bin/hdfs dfs -mkdir /Users/ravitejavutukuri/input
Error:
18/06/30 22:56:11 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
mkdir: `/Users/ravitejavutukuri/input': No such file or directory
Currently I installed Hadoop 2.9.1 and I'm trying to experiment with pseudo-distributed-mode.
Try this command. It will create all the directories in the path.
bin/hdfs dfs -mkdir -p /Users/ravitejavutukuri/input
HDFS has no /Users directory (it's not a Mac-equivalent structure).
Did you mean /user?
The correct way to make a user directory for yourself would be
hdfs dfs -mkdir -p /user/$(whoami)/
hdfs dfs -chmod -R 750 /user/$(whoami)/
Then, to make an input directory, note that not giving an absolute path automatically uses your HDFS user folder:
hdfs dfs -mkdir input/
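To illustrate the relative-path behaviour, a quick sketch (sample.txt is a hypothetical local file):

```shell
hdfs dfs -mkdir -p input         # creates /user/<you>/input
hdfs dfs -put sample.txt input/  # lands at /user/<you>/input/sample.txt
hdfs dfs -ls input               # same listing as hdfs dfs -ls /user/<you>/input
```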

Hadoop is not starting any datanode

I've configured hadoop-2.2.0 on Ubuntu/Linux, but when I tried to run it via start-dfs.sh and start-yarn.sh it gave me this error:
Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
And when I go to localhost:50070/nn_browsedfscontent.jsp then it gives me the following error:
Can't browse the DFS since there are no live nodes available to redirect to.
So I followed this link to build hadoop from source but the problem still persists. Help needed!
Try hadoop-daemon.sh start namenode, then hadoop-daemon.sh start datanode, and check your browser at localhost:50070.
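That check can be sketched as the following sequence (assuming the Hadoop sbin/bin directories are on your PATH):

```shell
hadoop-daemon.sh start namenode
hadoop-daemon.sh start datanode

# Both a NameNode and a DataNode process should now appear:
jps

# The web UI should list at least one live node; probe it from the shell:
curl -s http://localhost:50070/ >/dev/null && echo "namenode UI reachable"
```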
