I am using CentOS 7. I downloaded and untarred Hadoop 2.4.0 and followed the instructions in the link Hadoop 2.4.0 setup.
I ran the following command:
./hdfs namenode -format
I got this error:
Error: Could not find or load main class org.apache.hadoop.hdfs.server.namenode.NameNode
I see a number of posts with the same error with no accepted answers and I have tried them all without any luck.
This error can occur if the necessary jar files are not readable by the user running the ./hdfs command, or are misplaced so that they can't be found by hadoop/libexec/hadoop-config.sh.
Check the permissions on the jar files under hadoop-install/share/hadoop/*:
ls -l share/hadoop/*/*.jar
and, if necessary, chmod them as the owner of the respective files to ensure they're readable. Something like chmod 644 should be sufficient to at least check whether that fixes the initial problem. For a more permanent fix, you'll likely want to run the Hadoop commands as the same user that owns all the files.
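For example, here is a quick check and fix, assuming the tarball was extracted to ~/hadoop-2.4.0 (adjust the path to your install):
# Hypothetical install location; change it to wherever you untarred Hadoop 2.4.0
cd ~/hadoop-2.4.0
# Every jar should be readable by the user running bin/hdfs
ls -l share/hadoop/*/*.jar
# Run this as the owner of the files to make any unreadable jars world-readable
find share/hadoop -name '*.jar' -exec chmod 644 {} +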
I followed the link Setup hadoop 2.4.0
and was able to get past the error message.
It seems the documentation on the Hadoop site is not complete.
I am new to Hadoop and am trying to run the WordCount example.
Things I have done so far:
Set up the Hadoop single-node cluster, referring to the link below.
http://www.bogotobogo.com/Hadoop/BigData_hadoop_Install_on_ubuntu_single_node_cluster.php
Wrote the word count program, referring to the link below:
https://kishorer.in/2014/10/22/running-a-wordcount-mapreduce-example-in-hadoop-2-4-1-single-node-cluster-in-ubuntu-14-04-64-bit/
The problem is when I execute the last command to run the program:
hadoop jar wordcount.jar /usr/local/hadoop/input /usr/local/hadoop/output
Following is the error I get -
The directory seems to be present
The file is also present in the directory with contents
Finally, on a side note, I also tried the following directory structure in the jar command.
To no avail! :/
I would really appreciate it if someone could guide me here!
Regards,
Paul Alwin
Your first image is using input from the local Hadoop installation directory, /usr
If you want to use that data on your local filesystem, you can specify file:///usr/...
Otherwise, if you're running pseudo-distributed mode, HDFS has been set up, and /usr does not exist in HDFS unless you explicitly created it there.
Based on the stack trace, I believe the error comes from the /app/hadoop/ staging directory path not existing, or from its permissions not allowing your current user to run commands against that path.
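For example (paths below are illustrative; hduser is a placeholder for your HDFS user, and the jar's manifest is assumed to name the main class, as in your command):
# Copy the input into HDFS and point the job at HDFS paths
hdfs dfs -mkdir -p /user/hduser/input
hdfs dfs -put /usr/local/hadoop/input/* /user/hduser/input/
hadoop jar wordcount.jar /user/hduser/input /user/hduser/output
# Or read and write on the local filesystem instead (the output directory must not already exist)
hadoop jar wordcount.jar file:///usr/local/hadoop/input file:///usr/local/hadoop/wordcount_out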
Suggestion: Hortonworks and Cloudera offer pre-built VirtualBox images and lots of tutorial resources. Most companies will have Hadoop from one of those vendors, so in my opinion it's better to get familiar with those than to mess around with installing Hadoop yourself from scratch.
I've installed HDP 2.2.0 on a 3-machine cluster running CentOS 6 with the help of Ambari, and got no errors during the install process. I then installed the Hive view, as described here, and the necessary Tez view, but whenever I try to use the view for a query (even a simple SHOW TABLES;), I get this error:
F080 Error in creation /user/zenuser/hive/jobs/hive-job-7-2015-07-15_10-32...
I've found nothing thus far, and I don't know where more precise logs might be stored. Any ideas?
The problem came from file permissions. One easy step toward resolving this is to install the HDFS view first and make sure the Ambari user you're using has all the necessary privileges to write to HDFS.
For more info, read this link.
Run the following as the super user (hdfs):
bin/hadoop dfs -mkdir /user/zenuser/
bin/hadoop dfs -chown zenuser /user/zenuser/
You need to set the property hive.server2.enable.doAs to false. Refer to the blog for more details.
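For reference, a minimal sketch of how that property looks in hive-site.xml (in Ambari it is typically set under the Hive service's Configs tab); restart HiveServer2 afterwards:
<!-- Run queries as the HiveServer2 service user instead of impersonating the end user -->
<property>
  <name>hive.server2.enable.doAs</name>
  <value>false</value>
</property>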
I have installed single-node Hadoop on Ubuntu 12.04. Now I am trying to install HBase (version 0.94.18) on top of it, but I get the following errors (even though I have extracted it to /usr/local/hbase):
Error: Could not find or load main class org.apache.hadoop.hbase.util.HBaseConfTool
Error: Could not find or load main class org.apache.hadoop.hbase.zookeeper.ZKServerTool
starting master, logging to /usr/lib/hbase/hbase-0.94.8/logs/hbase-hduser-master-ubuntu.out
nice: /usr/lib/hbase/hbase-0.94.8/bin/hbase: No such file or directory
cat: /usr/lib/hbase/hbase-0.94.8/conf/regionservers: No such file or directory
To resolve this error:
Download the binary version of HBase.
Edit the conf files hbase-env.sh and hbase-site.xml.
Set up the HBase home directory.
Start HBase with start-hbase.sh (see the sketch below).
Explanation of the above error:
"Could not find or load main class" means your downloaded version does not have the required jar files.
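A minimal sketch of those steps, assuming the binary tarball was extracted to /usr/local/hbase (adjust the path and version to yours):
# Example paths only
export HBASE_HOME=/usr/local/hbase
export PATH=$PATH:$HBASE_HOME/bin
# JAVA_HOME must also be set in $HBASE_HOME/conf/hbase-env.sh,
# and hbase.rootdir and related properties go in $HBASE_HOME/conf/hbase-site.xml
start-hbase.sh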
Hi, can you tell when this error occurs?
I think your environment is set up wrong.
You should enter the command below:
export HBASE_HOME="/usr/lib/hbase/hbase-0.94.18"
Then try hbase again; it should work.
If you want a shell script, you can download one from this link: https://github.com/tonyreddy/Apache-Hadoop1.2.1-SingleNode-installation-shellscript
It includes Hadoop, Hive, HBase, and Pig.
Thanks,
Tony.
It is not recommended to run HBase from the source distribution directly; instead, download the binary distribution, as mentioned on the official site. Follow the same instructions and you will get it up and running.
You could try installing version 0.94.27.
Download it from: HBase 0.94.27 download
This one worked for me.
Follow the instructions specified in:
HBase installation guide
sed "s/<\/configuration>/<property>\n<name>hbase.rootdir<\/name>\n<value>hdfs:\/\/'$c':54310\/hbase<\/value>\n<\/property>\n<property>\n<name>hbase.cluster.distributed<\/name>\n<value>true<\/value>\n<\/property>\n<property>\n<name>hbase.zookeeper.property.clientPort<\/name>\n<value>2181<\/value>\n<\/property>\n<property>\n<name>hbase.zookeeper.quorum<\/name>\n<value>'$c'<\/value>\n<\/property>\n<\/configuration>/g" -i.bak hbase/conf/hbase-site.xml
sed 's/localhost/'$c'/g' hbase/conf/regionservers -i
sed 's/#\ export\ HBASE_MANAGES_ZK=true/export\ HBASE_MANAGES_ZK=true/g' hbase/conf/hbase-env.sh -i
Just run these three commands, replacing $c with your hostname (see the sketch below).
Then try again; it should work.
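For example, you could set $c before running them (hostname -f prints this machine's fully qualified hostname):
# Set $c to this machine's hostname, then run the three sed commands above
c=$(hostname -f)
echo "$c"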
I am new to Hadoop. I am trying to set up Hadoop 2.4 on a MacBook Pro using Homebrew. I have been following the instructions on this website (http://shayanmasood.com/blog/how-to-setup-hadoop-on-mac-os-x-10-9-mavericks/). I have installed Hadoop on my machine. Now I am trying to configure it.
One needs to configure the following files according to the website.
mapred-site.xml
hdfs-site.xml
core-site.xml
hadoop-env.sh
But it seems that this information is a bit old. In Terminal, I see the following.
In Hadoop's config file:
/usr/local/Cellar/hadoop/2.4.0/libexec/etc/hadoop/hadoop-env.sh,
/usr/local/Cellar/hadoop/2.4.0/libexec/etc/hadoop/mapred-env.sh and
/usr/local/Cellar/hadoop/2.4.0/libexec/etc/hadoop/yarn-env.sh
$JAVA_HOME has been set to be the output of:
/usr/libexec/java_home
It seems that I have three files to configure here. Am I on the right track? There is configuration information for hadoop-env.sh and mapred-env.sh, but I have not seen any for yarn-env.sh. What do I have to do with this file?
The other question is how I can access these files for modification. I receive the following message in Terminal right now:
-bash: /usr/local/Cellar/hadoop/2.4.0/libexec/etc/hadoop/hadoop-env.sh: Permission denied
If you have any suggestions, please let me know. Thank you very much for taking your time.
You can find the configuration files under:
/usr/local/Cellar/hadoop/2.4.0/libexec/etc/hadoop
As for the permissions on the scripts suggested by brew, you also need to change their mode.
In the scripts directory (/usr/local/Cellar/hadoop/2.4.0/libexec/etc/hadoop/):
sudo chmod +x *.sh
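A minimal sketch, assuming Homebrew's usual layout for Hadoop 2.4.0 (adjust the version number to your install):
cd /usr/local/Cellar/hadoop/2.4.0/libexec/etc/hadoop
# Make the env scripts executable, then confirm JAVA_HOME in hadoop-env.sh
sudo chmod +x *.sh
grep JAVA_HOME hadoop-env.sh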
You should look in the hadoop/conf/ folder to amend the files below:
mapred-site.xml, hdfs-site.xml, core-site.xml
And you can change the permissions on hadoop-env.sh to make changes to that file.
Make sure that your session is over SSH. Then use the start-all.sh command to start Hadoop.
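For example (a sketch only, assuming Homebrew's usual layout with the sbin scripts under libexec):
# Passwordless SSH to localhost should succeed before starting the daemons
ssh localhost 'echo SSH to localhost works'
# start-all.sh is deprecated in 2.x but still starts HDFS and YARN
/usr/local/Cellar/hadoop/2.4.0/libexec/sbin/start-all.sh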
I just installed single-node Hadoop, but when I run it by logging in on localhost, it gives an error that it cannot make changes to files because permission is denied.
Have you followed all the steps as suggested in: http://hadoop.apache.org/common/docs/current/single_node_setup.html ?
You may want to look at this : http://getsatisfaction.com/cloudera/topics/permission_denied_error_in_desktop
Also, some more information would definitely help.
You have not granted the necessary permissions. Create a different user other than root. Follow this tutorial exactly: http://www.michael-noll.com/tutorials/running-hadoop-on-ubuntu-linux-single-node-cluster/
It seems the user is missing permissions on the directory containing the files.
Make sure that the user you are logged in as is the owner of the Hadoop directory by running
ls -la
If it is not the owner, run chown -R <hadoop-user>:<group> <hadoop-directory> and it will work fine (see the sketch below).
You can also follow the tutorial by Michael Noll:
http://www.michael-noll.com/tutorials/running-hadoop-on-ubuntu-linux-single-node-cluster/
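A quick sketch of that check and fix (hduser, hadoop, and /usr/local/hadoop are placeholders; use your own user, group, and install path):
# See who currently owns the Hadoop directory
ls -la /usr/local/hadoop
# Give ownership to the user that runs Hadoop
sudo chown -R hduser:hadoop /usr/local/hadoop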