I am learning the command line and trying to customize my environment to have Powerline functionality.
I have installed Powerline from GitHub and am trying to add it to my PATH variable.
My .bash_profile is set as follows:
export PATH=$PATH:$HOME/Library/Python/2.7/bin
export PATH=$PATH:$HOME/Library/Python/2.7/binpowerline-daemon -qPOWERLINE_BASH_CONTINUATION=1POWERLINE_BASH_SELECT=1. /Users/johnmyers/Library/Python/2.7/lib/python/site-packages/powerline/bindings/bash/powerline.sh
I receive the following error messages when launching Terminal.
-bash: export: `-qPOWERLINE_BASH_CONTINUATION=1POWERLINE_BASH_SELECT=1': not a valid
identifier
-bash: export: `/Users/johnmyers/Library/Python/2.7/lib/python/site-packages/powerline/bindings/bash/powerline.sh':
not a valid identifier
I would appreciate any guidance in the right direction on this issue. Thank you.
The stuff you added needs to be broken up onto individual lines, with newlines between them. Because everything ran together onto one line, export treats the extra words as variable names to export, which is why bash complains that they are not valid identifiers. I can only guess where the newlines are supposed to go, but something like this:
export PATH=$PATH:$HOME/Library/Python/2.7/bin
# No need to repeat this
# export PATH=$PATH:$HOME/Library/Python/2.7/bin
powerline-daemon -q
POWERLINE_BASH_CONTINUATION=1
POWERLINE_BASH_SELECT=1
. /Users/johnmyers/Library/Python/2.7/lib/python/site-packages/powerline/bindings/bash/powerline.sh
When I open my terminal, it shows this line:
bash: export: `/home/mohin/.bashrc': not a valid identifier
mohin@mohin:~$
I am using Ubuntu 16.04.
I used these commands, and after that I've been facing this problem:
echo 'export PATH="$HOME/.composer/vendor/bin:$PATH"' >> ~/.bashrc
source ~/.bashrc
I got these lines from a Medium post at the following link:
https://medium.com/#rgdev/how-to-install-laravel-5-4-on-ubuntu-16-04-from-scratch-quickly-29375e18e7ca
I need to solve it. Please help me.
I just fixed this issue with the following process.
I ran this command:
mohin@mohin:~$ grep -i export ~/.bashrc
After that, it showed the following output:
#export GCC_COLORS='error=01;31:warning=01;35:note=01;36:caret=01;32:locus=01:quote=01'
export PATH="$HOME/.composer/vendor/bin:$PATH"
export PATH="$HOME/.composer/vendor/bin:$PATH"
export PATH="$HOME/.composer/vendor/bin:$PATH" source /home/mohin/.bashrc
export PATH="$HOME/.composer/vendor/bin:$PATH"
export PATH="$PATH:$HOME/.composer/vendor/bin"
export PATH="$PATH:$HOME/.config/composer/vendor/bin"
Here is where I found the issue. I mean this line:
export PATH="$HOME/.composer/vendor/bin:$PATH" source
/home/mohin/.bashrc
I opened .bashrc and removed that line using the following command:
sudo nano /home/mohin/.bashrc
That's it. Issue fixed.
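As a general sanity check afterwards (just a sketch, not part of the original fix), look at the tail of the file whenever you append to it, so you catch lines that have run together:
tail -n 5 ~/.bashrc
grep -n source ~/.bashrc
Each export and each source call must sit on its own line; when they share a line, export treats the extra words as variable names and reports them as not valid identifiers.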
echo $JAVA_HOME
gives me /usr/lib/jvm/java-8-oracle
and I have export JAVA_HOME= /usr/lib/jvm/java-8-oracle in my /usr/local/hadoop/etc/hadoop/hadoop-env.sh.
However, when I run /usr/local/hadoop/bin/hadoop I get the following error:
/usr/local/hadoop/etc/hadoop/hadoop-env.sh: line 25: export: `/usr/lib/jvm/java-8-oracle': not a valid identifier
Error: JAVA_HOME is not set and could not be found.
I thought I had my JAVA_HOME set correctly. Can anyone tell me where I went wrong?
Thanks.
Remove that space between the equals sign and the path in your export. Shell scripting can be a little finicky about that kind of thing.
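For example, assuming the path from the question, the line in hadoop-env.sh should read (no space after the equals sign):
export JAVA_HOME=/usr/lib/jvm/java-8-oracle
With the space, the shell assigns JAVA_HOME an empty value and then tries to export /usr/lib/jvm/java-8-oracle as a separate variable name, which is exactly the "not a valid identifier" error above.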
This must have happened because JAVA_HOME is defined multiple times. Please check for it in your .bashrc or .profile file.
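A quick way to look (a sketch; the files to check may differ on your system):
grep -n JAVA_HOME ~/.bashrc ~/.profile /usr/local/hadoop/etc/hadoop/hadoop-env.sh
Keep a single correct definition and remove or comment out the duplicates.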
I understand this question might have been answered already, but my issue is still here:
I have a VM created for Hadoop on VMware using CentOS 7. I can start the namenode and datanode; however, when I try to view HDFS files using the following command:
hdfs dfs -ls
it throws the error below:
Could not find or load main class org.apache.hadoop.fs.FsShell
My Google searches suggest this might be related to the Hadoop variable settings in bash. Here are my settings:
# .bashrc
# Source global definitions
if [ -f /etc/bashrc ]; then
. /etc/bashrc
fi
export HADOOP_HOME=/opt/hadoop/hadoop-2.7.2
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_HDFS_HOME=$HADOOP_HOME
export HADOOP_YARN_HOME=$HADOOP_HOME
export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
export HADOOP_PREFIX=$HADOOP_HOME
export HIVE_HOME=/opt/hadoop/hive
export PATH=$HIVE_HOME/bin:$PATH
export ANT_HOME=/usr/local/apache-ant-1.9.7
export PATH=${PATH}:${JAVA_HOME}/bin
export PIG_HOME=/opt/hadoop/pig-0.15.0
export PIG_HADOOP_VERSION=0.15.0
export PIG_CLASSPATH=$HADOOP_HOME/etc/hadoop
export PATH=$PATH:$PIG_HOME/bin
export PATH=$PATH:$HADOOP_HOME/bin
export HADOOP_USER_CLASSPATH_FIRST=true
export SQOOP_HOME=/usr/lib/sqoop
export PATH=$PATH:$SQOOP_HOME/bin
export HADOOP_CLASSPATH=$HADOOP_HOME/share/hadoop/common/
export PATH=$PATH:$HADOOP_CLASSPATH
# Uncomment the following line if you don't like systemctl's auto-paging feature:
# export SYSTEMD_PAGER=
# User specific aliases and functions
I checked my hadoop folder: /opt/hadoop/hadoop-2.7.2/share/hadoop/common, here is the list:
I am doing this practice using the root account. Can anyone help me find the cause of this issue and fix it? Thank you very much.
This typically happens when you have multiple instances of Hadoop. Check which hadoop and see if it is pointing to the version you have installed.
Say it points to /usr/bin/hadoop and not /your-path/hadoop; then you can point /usr/bin/hadoop to that (with a symlink).
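A hedged sketch of that symlink fix (paths are examples; substitute your actual install location):
# point the stray /usr/bin/hadoop at the copy you actually installed
ln -sf /opt/hadoop/hadoop-2.7.2/bin/hadoop /usr/bin/hadoop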
As Filmon Gebreyesus pointed out, this can happen when you have multiple Hadoop instances. First, check what you have in $PATH. There should be a path to hadoop/bin there.
If it is still not working, run whereis hdfs and check the output. If there is an hdfs binary that should not be there, remove or move it.
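For instance (a quick diagnostic sketch, not specific to any one setup):
# which binaries win on the PATH, and are there stray copies?
which hadoop
whereis hdfs
echo "$PATH" | tr ':' '\n' | grep -i hadoop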
Try to unset $HADOOP_COMMON_HOME
unset HADOOP_COMMON_HOME
I'm working with Ubuntu 12.04 LTS.
I'm going through the hadoop quickstart manual to make a pseudo-distributed operation. It seems simple and straightforward (easy!).
However, when I try to run start-all.sh I get:
localhost: Error: JAVA_HOME is not set.
I've read all the other advice on stackoverflow for this issue and have done the following to ensure JAVA_HOME is set:
In /etc/hadoop/conf/hadoop-env.sh I have set
JAVA_HOME=/usr/lib/jvm/java-6-oracle
export JAVA_HOME
In /etc/bash.bashrc I have set
JAVA_HOME=/usr/lib/jvm/java-6-oracle
export JAVA_HOME
PATH=$PATH:$JAVA_HOME/bin
export PATH
which java returns:
/usr/bin/java
java -version works
echo $JAVA_HOME returns:
/usr/lib/jvm/java-6-oracle
I've even tried becoming root and explicitly typing the following in the terminal:
$ JAVA_HOME=/usr/lib/jvm/java-6-oracle
$ export JAVA_HOME
$ start-all.sh
If you could show me how to resolve this error it would be greatly appreciated.
I'm thinking that my JAVA_HOME is being overridden somehow. If that is the case, could you explain to me how to make my exports global?
I am using hadoop 1.1, and faced the same problem.
I got it solved by changing the JAVA_HOME variable in /etc/hadoop/hadoop-env.sh to:
export JAVA_HOME=/usr/lib/jvm/<jdk folder>
The way to solve this problem is to export the JAVA_HOME variable inside the conf/hadoop-env.sh file.
It doesn't matter if you already exported that variable in ~/.bashrc, it'll still show the error.
So edit conf/hadoop-env.sh and uncomment the line "export JAVA_HOME" and add a proper filesystem path to it, i.e. the path to your Java JDK.
# The Java implementation to use. Required.
export JAVA_HOME="/path/to/java/JDK/"
Ran into the same issue on Ubuntu 16.04 LTS. Running bash -vx ./bin/hadoop showed that it tests whether JAVA_HOME is a directory. So I changed JAVA_HOME to a folder and it worked.
++ [[ ! -d /usr/bin/java ]]
++ hadoop_error 'ERROR: JAVA_HOME /usr/bin/java does not exist.'
++ echo 'ERROR: JAVA_HOME /usr/bin/java does not exist.'
ERROR: JAVA_HOME /usr/bin/java does not exist.
So I changed JAVA_HOME in ./etc/hadoop/hadoop-env.sh to
export JAVA_HOME=/usr/lib/jvm/java-8-oracle/jre/
and hadoop starts fine.
The way to debug this is to put an "echo $JAVA_HOME" in start-all.sh. Are you running your hadoop environment under a different username, or as yourself? If the former, it's very likely that the JAVA_HOME environment variable is not set for that user.
The other potential problem is that you have specified JAVA_HOME incorrectly, and the value that you have provided doesn't point to a JDK/JRE. Note that "which java" and "java -version" will both work, even if JAVA_HOME is set incorrectly.
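A minimal sketch of that check (add it temporarily near the top of start-all.sh, or run it in the same shell and as the same user that launches Hadoop):
echo "JAVA_HOME is: $JAVA_HOME"
ls "$JAVA_HOME/bin/java"    # should exist if JAVA_HOME really points at a JDK/JRE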
Extract from etc/hadoop/hadoop-env.sh:
The only required environment variable is JAVA_HOME. All others are
optional. When running a distributed configuration it is best to
set JAVA_HOME in this file, so that it is correctly defined on
remote nodes.
This means it is better, and advised, to set JAVA_HOME here, even though the existing definition reads the JAVA_HOME variable. Perhaps it is not picking up the value of JAVA_HOME that was set previously; the standard Apache manual does not mention this. :(
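For instance (an illustrative line only; point it at whatever JDK you actually have installed):
# in etc/hadoop/hadoop-env.sh, hard-code the JDK path instead of re-reading the environment
export JAVA_HOME=/usr/lib/jvm/java-8-oracle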
This error is coming from Line 180
if [[ -z $JAVA_HOME ]]; then
echo "Error: JAVA_HOME is not set and could not be found." 1>&2
exit 1
fi
in libexec/hadoop-config.sh.
Try echo $JAVA_HOME in that script. If it comes up empty,
Find your JAVA_HOME using this:
$(readlink -f /usr/bin/javac | sed "s:/bin/javac::")
and replace the line
export JAVA_HOME=${JAVA_HOME}
in /etc/hadoop/hadoop-env.sh with the JAVA_HOME you got from the above command.
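Put together, a hedged sketch of that fix (the sed edit is just one way to apply it; editing the file by hand works just as well):
# derive the JDK directory from the javac symlink
JDK_DIR=$(readlink -f /usr/bin/javac | sed "s:/bin/javac::")
# rewrite the JAVA_HOME line in hadoop-env.sh to point at the real path
sed -i "s|^export JAVA_HOME=.*|export JAVA_HOME=$JDK_DIR|" /etc/hadoop/hadoop-env.sh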
I also faced a similar problem in Hadoop 1.1.
I had not noticed that the JAVA_HOME was commented in: hadoop/conf/hadoop-env.sh
It was
#JAVA_HOME=/usr/lib/jvm/java-6-oracle
Had to change it to
JAVA_HOME=/usr/lib/jvm/java-6-oracle
Regardless of Debian or any other Linux flavor, just know that ~/.bash_profile belongs to a specific user and is not system-wide.
In a pseudo-distributed environment Hadoop works on localhost, so the $JAVA_HOME in .bash_profile is of no use anymore.
Just export JAVA_HOME in ~/.bashrc and use it system-wide.
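Something like this in ~/.bashrc (the JDK path is taken from the question as an example, not a prescription):
export JAVA_HOME=/usr/lib/jvm/java-6-oracle
export PATH=$PATH:$JAVA_HOME/bin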
Check whether your alternatives entry is pointing to the right one; you might actually be pointing to a different version and trying to alter the hadoop-env.sh of another installed version.
alternatives --install /etc/hadoop/conf [generic_name] [your correct path] [priority]   (for further details, check the man page of alternatives)
To set alternatives manually:
alternatives --set [generic_name] [your correct path]
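For example, with a hypothetical generic name hadoop-conf and an example configuration path (substitute your own):
alternatives --install /etc/hadoop/conf hadoop-conf /opt/hadoop/hadoop-2.7.2/etc/hadoop 50
alternatives --set hadoop-conf /opt/hadoop/hadoop-2.7.2/etc/hadoop
alternatives --display hadoop-conf   # confirm which path is active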
Change the JAVA_HOME variable in conf/hadoop-env.sh
export JAVA_HOME=/etc/local/java/<jdk folder>
echo "export JAVA_HOME=/usr/lib/java" >> $HADOOP_HOME/etc/hadoop/hadoop-env.sh
Notice: Do not use export JAVA_HOME=${JAVA_HOME}! That only re-reads an environment variable which may be empty in the context where the script runs.
I put it on the first line of ~/.bashrc, and then it works well!
export JAVA_HOME=/usr/lib/jvm/default-java