I'm writing a script to toggle a program and need a way to make sh remember a variable after the script has executed and terminated. The only way I can think of is writing a daemon, but there must be a simpler way.
The code works when run in a persistent session, but not the way I intend when run as a script; the exported variables are gone once the script finishes running.
I need a toggle because I'm planning to bind the script to a key to switch between Japanese and English input.
Here's my code:
#!/bin/sh
export toggle=0
if [ $toggle = 0 ]; then
    test -z $(pgrep wlanthy) && wlanthy & disown
    export toggle=1
elif [ $toggle = 1 ]; then
    test $(pgrep wlanthy) && killall wlanthy
    export toggle=0
else
    echo error
fi
The problem was solved simply by doing this; I was overthinking it:
#!/bin/sh
if test -z "$(pgrep wlanthy)"; then
    wlanthy & disown
    exit
elif test "$(pgrep wlanthy)"; then
    killall wlanthy
    exit
else
    echo error
fi
This script is bound to a single key that toggles an IME (an input method engine, which turns English keyboard input into Japanese), meaning the script needs to both start and stop the program, hence its behaviour.
For example, press the key -> type some japanese -> press the key again -> type some english.
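If you do need a script to remember a value between separate runs, the usual workaround is a small state file rather than an exported variable. A minimal sketch in the same spirit as the script above (the state-file path is illustrative, not from the original post):

#!/bin/sh
# Keep the toggle flag in a small state file so it survives between runs
STATE_FILE="$HOME/.wlanthy_toggle"

# Read the previous value, defaulting to 0 on the first run
toggle=$(cat "$STATE_FILE" 2>/dev/null || echo 0)

if [ "$toggle" = 0 ]; then
    wlanthy & disown
    echo 1 > "$STATE_FILE"
else
    killall wlanthy
    echo 0 > "$STATE_FILE"
fi

Checking the process with pgrep, as in the accepted fix, avoids needing any stored state at all, which is why it turned out to be the simpler route.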
I am writing a simple script that exports variables based on a condition, but after running the script none of the variables are accessible. The code is as follows:
#!/bin/bash
if [[ $1 == 11 ]]; then
    echo "Loading java 11"
    export JAVA_HOME="/Library/Java/JavaVirtualMachines/jdk-11.0.11.jdk/Contents/Home"
else
    echo "Loading java 8"
    export JAVA_HOME="/Library/Java/JavaVirtualMachines/jdk1.8.0_201.jdk/Contents/Home"
fi
I am running it with ./file.sh 11 and with bash file.sh 11; both echo Loading java 11, but JAVA_HOME is not actually set afterwards.
You need to source the file with source file.sh 11 or . file.sh 11 instead. Then the commands in the script are executed in the current shell, as if typed on the command line. Otherwise bash creates a new process and your commands run within it, so the exported variables are not accessible in your session.
Also, you need to change
export $JAVA_HOME="/Library/Java/JavaVirtualMachines/jdk1.8.0_201.jdk/Contents/Home"
to
export JAVA_HOME="/Library/Java/JavaVirtualMachines/jdk1.8.0_201.jdk/Contents/Home"
removing the $ sign.
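To make the difference concrete, here is a quick sketch of running versus sourcing, assuming the script above is saved as file.sh:

# Executing runs the script in a child process; its exports die with that process
bash file.sh 11
echo "$JAVA_HOME"    # unchanged in the current shell

# Sourcing runs the same commands in the current shell, so the export sticks
. ./file.sh 11
echo "$JAVA_HOME"    # now points at the JDK 11 home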
I am trying to set environment variables in a bash script to be read by another bash script, but they are not getting set properly. I am on Ubuntu 20.04.
setting environment variables in a script:
setenv.env
export DB1_IMAGE="postgres:latest"
run it: . setenv.env
test it: echo $DB1_IMAGE
result: postgres:latest
script to test the environment variable value:
test.sh
#!/bin/bash
echo $DB1_IMAGE
if [[ $DB1_IMAGE == "postgres:latest" ]]
then
    echo "equals"
else
    echo "not equals"
fi
run the test script: . test.sh
result:
postgres:latest
not equals
now set the environment variable with command line:
export DB1_IMAGE="postgres:latest"
now run the test script again: . test.sh
result:
postgres:latest
equals
Summary: when an environment variable is set from a bash script, its value fails an equals comparison in another bash script. When that same environment variable is set on the command line, it passes the equals test. I can't explain why. I feel like I'm missing something obvious. How could the == test fail? Are there unprintable characters being inserted somehow? Please help.
Thanks to @glennjackman, the cause of this was that the script file (setenv.env) was DOS-formatted rather than UNIX-formatted. This means it had \r\n line breaks, so a hidden carriage-return character was appended to the environment variable's value. The fix is to run dos2unix on the file (sudo apt install dos2unix).
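For reference, a quick way to confirm and fix this kind of problem (using the file and variable names from the question):

# Show line endings: DOS-formatted lines end with ^M$ instead of just $
cat -A setenv.env

# Print the variable's value with escapes visible; a trailing $'\r' gives it away
printf '%q\n' "$DB1_IMAGE"

# Convert the file in place, then re-source it
dos2unix setenv.env
. setenv.env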
This question already has answers here:
is it possible to use variables in remote ssh command?
(2 answers)
Closed 4 years ago.
In a bash script I try to do:
ssh -n $username@server2 "rm ${delete_file}"
but always get the error:
rm: missing operand
When I run
echo $delete_file
I get the correct path:
/var/www/site/myfile.txt
What am I doing wrong?
Could it be that in your case, $delete_file is set on the remote host and not on your current machine?
If you want $delete_file to be expanded on the remote side (i.e., after ssh'ing into server2), you have to use single quotes:
ssh -n $username@server2 'rm ${delete_file}'
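To make the quoting distinction concrete, a small sketch (the host and path are from the question; the user name is illustrative):

delete_file=/var/www/site/myfile.txt

# Double quotes: ${delete_file} expands locally, before ssh runs,
# so the remote shell receives:  rm /var/www/site/myfile.txt
ssh -n user@server2 "rm ${delete_file}"

# Single quotes: the literal text ${delete_file} is sent to the remote shell,
# which expands it there; it is empty unless it is set on server2
ssh -n user@server2 'rm ${delete_file}'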
Other than that, do you set the value of delete_file in the same script (before ssh'ing), or before invoking your script? If the latter is the case, it can't work as written: plain shell variables are not propagated to scripts called from the current shell/session unless they are exported.
You could do one of the following instead:
delete_file=<your-value> ./ssh-script
or:
delete_file=<your-value>
export delete_file
./ssh-script
As it turns out, this last point was the problem. Let me elaborate on best practices:
Better than setting environment variables would be the usage of positional parameters.
#!/bin/bash
# $1: file to delete
delete_file=${1:?Missing parameter: which file for deletion?}
ssh -n $username@server2 "rm ${delete_file}"
Usage of the script is now as simple as:
./ssh-script <your-file-for-deletion>
This way, you don't have to remember exactly which variable the script expects when calling it; simply call the script with a positional parameter.
As a bonus, the example uses parameter expansion to check for not-set or empty parameters:
delete_file=${1:?Missing parameter: which file for deletion?}
Whenever $1 happens to be unset or empty, the script exits immediately with exit code 1 and prints the given message to stderr.
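A quick illustration of that expansion in action (the exact error wording and line number may vary with your bash version):

$ ./ssh-script
./ssh-script: line 3: 1: Missing parameter: which file for deletion?
$ echo $?
1
$ ./ssh-script /var/www/site/myfile.txt   # proceeds with the rm on the remote host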
Recently I wrote a script which sets an environment variable; take a look:
#!/bin/bash
echo "Pass a path:"
read path
echo $path
defaultPath=/home/$(whoami)/Desktop
if [ -n "$path" ]; then
    export my_var=$path
else
    echo "Path is empty! Exporting default path ..."
    export my_var=$defaultPath
fi
echo "Exported path: $my_var"
It works just great, but the problem is that my_var is only available locally, i.e. in the console window where I ran the script.
How do I write a script which allows me to export a global environment variable that can be seen everywhere?
Just run your shell script preceded by "." (dot space).
This causes the script to run its instructions in the original shell, so the variables still exist after the script finishes.
Ex:
$ cat setmyvar.sh
export myvar=exists
$ . ./setmyvar.sh
$ echo $myvar
exists
Each and every shell has its own environment. There's no Universal environment that will magically appear in all console windows. An environment variable created in one shell cannot be accessed in another shell.
It's even more restrictive. If one shell spawns a subshell, that subshell has access to the parent's environment variables, but if that subshell creates an environment variable, it's not accessible in the parent shell.
If all of your shells need access to the same set of variables, you can create a startup file that will set them for you. This is done in BASH via the $HOME/.bash_profile file (or through $HOME/.profile if $HOME/.bash_profile doesn't exist) or through $HOME/.bashrc. Other shells have their own set of startup files. One is used for logins, and one is used for shells spawned without logins (and, as with bash, a third for non-interactive shells). See the manpage to learn exactly which startup scripts are used and in what order they're executed.
You can try using shared memory, but I believe that only works while processes are running, so even if you figured out a way to set a piece of shared memory, it would go away as soon as that command is finished. (I've rarely used shared memory except for named pipes). Otherwise, there's really no way to set an environment variable in one shell and have another shell automatically pick it up. You can try using named pipes or writing that environment variable to a file for other shells to pick it up.
Imagine the problems that could happen if someone could change the environment of one shell without my knowledge.
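For completeness, a rough sketch of the named-pipe handoff mentioned above (the pipe path is illustrative):

# Shell A: create a named pipe and write the value into it
# (the write blocks until a reader opens the other end)
mkfifo /tmp/myvar.fifo
echo "$MY_VAR" > /tmp/myvar.fifo

# Shell B: read the value out of the pipe into its own variable
MY_VAR=$(cat /tmp/myvar.fifo)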
Actually I found a way to achieve this (which in my case was to use a bash script to set a number of security credentials).
I just call bash from inside the script, and the spawned shell now has the exported values:
export API_USERNAME=abc
export API_PASSWORD=bbbb
bash
Now calling the file using ~/.app-x-setup.sh will give me an interactive shell with those environment variables set up.
The following is extracted from the second paragraph of David W.'s answer: "If one shell spawns a subshell, that subshell has access to the parent's environment variables, but if that subshell creates an environment variable, it's not accessible in the parent shell."
In case you need to let the parent shell access your new environment variables, just issue the following command in the parent shell:
source <your_subshell_script>
or using the shortcut
. <your_subshell_script>
You have to add the variable to your .profile, located at /home/$USER/.profile.
You can do that with this command:
echo 'TEST="hi"' >> $HOME/.profile
Or by editing the file with emacs, for example.
If you want to set this variable for all users, you have to edit /etc/profile (as root).
There is no global environment, really, in UNIX.
Each process has an environment, originally inherited from the parent, but it is local to the process after the initial creation.
You can only modify your own, unless you go digging around in the process using a debugger.
Write it to a temporary file, let's say ~/.myglobalvar, and read it from anywhere:
echo "$myglobal" > ~/.myglobalvar
Environment variables are always "local" to a process's execution; the export command allows setting environment variables for subprocesses. You can look at .bashrc to set environment variables at the start of a bash shell. What you are trying to do is not possible, as a process cannot modify (or access) the environment variables of another process.
You can update the ~/.bashrc or ~/.bash_profile file which is used to initialize the environment.
Take a look at the loading behavior of your shell (explained in the manpage, usually referring to .XXXshrc or .profile). Some configuration files are loaded at login time of an interactive shell, some are loaded each time you run a shell. Placing your variable in the latter might give you the behavior you want, e.g. always having the variable set when using that particular shell (for example bash).
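For example, with bash you could append the export to ~/.bashrc so every new shell picks it up (the value shown is illustrative):

# Every new interactive bash picks this up on startup
echo 'export my_var=/home/user/Desktop' >> ~/.bashrc

# Apply it to the current shell without opening a new one
source ~/.bashrc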
If you need to dynamically set and reference environment variables in shell scripts, there is a workaround. Judge for yourself whether it is worth doing, but here it is.
The strategy involves having a 'set' script which dynamically writes a 'load' script; the 'load' script contains code to set and export an environment variable, and is then executed periodically by other scripts which need to reference the variable. BTW, the same strategy could be done by writing and reading a file instead of a variable.
Here's a quick example...
Set_Load_PROCESSING_SIGNAL.sh
#!/bin/bash
PROCESSING_SIGNAL_SCRIPT=./Load_PROCESSING_SIGNAL.sh
echo "#!/bin/bash" > $PROCESSING_SIGNAL_SCRIPT
echo "export PROCESSING_SIGNAL=$1" >> $PROCESSING_SIGNAL_SCRIPT
chmod ug+rwx $PROCESSING_SIGNAL_SCRIPT
Load_PROCESSING_SIGNAL.sh (this gets dynamically created when the above is run)
#!/bin/bash
export PROCESSING_SIGNAL=1
You can test this with
Test_PROCESSING_SIGNAL.sh
#!/bin/bash
PROCESSING_SIGNAL_SCRIPT=./Load_PROCESSING_SIGNAL.sh
N=1
LIM=100
while [ $N -le $LIM ]
do
    # DO WHATEVER LOOP PROCESSING IS NEEDED
    echo "N = $N"
    sleep 5
    N=$(( $N + 1 ))
    # CHECK PROCESSING_SIGNAL
    source $PROCESSING_SIGNAL_SCRIPT
    if [[ $PROCESSING_SIGNAL -eq 0 ]]; then
        # Write log info indicating that the signal to stop processing was detected
        # Write out all relevant info
        # Send an alert email of this too
        # Then exit
        echo "Detected PROCESSING_SIGNAL for all stop. Exiting..."
        exit 1
    fi
done
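Putting the pieces together, usage could look like this (a sketch; the running loop only notices a changed signal on its next iteration):

# Write an initial Load script with the signal enabled, then start the worker
./Set_Load_PROCESSING_SIGNAL.sh 1
./Test_PROCESSING_SIGNAL.sh &

# Later, from any other shell, flip the signal to request a stop
./Set_Load_PROCESSING_SIGNAL.sh 0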
A lazy script, kept in ~/.bin/SOURCED/, to save and load data as flat files for the system:
# Directory where values are persisted as flat files
[ ! -d ~/.megadata ] && mkdir ~/.megadata

function save_data {
    [ -z "$1" -o -z "$2" ] && echo 'save_data [:id:] [:data:]' && return
    local overwrite=${3-false}
    [ "$overwrite" = 'true' ] && echo "$2" > ~/.megadata/$1 && return
    [ ! -f ~/.megadata/$1 ] && echo "$2" > ~/.megadata/$1 || echo ID TAKEN set third param to true to overwrite
}

save_data computer engine
cat ~/.megadata/computer
save_data computer engine
save_data computer megaengine true

function get_data {
    [ -z "$1" -o -f $1 ] && echo 'get_data [:id:]' && return
    [ -f ~/.megadata/$1 ] && cat ~/.megadata/$1 || echo ID NOT FOUND
    :
}

get_data computer
get_data computer
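Once the function definitions are sourced (for example from ~/.bashrc), saved values survive across unrelated shells and script runs. A usage sketch, assuming the snippet above is saved as ~/.bin/SOURCED/megadata.sh (the file name is illustrative):

# In any shell or script:
. ~/.bin/SOURCED/megadata.sh
save_data api_token abc123

# Later, in a completely different terminal or script:
. ~/.bin/SOURCED/megadata.sh
get_data api_token    # prints abc123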
Maybe a little off topic, but for the case where you really just need to set a variable temporarily to execute some script, and you ended up here looking for answers:
If you need to run a script with certain environment variables that you don't need to keep after execution you could do something like this:
#!/usr/bin/env sh
export XDEBUG_SESSION=$(hostname);echo "running with xdebug: $XDEBUG_SESSION";"$@"
In my example I just use XDEBUG_SESSION with the hostname, but you can use multiple variables; keep them separated with a semicolon. Execution is as follows (assuming you called the script debug.sh and placed it in the same directory as your PHP script):
$ debug.sh php yourscript.php
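A variant with more than one temporary variable, as mentioned above (the second variable is purely illustrative):

#!/usr/bin/env sh
# Both variables are exported only for the command this script executes
export XDEBUG_SESSION=$(hostname);export APP_ENV=dev;echo "running with xdebug: $XDEBUG_SESSION";"$@"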