Why does exporting an env var in a bash script not affect the environment? - bash

I want to set an environment variable in a shell script. The script's content is:
#!/bin/bash
export XDEBUG_CONFIG="idekey=PHPSTORM"
I tried both bash bin/enable_debug and bin/enable_debug. After both commands I get:
$ echo $XDEBUG_CONFIG
$
However, if I run export XDEBUG_CONFIG="idekey=PHPSTORM" directly in the CLI, it works. What's wrong with my method?

You can try running your script as below:
. bin/enable_debug
OR
source bin/enable_debug
as indicated by @Aserre. Running the script with bash bin/enable_debug (or executing it directly) starts a child process, and anything exported there disappears when that process exits; sourcing executes the script in your current shell instead.
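For example (assuming the script lives at bin/enable_debug as above):
bash bin/enable_debug        # runs in a child process; the export is lost when it exits
echo $XDEBUG_CONFIG          # prints an empty line
. bin/enable_debug           # runs in the current shell
echo $XDEBUG_CONFIG          # prints idekey=PHPSTORM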

Related

Why Can't I Set Env Variables By Running A BASH Script From An Npm Script?

I have a Node.js JavaScript project, and I would like to set a bunch of environment variables locally. I created a bash file that just exports some variables:
#!/usr/bin/env bash
export waka=flaka
export fat=booty
When I use the dot to source and run the file from the command line it works fine:
. ./env.sh
And I can see the variable has been set
echo $waka # prints "flaka"
But then I try to take this command and make it an npm script by adding it to my package.json
"scripts": {
"set-env": ". ./env.sh",
...
}
and then run it:
npm run set-env
The script is run but the environment variables are not saved:
echo $waka # prints nothing (assuming you didn't already source the file from the command line)
So, I'm wondering why it doesn't save the environment variables when run as an npm script, and whether it's possible to run the bash script from an npm script in a way such that the environment variables will be saved for the rest of the command prompt session. Thanks!
npm is not a shell command; it runs in a separate process that forks another shell in order to run the command specified by set-env. env.sh is executed, but then that shell immediately exits, at which point the changes are gone (and then npm itself exits).
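A common workaround is to source the file in the same shell invocation as the command that actually needs the variables, instead of expecting npm to modify its parent shell. A minimal sketch (the start script and index.js are assumptions, not part of the original question):
"scripts": {
  "start": ". ./env.sh && node index.js"
}
Or source it yourself in the terminal before calling npm:
. ./env.sh && npm start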

set docker-machine variables using a bash script

I have a script like so:
#!/usr/bin/env bash
eval $(docker-machine env default)
The goal is to automate the setting of variables like
export DOCKER_TLS_VERIFY
export DOCKER_HOST
export DOCKER_CERT_PATH
export DOCKER_MACHINE_NAME
But when I check afterwards, the variables are not set. This is not the case if I run each export command manually. What am I doing wrong?
export makes variables available only to the active shell session. If you want them to persist through sessions, you must add them to your bash profile:
docker-machine env default >> ~/.bash_profile
This way, the variables will be available in all future shell sessions. Make sure to restart the shell after executing the command.
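If you would rather not restart the shell, you can load the new entries into the current session immediately (a one-liner, assuming bash is your login shell):
source ~/.bash_profile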
If you want the environment set in your current shell, you need to "source" the script rather than run it.
When you run a script, the variables are set in the child bash process's environment and no longer exist once that script/process dies.
$ ./machine.sh
DOCKER_HOST is tcp://192.168.99.100:2376
$ echo "[$DOCKER_HOST]"
[]
When you source a script, the variables will be set in your current environment
$ . machine.sh
DOCKER_HOST is tcp://192.168.99.100:2376
$ echo "[$DOCKER_HOST]"
[tcp://192.168.99.100:2376]
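For reference, a machine.sh along these lines matches the output shown above; the echo line is an assumption added for demonstration, since the question's script only contained the eval:
#!/usr/bin/env bash
eval "$(docker-machine env default)"
echo "DOCKER_HOST is $DOCKER_HOST"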

How to run .profile inside a shell script in Ubuntu

I have a script which echoes the environment variables into the .profile file and then sources it, but when I run the script I get the following error.
I tried the following:
node@node-virtual-machine:~$ cat env.sh
#!/bin/bash
echo 'export JAVA_HOME=/home/node/jdk1.6.0_45' >> /home/node/.profile
echo 'export PATH=$PATH:$JAVA_HOME/bin' >> /home/node/.profile
cd /home/node/
source .profile
node@node-virtual-machine:~$ sh env.sh
sudo: source: command not found
How to execute .profile within a script?
Instead of:
sh env.sh
You should run:
bash ./env.sh
Also, instead of:
source .profile
use:
source ~/.profile
If .profile exists as /home/node/.profile, you have all that is necessary.
Your script says /bin/bash at the top but you run it with sh which is probably dash on your system. You probably are already running Bash at the prompt, so you should say source env.sh instead of sh env.sh if you want the variables to be exposed to your terminal.
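Putting the two fixes together, a session along these lines should work (paths as in the question):
bash ./env.sh       # appends the exports to /home/node/.profile
source ~/.profile   # loads them into your current shell
echo $JAVA_HOME     # prints /home/node/jdk1.6.0_45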

Setting variables when executing a command with exec

You can set a variable for a single command like this:
MY_VARIABLE=my_value ./my_script.sh
You can hand off to another script like this:
exec ./my_script.sh
But when I try to do both like this:
exec MY_VARIABLE=my_value ./my_script.sh
I get an error:
exec: MY_VARIABLE=my_value: not found
Why is that?
Is there any way to do this?
You need to use env to specify the environment variable:
exec env MY_VARIABLE=my_value ./my_script.sh
If you want your script to start with an empty environment or with only the specified variables, use the -i option.
From man env:
env - run a program in a modified environment
In bash, you can set environment variables for a command by putting the assignments at the beginning of the command. This works the same for exec as any other command, so you write:
MY_VARIABLE=my_value exec ./my_script.sh
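To see either form in action, a throwaway my_script.sh that just prints the variable is enough (the script body below is an assumption for demonstration):
#!/usr/bin/env bash
echo "MY_VARIABLE is: $MY_VARIABLE"
Either invocation then prints MY_VARIABLE is: my_value:
exec env MY_VARIABLE=my_value ./my_script.sh
MY_VARIABLE=my_value exec ./my_script.sh
Keep in mind that exec replaces the calling shell, so run these from a script or a subshell rather than in an interactive terminal you want to keep open.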

How to force ssh to execute bash instead of the user default on the remote machine?

I want to execute a bash script over ssh, but when I try this it uses ksh, which is the user's default shell.
I can't change that default.
So, how can I get ssh to execute my script with bash instead of the default shell?
Make this the first line of your script:
#!/usr/bin/env bash
Edit: As per this, the utility of /usr/bin/env is dubious. So, you probably want:
#!/bin/bash
Replace /bin/bash with the actual path of the bash executable.
You can call your script explicitly with bash:
ssh <ssh-opts> bash <scriptname>
This way a ksh is still started at login, but inside that ksh you start a bash that executes your script.
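Another option, if the script only exists on your local machine, is to feed it to a remote bash over stdin (user@remote and local_script.sh are placeholders):
ssh user@remote 'bash -s' < local_script.sh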
