How to nicely manipulate environment variables via bash scripts?

I want to add some helper commands to my shell. There are several commands I want to add, and they need to share some information between them. However, since I want a different state for each shell, I can't use files to store the shared information, but have to use environment variables.
This opens up the problem of setting environment variables: to change a variable in my shell and not only in a subprocess, I either need to put my commands in scripts and always source the scripts, or define them as functions and source the file via .bashrc.
I have also defined some auxiliary functions that are used by several of my commands, which I would prefer NOT to have in the scope of my main shell process.
I'm somewhat inexperienced with bash, so my question is:
What is the cleanest way to implement this? Should I put my commands into scripts or into functions? Can I prevent my auxiliary functions from being sourced into the main shell? Is there an easier way to manipulate environment variables?
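For concreteness, the kind of setup I have in mind looks roughly like this (all file, function, and variable names are made up): the user-facing command is a function sourced via .bashrc so it can modify the current shell, while the auxiliary work lives in a separate script run as a subprocess, so its helper functions never enter the interactive shell:
# ~/bin/helpers.sh -- a plain script, run as a subprocess, so nothing here
# leaks into the interactive shell
compute_state() {
    echo "state-$(date +%s)"   # stand-in for the real shared information
}
compute_state

# ~/bin/my_commands.sh -- sourced from ~/.bashrc so the function can modify
# the current shell's environment
my_set_state() {
    export MY_TOOL_STATE="$("$HOME/bin/helpers.sh")"
}

# in ~/.bashrc:
# . ~/bin/my_commands.sh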

You can put your environment variables inside a shell file (myEnv.sh). Then you can use
source myEnv.sh
to load your environment variables whenever you need them.
You can also source this file from inside your main shell scripts.
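For illustration only (the variable names are placeholders), myEnv.sh could be as simple as:
# myEnv.sh
export PROJECT_ROOT="$HOME/projects/demo"
export BUILD_MODE="debug"
and then, in any interactive shell or script:
source myEnv.sh
echo "$PROJECT_ROOT"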

I would recommend using .profile files instead of .sh files:
case1.profile, case2.profile, and so on, and sourcing them whenever needed.
Use either of the methods below to source a file:
source ~/.case1.profile
or
. ~/.case1.profile
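For example (values entirely made up), ~/.case1.profile might hold one configuration and ~/.case2.profile another, and you switch by sourcing the one you need:
# ~/.case1.profile
export DB_HOST="localhost"
export DB_PORT="5432"

. ~/.case1.profile   # activate this configuration in the current shell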

Related

not able to source a shell / bash file within the .env file

I have tried so many variations:
using source
putting nothing in front of the path
using
# set -a # automatically export all variables
'./../private/some-cred.sh'
# set +a
None of them lets me import my private credentials into the .env file that will be run by fastlane via Dotenv.overload './config/.env.qa'
Current files:
./config/.env.qa
# source './../private/fastlane-cred.sh'
# APPLE_API_KEY_ID=$APPLE_API_KEY_ID_PRIVATE
./private/some-cred.sh
export APPLE_API_KEY_ID_PRIVATE="AAA"
When this setup is loaded into Bitrise (which runs in ./ios/) via Dotenv.overload './config/.env.qa', the value is treated as empty.
Any idea what else I should do to load variables from a file into the .env file properly so they are recognized?
NOTE: the path is correct, because a file reference defined inside the .env file was loaded into the Bitrise env vars correctly; it is only the sourced variables that don't come through.
What is important to know about the "dotenv" paradigm is that the various "dotenv" implementations mimic a shell environment instead of actually being a shell environment. Since they run from inside the application, which is written in a non-shell language, it is far too late at that point to use the shell to set up environment variables. The "dotenv" libraries aren't even actually changing the process's environment - they just use the runtime's API to store data in such a way that other code using the runtime's environment-access API sees it as if it came from the environment.
The fastlane system uses the dotenv gem which uses an internal parser (based on regular expressions) to parse a file that looks like a shell file (containing only variable assignment) but isn't an actual shell file. Everything there that doesn't look like a naive shell variable assignment is ignored - the dotenv file isn't actually a shell script and isn't treated like a shell script, so you can't put shell scripting commands in it and expect it to work.
If you really want to use shell scripting to set up your environment, then use a wrapper shell script (or a Makefile) to start your environment instead of relying on the application's internal "dotenv" support, which, as explained above, isn't actually a shell script.
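A minimal sketch of such a wrapper (the wrapper file name and the fastlane lane name qa are placeholders, and the paths are relative to wherever you run the wrapper from):
#!/bin/bash
# run-qa.sh -- set up the environment in a real shell, then hand over to fastlane
set -a                        # auto-export every variable assigned from here on
. ./private/some-cred.sh      # defines APPLE_API_KEY_ID_PRIVATE
APPLE_API_KEY_ID="$APPLE_API_KEY_ID_PRIVATE"
set +a
exec fastlane qa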

Save global variables in BASH

I am new to bash and trying to solve some issues in a script I'm writing.
I am at the terminal under my user name and start bash:
USER$
USER$ bash
bash$
Now, inside this bash session, I set some variables, for example:
i=2
k=2
let p=$k*$i
Now I want to use those variables outside the bash session:
bash$ exit
USER$
but the variables are no longer there.
I tried using export, but it did not really work. I could use your help, thanks.
Not possible. You cannot set environment variables in a parent process like this.
Unlike a DOS batch file, a Unix shell script cannot directly affect the environment of its calling shell.
You could consider using the . (dot) or source command to read and execute the script in the context of the calling shell. This means that changes made in the script do affect the environment (in general; you can still run into issues with sub-shells).
The other alternative is to have the script that sets the variables write the values in name=value format into a file which the calling script then reads (with . or source again).
The conventional solution is to add the settings to your .profile or .bashrc -- which one you should use depends on your specific needs and your local Bash configuration; my first recommendation would be .profile, but then you have to avoid any bashisms because this file is shared with sh (so, no let, for example).
For more specific needs, put the commands in a file, and source it when you need it. You might also want to create a simple script to update the file with your current values.
# source this file to save the current values of i, k, and p into $HOME/stuff
cat <<HERE >"$HOME/stuff"
i='$i'
k='$k'
p='$p'
export i k p
HERE
The syntax here is quite simple, but assumes you don't have values which can contain single quotes or otherwise free-form content. How to safely store arbitrary values which you don't have complete control over is a much more complex discussion; I am providing a simple solution for the basic use case where you merely need to save a few simple scalar values, like numbers.
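Later, in the same shell or a new one, the saved values can be read back in the usual way:
. "$HOME/stuff"
echo "$p"    # the saved value is available again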
To keep your variables when you connect to a remote system, look at the documentation for the tool you are using to connect. For example, ssh has configuration options for importing environment variables from the local system when starting a remote session.
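For example, with OpenSSH you can ask the client to pass a variable along (MY_TOOL_STATE and the host are placeholders, and the server must also list the variable in AcceptEnv in its sshd_config for this to work):
ssh -o SendEnv=MY_TOOL_STATE user@remote.example.com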

Environment Variable Creation for All Unix Shells

I'm trying to add some export statements in my Unix shell script and up to this point I've only gotten it to work with the bash shell. Is there a way to make the below export apply in all shells using shell scripting?
AXIS2_HOME=/home/user/axis2-1.6.0
export AXIS2_HOME
What do you mean by "all shells"?
If you mean different shells as in "can I change my parent/sibling shell's environment?",
then no, you can't. Exporting a variable does mean all your child processes inherit it, though.
You can go some way towards faking it by having your script create a temp file that you somehow get the caller to execute, but it starts to get a bit weird and suggests a problem in your architecture.
If you mean different shells as in sh/bash/csh/tcsh/zsh/ksh etc.:
You can make something like that work in all "sh"-flavour shells, but for "csh" flavours you need to use setenv.
Depending on how far you want to go, you could write something to store all your env vars in a separate file (e.g. env.dat) and convert that to sh/csh syntax using sed/awk/perl.
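A rough sketch of that idea, assuming env.dat holds one NAME=value pair per line and the values contain no '=' or double quotes:
# generate Bourne-style syntax for sh/bash/ksh/zsh
awk -F= '{ printf "export %s=\"%s\"\n", $1, $2 }' env.dat > env.sh
# generate csh/tcsh syntax
awk -F= '{ printf "setenv %s \"%s\"\n", $1, $2 }' env.dat > env.csh
Then sh-flavour shells run ". env.sh" and csh-flavour shells run "source env.csh".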

How do I set bash environment variables from a script?

I have some proxy settings that I only occasionally want to turn on, so I don't want to put them in my ~/.bash_profile. I tried putting them directly in ~/bin/set_proxy_env.sh, adding ~/bin to my PATH, and chmod +xing the script but though the script runs, the variables don't stick in my shell. Does anyone know how to get them to stick around for the rest of the shell session?
Use one of:
source <file>
. <file>
In the script use
export varname=value
and also execute the script with:
source set_proxy_env.sh
The export keyword ensures the variable is marked for automatic inclusion in the environment of subsequently executed commands. Using source to execute a script starts it with the present shell instead of launching a temporary one for the script.
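For the proxy use case from the question, set_proxy_env.sh might look like this (the proxy address is a placeholder):
# set_proxy_env.sh -- source this file, don't execute it
export http_proxy="http://proxy.example.com:8080"
export https_proxy="$http_proxy"
export no_proxy="localhost,127.0.0.1"
Then, in the current shell:
source set_proxy_env.sh
curl -I https://example.org   # now goes through the proxy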
Did you try this:
. ~/bin/set_proxy_env.sh
Running the script on its own starts a separate child shell and sets the variables there, so the bindings are lost once that shell exits back into yours. The dot at the front tells bash to run the script within the same shell.
Also, don't forget to export the variables you need like so: export MYVAR=value

Can a bash function be used in different scripts?

I've got a function that I want to reference and use across different scripts. Is there any way to do this? I don't want to be re-writing the same function for different scripts. Thanks.
Sure - in your script, where you want to use the function, you can write a command like
source function.sh
which is equivalent to including the contents of function.sh in the file at the point where the command is run. Note that function.sh needs to be in one of the directories in $PATH; if it's not, you need to specify the path (relative or absolute) to the file.
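A small sketch of the pattern (file names, paths, and the function are placeholders):
# function.sh -- only function definitions, no top-level side effects
greet() {
    echo "Hello, $1"
}

# main.sh -- pull the definition in, then call it
. /path/to/function.sh
greet "world"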
Yes, you can localize all your functions in a common file (or files). This is exactly what I do with all my utility functions. I have a single utility.shinc in my home directory that's used by all my programs with:
. $HOME/utility.shinc
which executes the script in the context of the current shell. This is important - if you simply run the include script, it will run in a subshell and any changes will not propagate to the current shell.
You can do the same thing for groups of scripts. If it's part of a "product", I'd tend to put all the scripts, and any included scripts, in a single shell directory to ensure everything is localized.
Yes, you can!
Add source /path/to/function_file.sh in your script.
I prefer to capture a function's output in a variable, e.g. VAR=$(function_name). If you put the source line right after #!/bin/bash, anything the sourced file runs at top level executes before the rest of your script, so capturing the function's result in a variable and using that variable anywhere in the script works better for me.
Thank you, hope it works for you.
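As a small illustration of that idea (the file path and function name are placeholders):
. /path/to/function_file.sh       # makes get_build_number available
VERSION=$(get_build_number)       # run the function and keep its output
echo "building $VERSION"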
