I'm trying to add some export statements in my Unix shell script and up to this point I've only gotten it to work with the bash shell. Is there a way to make the below export apply in all shells using shell scripting?
AXIS2_HOME=/home/user/axis2-1.6.0
export AXIS2_HOME
What do you mean "all shells?"
If you mean different shells as in "can I change my parent/sibling shell's environment"?
Then no, you can't. Exporting a var should mean all your children inherit it though.
You can go some way towards faking it by having your script create a temp file that you somehow get the caller to execute, but it starts to get a bit weird and suggests a problem in your architecture.
If you mean different shells as in sh/bash/csh/tcsh/zsh/ksh etc.
You can make something like that work in all "sh"-flavour shells, but for "csh" flavours you need to use setenv instead of export, as shown below.
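For reference, here is the same export in both families, using the AXIS2_HOME example from the question:

# sh/bash/ksh/zsh flavour
AXIS2_HOME=/home/user/axis2-1.6.0
export AXIS2_HOME

# csh/tcsh flavour
setenv AXIS2_HOME /home/user/axis2-1.6.0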
Depending how far you want to go, you could store all your environment variables in a separate file (e.g. env.dat) and convert that to sh/csh syntax using sed/awk/perl; a sketch follows.
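A rough sketch, assuming env.dat holds one NAME=value pair per line with no quoting needed:

# generate sh-flavour syntax from env.dat
sed 's/^\([^=]*\)=\(.*\)$/\1=\2; export \1/' env.dat > env.sh

# generate csh-flavour syntax from env.dat
sed 's/^\([^=]*\)=\(.*\)$/setenv \1 \2/' env.dat > env.csh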
Summary
How can I guarantee that my shell scripts will do what I expect, regardless of the environment?
(Let's assume that people have aliased and function-wrapped everything they can, but that they haven't touched any system binaries, e.g. /bin/ls.)
Explanation
I am distributing shell scripts as part of an app. These shell scripts are executed in the user's environment - this cannot be changed.
This means users may have aliases for anything and functions redefining "standard" behavior. There have already been a few cases where normal shell keywords were redefined (e.g. local), causing unexpected side effects and crashes.
The only tokens that cannot be defined as functions are as follows:
Bash:
! [[ ]] case coproc do done elif else esac fi for function if in select then time until while { }
ZSH:
! [[ case coproc do done elif else end esac fi for foreach function if nocorrect repeat select then time until while { }
I am aware that:
You can escape a word to skip alias lookup
You can use builtin to always run a builtin
You can use command to always run a command
However, builtin and command can be redefined, so \builtin <command> may not always do what I expect.
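To illustrate the three mechanisms (with the caveat above that only the backslash escape cannot itself be redefined):

\ls                # backslash skips alias expansion for ls (but not a function named ls)
builtin cd /tmp    # runs the cd builtin even if cd has been wrapped in a function
command ls         # skips aliases and functions, runs ls as a builtin or from PATH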
Aliases are not expanded in bash scripts (unless you explicitly request it), and functions are usually not inherited by child processes; the caller of your script just has to avoid sourcing it. The remaining problem areas are environment variables and file handles.
It is difficult to make a script completely self-contained. For instance, I have seen cases where even standard programs (ls, cat, ...) are stored in different locations, which means that if you set up your own PATH and don't know anything about the target platform, you have to apply some heuristics (searching a list of commonly known directories) and hope that your search is correct.
A more reliable way is to require the user of the script to provide a certain minimal configuration (typically containing the basic definition of a PATH) and pass this configuration as a parameter to your script.
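A minimal sketch of that approach (the script name and the expected contents of the config file are only illustrative):

#!/bin/sh
# myscript.sh: expects a configuration file as its first argument
conf="$1"
[ -r "$conf" ] || { echo "usage: myscript.sh CONFIG" >&2; exit 1; }
. "$conf"    # the config file is expected to set and export a sane PATH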
There is one problem, pointed out in the comment by Renaud Pacalet: bash allows functions to be exported (using export -f), so in bash you would have to find out which functions exist and explicitly remove their definitions (similarly to what you would do with environment variables). However, I see that you have tagged your question with both bash and zsh, and if you don't mind which scripting language you use, writing the script in zsh would perhaps be better, because zsh does not have exported functions.
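In bash, removing every known function (including ones inherited via export -f) before doing real work could look like this sketch; declare -F prints one "declare -f NAME" line per function:

while read -r _ _ fname; do
    unset -f "$fname"    # drop the function definition
done < <(declare -F)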
One point to keep in mind is that every shell, bash and zsh alike, processes certain files on startup, before the commands in your script have any chance to run. For instance, no matter how you start your zsh, it will always process /etc/zshenv; and if your script at some point invokes a zsh child script, that child will run /etc/zshenv again.
Of course, those startup files could set up functions, and in zsh, aliases are (AFAIK) even expanded inside scripts. The strategy would therefore be to initially loop over your environment variables, the currently defined functions, and (in zsh) the currently defined aliases, and remove those definitions. Then you set up your own definitions (functions, variables).
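In zsh, a cleanup along those lines might start like this sketch (unalias and unfunction both accept -m with a pattern in zsh):

emulate -R zsh                  # reset shell options to zsh defaults
unalias -m '*'                  # remove all aliases
unfunction -m '*' 2>/dev/null   # remove all inherited function definitions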
I want to add some helper commands to my shell. There are several commands I want to add, and they need to share some information between them. However, since I want a different state for each shell, I can't use files to store the shared information, but have to use environment variables.
This opens up the problem of setting environment variables: to change a variable in my shell and not only in a subprocess, I either need to put my commands in scripts and always source the scripts, or define them as functions and source the file via .bashrc.
I have also defined some auxiliary functions that are used by several of my commands, which I would prefer NOT to have in the scope of my main shell process.
I'm somewhat inexperienced with bash, so my question is:
What is the cleanest way to implement this? Should I put my commands into scripts or into functions? Can I prevent my auxiliary functions from being sourced into the main shell? Is there an easier way to manipulate environment variables?
You can store your environment variables in a shell file (myEnv.sh). Then you can use
source myEnv.sh
to load your environment variables as needed.
You can also source this file from your main shell scripts.
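For example, a minimal myEnv.sh (AXIS2_HOME is taken from the first question above; the second value is just an illustration):

# myEnv.sh
export AXIS2_HOME=/home/user/axis2-1.6.0
export APP_MODE=development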
I would recommend using .profile files instead of .sh files:
case1.profile, case2.profile, and so on, and sourcing them whenever needed.
Use either of the methods below to source a file.
source ~/.case1.profile
or
. ~/.case1.profile
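A hypothetical ~/.case1.profile could contain, say:

# ~/.case1.profile
export APP_ENV=case1
export PATH="$HOME/case1/bin:$PATH"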
On Linux systems, it is good practice to start shell scripts with a comment line giving the path to the required shell to execute.
example:
#!/bin/bash
#or
#!/usr/bin/env bash
This makes the expected shell syntax explicit (and as a final touch it tells people the script has been reviewed).
But I'm currently writing scripts on iSeries (AS400) where I use qsh.
And I don't know if there is something similar to write at the top of my scripts.
Do you know the path to the interpreter? What do you write in your scripts?
I use:
#!/bin/qsh
In PASE (call qp2term) you can use
#!/bin/sh
Let's say a script is called with /bin/sh. Is it possible to source another script from that script and to have it be interpreted with #!/bin/bash?
It would appear that the #!/bin/bash doesn't do anything...
And by source, I mean specifically the ability to manipulate the parent environment.
No. The whole point of sourcing a script is that the script is interpreted by the shell doing the sourcing. If, as is often the case, /bin/sh is bash, then you will get the desired behavior. Otherwise, you are out of luck.
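You can see this for yourself: the shebang line of a sourced file is treated as an ordinary comment. Assuming dash is installed as a strictly POSIX stand-in for /bin/sh, given a file child.sh containing

#!/bin/bash
echo "BASH_VERSION is: ${BASH_VERSION:-unset}"

you get

dash -c '. ./child.sh'   # prints "unset": the file is still interpreted by dash
bash -c '. ./child.sh'   # prints a version number: interpreted by bash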
Try the source command or the dot operator. You might also look at the env command. Note: make sure you export your variables if you're using source (or dot).
I am new to bash and trying to solve some issues in a script I'm writing.
I am at the terminal under my username and start bash:
USER$
USER$ bash
bash$
now in bash I set some variables, e.g.:
i=2
k=2
let p=$k*$i
now I want to use those variables outside the bash session
bash$ exit
USER$
but now the variables are not there
I tried using export, but it did not really work; I could use your help, thanks.
Not possible. You cannot set environment variables in a parent process like this.
Unlike a DOS batch file, a Unix shell script cannot directly affect the environment of its calling shell.
You could consider using the . (dot) or source command to read and execute the script in the context of the calling shell. This means that changes made in the script do affect the environment (in general; you can still run into issues with sub-shells).
The other alternative is to have the script that sets the variables write the values in name=value format into a file which the calling script then reads (with . or source again).
The conventional solution is to add the settings to your .profile or .bashrc; which one you should use depends on your specific needs and your local Bash configuration. My first recommendation would be .profile, but then you have to avoid any bashisms because this file is shared with sh (so no let, for example).
For more specific needs, put the commands in a file, and source it when you need it. You might also want to create a simple script to update the file with your current values.
# source this file to update $HOME/stuff with the current values
cat <<HERE >"$HOME/stuff"
i='$i'
k='$k'
p='$p'
export i k p
HERE
The syntax here is quite simple, but it assumes you don't have values which can contain single quotes or other free-form content. How to safely store arbitrary values which you don't have complete control over is a much more complex discussion; I am providing a simple solution for the basic use case where you merely need to save a few simple scalar values, like numbers.
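If you do later need to handle arbitrary values, bash itself can produce safely quoted assignments; a sketch, assuming i, k, and p are already set:

# source this to update $HOME/stuff with properly quoted values
{ declare -p i k p; echo 'export i k p'; } > "$HOME/stuff"

Note that declare inside a function creates local variables, so a file written this way is best sourced at the top level of the shell.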
To keep your variables when you connect to a remote system, look at the documentation for the tool you are using to connect. For example, ssh has configuration options for importing environment variables from the local system when starting a remote session.
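For example, with OpenSSH you can ask the client to forward selected variables; the server must list them in AcceptEnv in its sshd_config, and the host name here is made up:

# ~/.ssh/config (client side)
Host myserver
    SendEnv AXIS2_HOME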