I want to set an HTTP proxy within the bash environment (export http_proxy=xyz), so I added the command to the end of .bash_profile and called
exec /bin/sh -c "source /path/to/.bash_profile"
But it does not work as expected: $::env(http_proxy) does not exist (and there is no typo).
I also tried to run the script like this: exec /bin/sh -c [exec cat /path/to/.bash_profile], but with the same result.
Saying
exec /bin/sh -c "source /path/to/.bash_profile"
would source /path/to/.bash_profile in a subshell, so any changes made to the environment are discarded as soon as that command finishes executing.
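A minimal demonstration of the effect in a plain shell (nothing specific to .bash_profile):
$ /bin/sh -c 'export http_proxy=xyz; echo "child: $http_proxy"'
child: xyz
$ echo "parent: $http_proxy"
parent: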
In order to pass an environment variable to a program, try:
exec /usr/bin/env http_proxy=xyz program
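From a POSIX shell you can get the same effect by prefixing the assignment to the command, which scopes the variable to that single invocation (curl is used here only as an example program):
http_proxy=xyz curl -I http://example.com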
Related
I am creating new tab to run my command, and kill it when not needed.
$ roxterm --tab -e top
$ pkill -f top
(gnome-terminal is preferred, but it doesn't support opening a new tab that executes a given command; splitting the open and the run into separate steps breaks the pkill approach).
My command requires a setup.sh script setting environment variables to be called prior to execution. However,
$ roxterm --tab -e "source setup.sh; mycommand"
fails with the error line 1: source not found, as the bash environment is not yet initialized in the freshly created roxterm environment. How can I circumvent this?
One option I can think of is to create the script below, myscript.sh:
source setup.sh
mycommand
but I'd like to avoid creating a script for each of my commands, and still be able to kill it with pkill -f mycommand.
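One untested sketch of a way around this: have roxterm start a shell that does the sourcing, and exec the real command so that pkill -f mycommand still matches a single process (setup.sh and mycommand are the placeholders from above):
roxterm --tab -e bash -c 'source setup.sh; exec mycommand'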
I am switching from bash to fish, but am having trouble porting over a convenience function I use often. The point of this function is to run make from the root directory of my source tree regardless of which directory my shell is currently in.
In bash, this was simply:
function omake {(
cd $SOURCE_ROOT;
make "$@";
)}
Since fish doesn't have subshells, the best I've been able to do is:
function omake
pushd
cd $SOURCE_ROOT
make $argv
popd
end
This works, but with a caveat: after interrupting the fish version with ^C, the shell is left in $SOURCE_ROOT, whereas interrupting the bash version puts me back in the original directory.
Is there a way to write a script that works identically to the bash one in fish?
This is as close as I can get to a subshell:
function omake
echo "cd $SOURCE_ROOT; and make \$argv" | fish /dev/stdin $argv
end
Command substitution does not seem to be interruptible: Ctrl-C does not stop this sleep command
echo (cd /tmp; and sleep 15)
However, fish has a very nice way to find the pid of a backgrounded process:
function omake
pushd dir1
make $argv &
popd
end
Then, to stop the make, instead of Ctrl-C, do kill %make
If you're using GNU coreutils, you can use env to do this (as user2394284 suggested would be nice):
env -C foo pwd
This will run pwd in a subdirectory called foo. It also interacts nicely with fish; for example, it can be backgrounded.
The docs say:
Change the working directory to dir before invoking command. This differs from the shell built-in cd in that it starts command as a subprocess rather than altering the shell’s own working directory; this allows it to be chained with other commands that run commands in a different context.
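Applied to the question, omake then stays a one-liner (assuming a coreutils release recent enough to ship env -C; I believe the option appeared in 8.28):
function omake
env -C $SOURCE_ROOT make $argv
end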
For make specifically, you can use its -C option:
function omake
make -C $SOURCE_ROOT $argv
end
Many other programs take a -C option. Off the top of my head: ninja and git.
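For example, both of these run against the source tree without changing the shell's own directory ($SOURCE_ROOT/build is just a hypothetical build directory):
git -C $SOURCE_ROOT status
ninja -C $SOURCE_ROOT/build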
Otherwise, if you're content with a subshell anyway, a standalone script gives you just that, with the benefit that you can write it in any language, from Python to C, and won't have to rewrite it when you want to change your shell.
Here it is written in fish:
#!/usr/bin/env fish
cd $SOURCE_ROOT
and exec make $argv
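To use it, mark the script executable and put it somewhere on your PATH (and make sure SOURCE_ROOT is exported so the child fish process can see it); the path and the make argument below are only an example:
chmod +x ~/bin/omake
omake -j4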
However, what really bugs me is the blatant omission of the -C option in env! Env can set environment variables and run programs; clearly, it would be nice to be able to set the working directory too! Here is a rudimentary cdrun script to make up for that:
#!/usr/bin/env fish
cd $argv[1]
and exec env $argv[2..-1]
or
#!/bin/sh
set cd="$1"
shift
cd "$cd" &&
exec env "$#"
Armed with the cdrun command, your omake function would be as simple as:
function omake
cdrun $SOURCE_ROOT make $argv
end
or
omake() {
cdrun "$SOURCE_ROOT" make "$#"
}
My default shell is bash. I have set some environment variables in my .bashrc file.
I installed a program which uses a .cshrc file. It contains the paths to several C shell scripts.
When I run the following commands in a shell window, it works perfectly:
exec csh
source .cshrc
exec bash
I have tried to put these commands in a bash script; unfortunately, it didn't work.
Is there another way to write a script that gets the same result as running the commands from a shell window?
I hope my question is now clear
Many thanks for any help
WARNING: don't put the following script in your .bashrc; it will reload bash and so reload .bashrc again and again (stoppable with C-c anyway).
Preferably, use this script in your kit/CDS startup script (Cadence, presumably).
WARNING 2: if anything in your file2source fails, the whole 'trick' stops.
Call this script cshWrapper.csh:
#! /bin/csh
# to launch using
# exec cshWrapper.csh file2source.sh
source $1
exec $SHELL -i
and launch it using
exec ./cshWrapper.csh file2source.sh
It will launch csh, source your file, and come back to the same parent bash shell.
Example:
$> ps
PID TTY TIME CMD
7065 pts/0 00:00:02 bash
$>exec ./cshWrapper.csh toggle.csh
file sourced
1
$> echo $$
7065
where in my case I use the file toggle.csh:
#! /bin/csh
# source ./toggle.csh
if (! $?TOGGLE) then
setenv TOGGLE 0
endif
if ($?TOGGLE) then
echo 'file sourced'
if ($TOGGLE == 0) then
setenv TOGGLE 1
else
setenv TOGGLE 0
endif
endif
echo $TOGGLE
Hope it helps
New proposal, since I faced another problem with exec.
exec kills whatever remains in the script, unless you force a fork by using a pipe after it (exec script | cat). But in that case, any environment variables set in the script are not propagated back to the calling script, which is not what we want. The only solution I found is to use three files (let's call them, for this example, main.bash, which calls first.cshrc and second.sh).
#! /bin/bash
#_main.bash_
exec /bin/csh -c "source /path_to_file/cshrc; exec /bin/bash -i -c /path_to_file/second.sh"
# after exec nothing remains (like Attila the Hun)
# the rest of the script is in 'second.sh'
In that manner, I can, with a single script call, load an old cshrc design kit, still run some bash commands afterwards, and finally launch the main program in bash (say, virtuoso).
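For completeness, a minimal sketch of what second.sh might contain; the echo and the final program call are only placeholders based on the description above:
#! /bin/bash
#_second.sh_
# runs in the bash started by main.bash, so it sees everything
# the sourced cshrc put into the environment
echo "PATH as set up by the cshrc: $PATH"
virtuoso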
I would like to spawn a bash subshell from a bash script that allows me to initialize environment variables and do some tasks, so that when the script ends the user is in the subshell with the initialized environment variables.
By testing I found out that if I have the instruction /bin/bash in the script, the subshell is spawned and the user is in the subshell when the script ends. If I execute exit, the subshell terminates and the user is back in the parent shell.
I would now like to be able to initialize environment variables and do some tasks in the subshell based on arguments given to the script.
How could I achieve that ?
exec is what you are looking for
$ cat reshell.sh
#!/bin/bash
export MY_ENVIRONMENT=foo
cd /home/bar
exec $SHELL
where the exec replaces the currently running script with a new shell that has inherited the environment of the script. This will still be a subordinate shell to the one that you ran it from, but it will be only one layer deep; for example:
$ pwd
/home/chmike
$ ./reshell.sh
$ pwd
/home/bar
$ exit
$ pwd
/home/chmike
Using exec $SHELL instead of exec /bin/bash allows the script to invoke the user's preferred shell in case it isn't bash.
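If the variables should come from arguments passed to the script, as asked above, the same pattern applies; this is only a sketch, and the variable names are made up:
#!/bin/bash
# usage: ./reshell.sh <project> <workdir>   (hypothetical arguments)
export MY_PROJECT="$1"
export MY_WORKDIR="$2"
cd "$MY_WORKDIR" || exit 1
exec "${SHELL:-/bin/bash}"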
Export all the environment variables in the script and then spawn a subshell.
The subshell will have all the environment variables set.
export EXAMPLE="<value>"
...
#spawn a subshell
/bin/bash
Is it possible to source a .bashrc file from .cshrc in a non-interactive session?
I'm asking because tcsh is our default shell at work and the .cshrc has to be used to set up the environment initially.
However, I am not really familiar with the tcsh and I have my own set-up in bash, so right now I have the following lines at the end of my .cshrc file:
if ($?prompt && -x /bin/bash) then
exec /bin/bash
endif
This works fine, loading my environment from .bashrc and giving me a bash prompt for interactive sessions but now I also need the same set-up for non-interactive sessions, e.g. to run a command remotely via SSH with all the correct PATHs etc.
I can't use 'exec' in that case but I can't figure out how to switch to bash and load the bash config files "non-interactively".
All our machines share the same home directory, so any changes to my local *rc files will affect the remote machines as well.
Any ideas welcome - thank you for your help!
After some more research I'm now quite sure that this won't work, but of course feel free to prove me wrong!
To load the environment in bash I have to switch to a bash shell. Even if that is possible "in the background", i.e. without getting a prompt, it would still break any tcsh commands, which would then be executed under bash.
Hmmmm, back to the drawing board...
If $command is set, csh was given a command to run, so this is a remote shell command. This works for me in .cshrc:
if ($?command) then
echo Executing non-interactive command in bash: $command $*
exec /bin/bash -c "${command} $*"
endif
echo Interactive bash shell
exec bash -l
Test:
$ ssh remotehost set | grep BASH
BASH=/bin/bash
...
proves that it ran in Bash.