Env variables not being picked up by script - bash

I'm creating a script to pass to a few different people and ran into an env problem: the script wouldn't run unless I set $PATH, $HOME, and $GOPATH at the beginning of the file, like so:
HOME=/home/Hustlin
PATH=/home/Hustlin/bin:/home/Hustlin/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/usr/local/go/bin:/bin:/home/Hustlin/go/bin
export GOPATH=$HOME/go
export PATH=$PATH:$GOROOT/bin:$GOPATH/bin
This is not ideal when passing the script around, since each person has to set these variables themselves. The file would rarely be run by the user directly and would most often be run via crontab.
I would love to hear a better way of coding this so that the people I send the script to don't have to update these variables themselves.
Thank you all in advance!!!
EDIT
The script is being run via crontab with no special permissions.
1,16,31,46 * * * * /home/Hustlin/directory1/super_cool_script.sh
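A minimal sketch, assuming a Vixie-style cron that honours VAR=value assignments at the top of the crontab (the paths are only placeholders):
PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/local/go/bin:/home/Hustlin/go/bin
HOME=/home/Hustlin
1,16,31,46 * * * * /home/Hustlin/directory1/super_cool_script.sh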
Here is the script I am running:
#!/bin/bash
# TODO Manually put your $PATH and $HOME here.
PATH=/home/Hustlin/bin:/home/Hustlin/.local/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/snap/bin:/usr/local/go/bin:/bin:/home/Hustlin/go/bin
HOME=/home/Hustlin
export GOPATH=$HOME/go
export PATH=$PATH:$GOROOT/bin:$GOPATH/bin
# Field1
field1="foo"
# Welcome message.
echo Starting the update process...
# Deposit directory.
mkdir -p $HOME/directory1/sub1/data/body
mkdir -p $HOME/directory1/sub2/system
# Run command
program1 command1
# Run longer command.
program1 command2 $field1
sleep 3
program1 command3 -o $HOME/directory1/sub1/data $field1
sleep 1
# Unzip and discard unnecessary files.
unzip $HOME/directory1/sub1/data/$field1 -d $HOME/directory1/sub1/data
rm $HOME/directory1/sub1/data/bar.yaml $HOME/directory1/sub1/data/char.txt
rm $HOME/directory1/sub1/data/$field1.zip
# Rename
mv $HOME/directory1/sub1/data/body.json $HOME/directory1/sub1/data/body/$(date -d '1 hour ago' +%d-%m-%Y_%H).json
echo Process complete.
I changed most of the program and command names for privacy. What I did post still represents what is being done and how the files are being moved.
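A minimal sketch of a script header that derives these values instead of hardcoding them (this assumes getent is available and that Go lives under /usr/local/go on each machine):
#!/bin/bash
# Sketch: derive the environment rather than hardcoding per-user values.
# cron normally sets HOME; fall back to the passwd entry if it is missing.
: "${HOME:=$(getent passwd "$(id -un)" | cut -d: -f6)}"
export GOPATH="${GOPATH:-$HOME/go}"
export PATH="$PATH:/usr/local/go/bin:$GOPATH/bin:$HOME/.local/bin:$HOME/bin"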

The issue is crontab, not the script.
When you run the script from your terminal, you are logged in to a session with all your environment variables set, so the script can use them.
But when you run it from crontab it runs in an "empty" session, so it does not have any of your environment variables set; it doesn't even know about your user.
Run the script from crontab like this:
su --login Hustlin /home/Hustlin/directory1/super_cool_script.sh
Check this documentation.
http://man7.org/linux/man-pages/man1/su.1.html
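Applied to the crontab entry from the question, that would look roughly like this (a sketch; note it generally has to go in root's crontab, since su would otherwise prompt for a password):
1,16,31,46 * * * * su --login Hustlin /home/Hustlin/directory1/super_cool_script.sh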

bash -l -c /path/to/script makes bash read the login startup files first (/etc/profile, then the first of ~/.bash_profile, ~/.bash_login or ~/.profile that exists), so PATH and anything else exported there will be set.
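A sketch of the crontab entry from the question using this approach:
1,16,31,46 * * * * bash -l -c '/home/Hustlin/directory1/super_cool_script.sh'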

Related

Reading "bash_profile" is doing two things that are contradicting each other

I'm very confused about how my shell is reading bash_profile.
For root, my ~/.bash_profile looks like this:
# .bash_profile
# Get the aliases and functions
if [ -f ~/.bashrc ]; then
	. ~/.bashrc
fi
PATH=$PATH:$HOME/bin:$HOME/sbin:$HOME/usr/sbin:$HOME/usr/bin:/usr/sbin
LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/usr/local/lib
export PATH=$PATH
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH
unset USERNAME
There is no ~/.profile file.
For a user called maruhan, my ~/.bash_profile looks like this:
# .bash_profile
# Get the aliases and functions
if [ -f ~/.bashrc ]; then
	. ~/.bashrc
fi
PATH=$PATH:$HOME/bin:$HOME/sbin:$HOME/usr/sbin:$HOME/usr/bin:/usr/sbin
LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/home/maruhan/Desktop/issac:/usr/local/lib
ASDF=$ASDF:/home
export PATH=$PATH
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH
export ASDF=$ASDF
unset USERNAME
And my ~/.profile looks like this:
LD_LIBRARY_PATH=/home/maruhan/Desktop/issac:/usr/local/lib:$LD_LIBRARY_PATH
ASDF=/home:$ASDF
export ASDF=$ASDF
export LD_LIBRARY_PATH=$LD_LIBRARY_PATH
You can clearly see that ASDF is not defined in root's bash_profile.
However, when I run export as root, I get this:
declare -x ASDF=":/home"
but nothing about LD_LIBRARY_PATH.
Strangely, running export as maruhan shows both ASDF and LD_LIBRARY_PATH.
Also, nothing about ASDF or LD_LIBRARY_PATH exists in /etc/environment. I also don't have a /etc/bash_profile file.
Running echo $0 gives me bash for both root and maruhan.
How come LD_LIBRARY_PATH disappeared in root while ASDF is there?
The rules are a bit complicated. According to bash's man page:
INVOCATION
A login shell is one whose first character of argument zero is a -, or one
started with the --login option.
An interactive shell is one started without non-option arguments (unless -s is
specified) and without the -c option whose standard input and error are both
connected to terminals (as determined by isatty(3)), or one started with the -i
option. PS1 is set and $- includes i if bash is interactive, allowing a shell
script or a startup file to test this state.
... ...
When bash is invoked as an interactive login shell, or as a non-interactive
shell with the --login option, it first reads and executes commands from the
file /etc/profile, if that file exists. After reading that file, it looks for
~/.bash_profile, ~/.bash_login, and ~/.profile, in that order, and reads and
executes commands from the first one that exists and is readable. The --noprofile
option may be used when the shell is started to inhibit this behavior.
... ...
When an interactive shell that is not a login shell is started, bash reads and
executes commands from ~/.bashrc, if that file exists. This may be inhibited by
using the --norc option. The --rcfile file option will force bash to read and
execute commands from file instead of ~/.bashrc.
... ...
Note that on some systems bash may be customized so that it would also execute a system wide rc file (e.g. /etc/bash.bashrc) before sourcing ~/.bashrc for an interactive shell that's not a login shell.
Shells started by a login mechanism (usually with a username/password prompt, like console login, telnet, ssh, ...) are usually login shells. For a login shell, $0 is usually -bash.
[local] % ssh user@host <-- The user is trying to login
Password: P@ssw0rd
[remote] % echo $0
-bash <-- This is a login shell
[remote] % bash <-- This is not a login (no username/password)
[remote] % echo $0
bash <-- Not a login shell
[remote] %
To make life easier I would put all rc things in ~/.bashrc and source ~/.bashrc in ~/.bash_profile. For example:
% cat ~/.bash_profile
[[ -f ~/.bashrc ]] && source ~/.bashrc
% cat ~/.bashrc
# return immediately if not in an interactive shell
[[ $- != *i* ]] && return 0
export FOO=bar
PATH=$PATH:/my/path
%
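If it is ever unclear which kind of shell you are in, a quick check (login_shell is a read-only shopt option that bash sets itself):
% shopt -q login_shell && echo "login shell" || echo "not a login shell"
% [[ $- == *i* ]] && echo "interactive" || echo "not interactive"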

Sending Bash Aliases to detached screen sessions

I'm on a Linux machine using screen, and I'm attempting to write a (fairly portable) function which runs a bash function in a new, detached screen session which automatically closes upon completion. I've had some success, but I noticed the following behavior:
If I include the definition of mail_submit() in my ~/.bashrc file, I can run
mail_submit foo
in the terminal, and also I can access the alias in a new screen session:
screen -S test
mail_submit foo
However, the following command does not work:
screen -d -m -S test sh -c 'mail_submit foo'
presumably because sh -c starts a fresh shell that has no knowledge of my ~/.bashrc profile. So, I can use the following fix:
screen -d -m -S test sh -c 'source ~/.bashrc; mail_submit foo'
which does work.
But if I want to wrap this functionality up into a bash alias (which is my ultimate goal here), this will cause a weird self-referential situation.
Question: What is an easy way to either have sh -c know the location of my ~/.bashrc profile, or use a variant of sourcing the file and creating an alias?
EDIT: I could save the shell script in my home directory, and create an alias which runs
screen -d -m -S test bash -c '~/mail_submit.sh $1'
but I'd still be curious to hear other possible fixes.
A default ~/.bashrc often contains this little piece of code ([[ "$-" != *i* ]] && return) at the top (or somewhere else in the upper part). This line stops the rest of ~/.bashrc from being executed when the bash shell is not running in interactive mode.
You could:
Remove this line
Create a new file which will only contain the alias you need and source that (see the sketch after this list)
Create a little bash script instead of an alias and run that
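A minimal sketch of the second option (the file name ~/.bash_functions is a placeholder; the body is whatever your existing definition is):
# ~/.bash_functions -- only the function definition lives here, so sourcing it
# never hits the interactive-only guard at the top of ~/.bashrc
mail_submit() {
    # ... your existing mail_submit body ...
    :
}
Then source ~/.bash_functions from ~/.bashrc for interactive use, and from the screen command:
screen -d -m -S test bash -c 'source ~/.bash_functions; mail_submit foo'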
Do you mean screen -d -m -S test bash -c 'mail_submit foo'?
It looks like you're trying to run the command with the shell (sh), and not the Bourne Again Shell (bash), which is the interpreter that actually reads the ~/.bashrc profile.
Edit: The .bashrc file is not being sourced by default because screen does not start bash as a login shell, so the login startup files are never read. Creating a .screenrc file with the line defshell -bash will make screen start bash as a login shell instead, which reads ~/.bash_profile and, in the usual setup, sources the .bashrc file from there.
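A sketch of the corresponding ~/.screenrc entry (the leading dash is what marks the shell as a login shell):
# ~/.screenrc
defshell -bash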

How to properly access network location while executing bash script in cygwin's cron

I've created a bash script to back up a folder to a remote location via Cygwin's cron; however, I'm experiencing an issue. At the end, the script executes a command like this one:
/usr/bin/tar -zcvf //192.168.1.108/Backup/Folder/Folder.Backup.2015-12-03.1219.tar.gz /cygdrive/d/Folder
Although the command it produces works correctly when executed from a Cygwin bash shell, when I run it via a cron job it fails because the remote location path is not resolved correctly. If I change the path to a local /cygdrive location or to ~/ it works correctly even via cron, so I'm thinking that the network shares are not being seen correctly by Cygwin in its cron environment.
Any ideas how I could solve this issue?
Here's my bash script
#!/usr/bin/bash
#the path needs to be set to execute gzip command or tar command breaks
export PATH=$PATH:/usr/bin:/bin:/usr/local/bin:/usr/local/sbin:/sbin
if [ $# -ne 3 ]
then
echo "USAGE: backup-clients <path> <name_prefix> <source>";
exit 1;
fi
DATE=`date "+%Y-%m-%d.%H%M"`;
FILEPATH="$1/$2.Backup.$DATE.tar.gz";
COMMAND="/usr/bin/tar -zcvf $FILEPATH $3";
echo "COMMAND="$COMMAND;
eval $COMMAND;
Which I run with the command
/usr/bin/bash /cygdrive/d/mybackupscript.bash "//192.168.1.108/Backup/Folder" "Folder" "/cygdrive/d/Folder"
I really appreciate any help you can provide.
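As a side note on the script itself, here is a hedged sketch of the same logic that builds the command as a bash array instead of a string passed to eval (behaviour otherwise unchanged; paths and arguments as in the question):
#!/usr/bin/bash
# the PATH still needs to be set so tar can find gzip
export PATH="$PATH:/usr/bin:/bin:/usr/local/bin:/usr/local/sbin:/sbin"
if [ $# -ne 3 ]; then
    echo "USAGE: backup-clients <path> <name_prefix> <source>"
    exit 1
fi
DATE=$(date "+%Y-%m-%d.%H%M")
FILEPATH="$1/$2.Backup.$DATE.tar.gz"
# keeping the command in an array avoids eval and word-splitting surprises
CMD=(/usr/bin/tar -zcvf "$FILEPATH" "$3")
echo "COMMAND=${CMD[*]}"
"${CMD[@]}"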

Script to change the directory path

I was trying the program below.
It is a simple script to cd into a folder:
#! /bin/bash
cd /root/
But this command doesn't get me into the folder.
EDITED
#!/bin/bash
alias ex="cd /fs/fm"
alias ex1="source setenv"
alias ex2="cd /fs/fm/tests"
alias ex3="runtest"
To get into /root/ you should make sure that you have permissions. It's accessible if you're running as root itself but if you're running as a normal user you should consider becoming root first. One way is to use sudo:
sudo bash script.sh
And again, make sure your script is in UNIX format. Certainly you can't change to /root/\r.
sed -i 's|\r||' script.sh
dos2unix script.sh
This will never work. The script you're running is a separate process; when it finishes you get back to the original environment (cwd, environment variables, etc.).
Create an alias:
alias r="cd /root"
or execute the script within your shell:
. myscript
Note: . is a synonym for source.
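Tying this back to the aliases in the question's edit, a sketch of a single function (defined in ~/.bashrc; the name fm is a placeholder) that runs all four steps in the current shell, so the directory change and the sourced setenv persist:
fm() {
    cd /fs/fm || return
    source setenv
    cd /fs/fm/tests || return
    runtest
}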

Stay in directory changed after ending of bash script

My bash script:
#!/bin/bash
cd /tmp
Before running my script:
pwd: /
After running my script:
pwd: /
After running my script by sourcing it:
pwd: /tmp
How can I stay in the directory set by the script without sourcing it?
You can't. Changes to the current directory only affect the current process.
Let me elaborate a little bit on this:
When you run the script, bash creates a new process for it, and changes to the current directory only affect that process.
When you source the script, the script is executed directly by the shell you are running, without creating extra processes, so changes to the current directory are visible to your main shell process.
So, as Ignacio pointed out, this cannot be done.
Ignacio is correct. However, as a heinous hack (totally ill-advised, and this really should get me at least 10 downvotes) you can exec a new shell when you're done:
#!/bin/bash
...
cd /
exec bash
Here's a silly idea. Use PROMPT_COMMAND. For example:
$ export PROMPT_COMMAND='test -f $CDFILE && cd $(cat $CDFILE) && rm $CDFILE'
$ export CDFILE=/tmp/cd.$$
Then, make the last line of your script be 'pwd > $CDFILE'
If you really need this behavior, you can make your script print the target directory and then use it. Something like:
#!/bin/bash
cd /tmp
echo $(pwd)
and then you can
cd $(./script.sh)
Ugly, but does the trick in this simple case.
You can define a function to run in the current shell to support this. E.g.
md() { mkdir -p "$1" && cd "$1"; }
I have the above in my ~/.bashrc
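A quick usage example of the same idea (the directory is just an illustration):
$ md /tmp/some/new/dir    # creates the path and leaves the shell inside it
$ pwd
/tmp/some/new/dir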
