How to get values back from an external function that requires interactivity? - bash

One of the routines I frequently use is a check for valid arguments passed when invoking scripts. Ideally, I'd like to make these and other similar routines external functions that I can call from any script to handle these more trivial processes. But I'm having trouble retrieving the values I need from such a function without making the process more complicated.
I have tried using command substitution (e.g., echoing the output of the external function into a variable local to the calling script), which works well enough with simpler functions. However, this file-checking function requires the read command in a loop, and thus user interactivity, which causes the script to hang when command substitution tries to capture the function's output:
#!/bin/bash
# This is a simple function I want to call from other scripts.
exist(){
    # If the first parameter passed is not a directory, then the input is
    #+ invalid.
    if [ ! -d "$1" ]; then
        # Rename $1, so we can manipulate its value.
        userDir="$1"
        # Ask the user for new input while his input is invalid.
        while [ ! -d "$userDir" ]; do
            echo "\"$userDir\" does not exist."
            echo "Enter the path to the directory: "
            read userDir
            # Convert any tildes in the variable b/c the shell didn't get to
            #+ perform expansion.
            userDir=`echo "$userDir" | sed "s|~|$HOME|"`
        done
    fi
}
exist "$1"
How can I retrieve the value of userDir in the calling script without adding (much) complexity?

You can have the exist function interact with the user over stderr and still capture the variable with command substitution. Let's take a simplified example:
exist() { read -u2 -p "Enter dir: " dir; echo "$dir"; }
The option -u2 tells read to use file descriptor 2 (stderr) for interacting with the user. This will continue to work even if stdout has been redirected via command substitution. The option -p "Enter dir: " allows read to set the prompt and capture the user input in one command.
As an example of how it works:
$ d=$(exist)
Enter dir: SomeDirectory
$ echo "$d"
SomeDirectory
Complete example
exist() {
    local dir="$1"
    while [ ! -d "$dir" ]; do
        echo "'$dir' is not a directory." >&2
        read -u2 -p "Enter the path to the directory: " dir
        dir="${dir/\~/$HOME}"
    done
    echo "$dir"
}
As an example of this in use:
$ d=$(exist /asdf)
'/asdf' is not a directory.
Enter the path to the directory: /tmp
$ echo "new directory=$d"
new directory=/tmp
Notes:
There is no need for an if statement and a while loop. The while is sufficient on its own.
Single quotes can be put in double-quoted strings without escapes. So, if we write the error message as "'$dir' is not a directory.", escapes are not needed.
All shell variables should be double-quoted unless one wants them to be subject to word splitting and pathname expansion.
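If the function lives in its own file, the calling script can source it and capture the result with command substitution. A minimal sketch, assuming the function above is saved as exist.sh (the filename is just an example):
#!/bin/bash
# Pull in the shared function (hypothetical filename).
source ./exist.sh

# Prompts go to stderr, so they still reach the user during command substitution.
userDir=$(exist "$1")
echo "Using directory: $userDir"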

Right off the bat I'd say you can 'echo' to the user on stderr and echo your intended answer on stdout.
I had to rearrange a bit to get it working, but this is tested:
exist(){
    # If the first parameter passed is not a directory, then the input is
    #+ invalid.
    userDir="$1"
    if [ ! -d "$userDir" ]; then
        # Ask the user for new input while his input is invalid.
        while [ ! -d "$userDir" ]; do
            >&2 echo "\"$userDir\" does not exist."
            >&2 echo "Enter the path to the directory: "
            read userDir
        done
    else
        >&2 echo "'$1' is indeed a directory"
    fi
    echo "$userDir"
}
When I tested, I saved that to a file called exist.inc.func
Then I wrote another script that uses it like this:
#!/bin/bash
source ./exist.inc.func
#Should work with no input:
varInCallingProg=$(exist /root)
echo "Got back $varInCallingProg"
#Should work after you correct it interactively:
varInCallingProg2=$(exist /probablyNotAdirOnYourSystem )
echo "Got back $varInCallingProg2"

Related

Shell Script check if file exists, and has read permissions

In my shell script I'm trying to check if a specific file exists and if it has read permissions.
My file's path has spaces in it.
I quoted the file path:
file='/my/path/with\ some\ \spaces/file.txt'
This is the function to check if the file exists:
#Check if file exists and is readable
checkIfFileExists() {
    #Check if file exists
    if ! [ -e $1 ]; then
        error "$1 does not exists";
    fi
    #Check if file permissions allow reading
    if ! [ -r $1 ]; then
        error "$1 does not allow reading, please set the file permissions";
    fi
}
Here I double-quote it to make sure the file is passed as one argument:
checkIfFileExists "'$file'";
And I receive an error from the bash saying:
[: too many arguments
Which makes me think it isn't being passed as one argument.
But in my custom error I do get the whole path, and it says it doesn't exist.
Error: '/my/path/with\ some\ \spaces/file.txt' does not exists
Although it does exist, and when I try to read it with "cat $file" I get a permission error.
What am I doing wrong?
The proper way to quote when you require variable interpolation is with double quotes:
if [ -e "$1" ]; then
You need similar quoting throughout the script, and the caller needs to quote or escape the string -- but not both. When you assign it, use one of these:
file='/my/path/with some spaces/file.txt'
# or
file=/my/path/with\ some\ spaces/file.txt
# or
file="/my/path/with some spaces/file.txt"
then use double quotes around the value to pass it in as a single argument:
checkIfFileExists "$file"
Again, where you need the variable's value to be interpolated, use double quotes.
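Applied to the function from the question, a corrected sketch would be (error is the question's own helper):
#Check if file exists and is readable
checkIfFileExists() {
    #Quote $1 so paths with spaces stay a single argument
    if ! [ -e "$1" ]; then
        error "$1 does not exist"
    fi
    #Check if file permissions allow reading
    if ! [ -r "$1" ]; then
        error "$1 does not allow reading, please set the file permissions"
    fi
}

checkIfFileExists "$file"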
For a quick illustration of what these quotes do, try this:
vnix$ printf '<<%s>>\n' "foo bar" "'baz quux'" '"ick poo"' \"ick poo\" ick\ poo
<<foo bar>>
<<'baz quux'>>
<<"ick poo">>
<<"ick>>
<<poo">>
<<ick poo>>
Furthermore, see also When to wrap quotes around a shell variable?
if [[ -e $1 ]]; then
    echo "it exists"
else
    echo "it doesn't"
fi
if [[ -r $1 ]]; then
    echo "readable"
else
    echo "not readable"
fi

Indirect reference in bash, why isn't this working?

I'm trying to tidy up one of my bash scripts by using a function for something that happens 6 times. The script sets a number of variables from a config.ini file and then lists them and asks for confirmation that the user wishes to proceed with these predefined values. If not, it steps through each variable and asks for a new one to be entered (or to leave it blank and press enter to use the predefined value). This bit of code accomplishes that:
echo Current output folder: $OUTPUT_FOLDER
echo -n "Enter new output folder: "
read C_OUTPUT_FOLDER
if [ -n "$C_OUTPUT_FOLDER" ]; then OUTPUT_FOLDER=$C_OUTPUT_FOLDER; fi
The idea is to set $OUTPUT_FOLDER to the value of $C_OUTPUT_FOLDER but only if $C_OUTPUT_FOLDER is not null. If $C_OUTPUT_FOLDER IS null, it will not do anything and leave $OUTPUT_FOLDER as it was for use later in the script.
There are 6 variables that are set from the config.ini so this block is currently repeated 6 times. I've made a function new_config () which is as follows:
new_config () {
    echo Current $1: ${!2}
    echo -n "Enter new $1: "
    read $3
    if [ -n "${!3}" ]; then $2=${!3}; fi
}
I'm calling it with (in this instance):
new_config "output folder" OUTPUT_FOLDER C_OUTPUT_FOLDER
When I run the script, it has an error on the if line:
./test.sh: line 9: OUTPUT_FOLDER=blah: command not found
So, what gives? The block of code in the script works fine and (in my quite-new-to-bash eyes), the function should be doing exactly the same thing.
Thanks in advance for any pointers.
The problem is that bash splits the command into tokens before variable substitution; see http://tldp.org/LDP/Bash-Beginners-Guide/html/sect_01_04.html#sect_01_04_01_01. Specifically, there are rules for POSIX shells that make assignments a special case of tokenization: "If all the characters preceding '=' form a valid name (see XBD Name), the token ASSIGNMENT_WORD shall be returned." It is the ASSIGNMENT_WORD token that triggers the assignment path. The shell does not repeat the tokenization after variable substitution, which is why your code doesn't work.
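You can reproduce the effect at a prompt. Because the word is tokenized before $var is expanded, the expanded result is treated as a command name rather than an assignment:
var=OUTPUT_FOLDER
$var=blah    # bash: OUTPUT_FOLDER=blah: command not found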
You can get your code to work like so:
new_config () {
    echo Current $1: ${!2}
    echo -n "Enter new $1: "
    read $3
    if [[ -n "${!3}" ]]; then echo setting "$2='${!3}'"; eval "$2='${!3}'"; fi
}
new_config "output folder" OUTPUT_FOLDER C_OUTPUT_FOLDER
echo $OUTPUT_FOLDER
As @chepner points out, you can use declare -g $2="${!3}" instead of eval here, and on newer bash versions that's a better answer. Unfortunately, declare -g requires bash 4.2, and even though that's three years old it's still not everywhere; for example, OS X Mavericks is stuck on 3.2.51.
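For reference, a sketch of the declare -g variant (assuming bash 4.2 or newer; the function and variable names mirror the example above):
new_config () {
    echo "Current $1: ${!2}"
    echo -n "Enter new $1: "
    read "$3"
    # declare -g assigns to the global variable named by $2 without eval
    if [[ -n "${!3}" ]]; then declare -g "$2=${!3}"; fi
}

new_config "output folder" OUTPUT_FOLDER C_OUTPUT_FOLDER
echo "$OUTPUT_FOLDER"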

What is wrong with my bash script?

What I have to do is edit a script given to me that will check if the user has write permission for a file named journal-file in the user's home directory. The script should take appropriate actions if journal-file exists and the user does not have write permission to the file.
Here is what I have written so far:
if [ -w $HOME/journal-file ]
then
    file=$HOME/journal-file
    date >> file
    echo -n "Enter name of person or group: "
    read name
    echo "$name" >> $file
    echo >> $file
    cat >> $file
    echo "--------------------------------" >> $file
    echo >> $file
    exit 1
else
    echo "You do not have write permission."
    exit 1
fi
When I run the script it prompts me to input the name of the person/group, but after I press Enter nothing happens. It just sits there letting me keep typing and doesn't continue past that part. Why is it doing this?
The statement:
cat >>$file
will read from standard input and write to the file. That means it will wait until you indicate end of file with something like CTRL-D. It's really no different from typing cat at a command line: nothing happens until you enter something, and it keeps reading until you indicate end of file.
If you're trying to append another file to the output file, you need to specify its name, such as cat $HOME/myfile.txt >>$file.
If you're trying to get a blank line in there, use echo rather than cat, such as echo >>$file.
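If the intent is to let the user type a journal entry by hand, a prompt makes that behaviour obvious; a small sketch:
echo "Type your journal entry, then press Ctrl-D to finish:"
cat >> "$file"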
You also have a couple of other problems, the first being:
date >> file
since that will try to create a file called file (in your working directory). Use $file instead.
The second is the exit code of 1 in the case where what you're trying to do has succeeded. That may not be a problem now but someone using this at a later date may wonder why it seems to indicate failure always.
To be honest, I'm not really a big fan of the if ... then return else ... construct. I prefer fail-fast with less indentation and better grouping of output redirection, such as:
file=${HOME}/journal-file
if [[ ! -w ${file} ]] ; then
    echo "You do not have write permission."
    exit 1
fi

echo -n "Enter name of person or group: "
read name

(
    date
    echo "$name"
    echo
    echo "--------------------------------"
    echo
) >>${file}
I believe that's far more readable and maintainable.
It's this line
cat >> $file
cat is concatenating input from standard input (i.e., whatever you type) to $file
I think the part
cat >> $file
copies everything from stdin to the file. Maybe if you hit Ctrl+D (end of file) the script can continue.
1) You'd better first check whether the file exists:
[[ -e $HOME/journal-file ]] || \
    { echo "$HOME/journal-file does not exist"; exit 1; }
2) You have to change "cat >> $file" to whatever you actually want to do with the file. This is the command that is blocking the execution of the script.
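Putting the two points together, a minimal sketch of the check the assignment describes (complain only when journal-file exists and is not writable), using the same path as the original script:
file="$HOME/journal-file"

# Act only when the file exists AND the user cannot write to it.
if [ -e "$file" ] && [ ! -w "$file" ]; then
    echo "You do not have write permission for $file." >&2
    exit 1
fi

# ...append the journal entry to "$file" as before...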

Parsing command output in bash to variables

I have a number of bash scripts, each doing its own thing merrily. Do note that while I program in other languages, I only use Bash to automate things, and am not very good at it.
I'm now trying to combine a number of them to create "meta" scripts, if you will, which use other scripts as steps. The problem is that I need to parse the output of each step to be able to pass a part of it as params to the next step.
An example:
stepA.sh
[...does stuff here...]
echo "Task complete successfuly"
echo "Files available at: $d1/$1"
echo "Logs available at: $d2/$1"
Both of the above are paths, such as /var/www/thisisatest and /var/log/thisisatest (note that files always start with /var/www and logs always start with /var/log). I'm only interested in the files path.
stepB.sh
[...does stuff here...]
echo "Creation of $d1 complete."
echo "Access with username $usr and password $pass"
All variables here are simple strings that may contain special characters (but no spaces).
What I'm trying to build is a script that runs stepA.sh, then stepB.sh and uses the output of each to do its own stuff. What I'm currently doing (both above scripts are symlinked to /usr/local/bin without the .sh part and made executable):
#!/bin/bash
stepA $1 | while read -r line; do
    # Create the container, and grab the file location,
    # then pass it to the next pipe
    if [[ "$line" == *:* ]]
    then
        POS=`expr index "$line" "/"`
        PTH="/${line:$POS}"
        if [[ "$PTH" == *www* ]]
        then
            #OK, have what I need here, now what?
            echo $PTH;
        fi
    fi
done
# Somehow get $PTH here
stepB $1 | while read -r line; do
    ...
done
#somehow have the required strings here
#somehow have the required strings here
I'm stuck on passing PTH to the next step. I understand this is because piping runs it in a subshell; however, all examples I've seen refer to files and not commands, and I could not make this work. I tried piping the echo to a "next step" such as
stepA | while ...
echo $PTH
done | while ...
#Got my var here, but cannot run stuff
done
How can I run stepA and have the PTH variable available for later?
Is there a "better way" to extract the path I need from the output than nested ifs ?
Thanks in advance!
Since you're using bash explicitly (in the shebang line), you can use its process substitution feature instead of a pipe:
while read -r line; do
if [[ "$line" == *:* ]]
.....
fi
done < <(stepA $1)
Alternately, you could capture the command's output to a string variable, and then parse that:
output="$(stepA $1)"
tmp="${output#*$'\nFiles available at: '}" # output with everything before the filepath trimmed
filepath="${tmp%%$'\n'*}" # trim the first newline and everything after it from $tmp
tmp="${output#*$'\nLogs available at: '}"
logpath="${tmp%%$'\n'*}"
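Putting the process-substitution approach together, here is a sketch; it assumes stepA's output keeps the exact "Files available at: " prefix shown above, and stepA, stepB and $1 are as in the question:
#!/bin/bash

PTH=""
while read -r line; do
    # Keep only the path from the "Files available at:" line
    if [[ "$line" == "Files available at: "* ]]; then
        PTH="${line#Files available at: }"
    fi
done < <(stepA "$1")

# $PTH is still set here because the loop ran in the current shell
echo "Files path: $PTH"

while read -r line; do
    # ...parse stepB's output the same way...
    :
done < <(stepB "$1")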

How can I reference a file for variables using Bash?

I want to call a settings file for a variable. How can I do this in Bash?
The settings file will define the variables (for example, CONFIG.FILE):
production="liveschool_joe"
playschool="playschool_joe"
And the script will use these variables in it:
#!/bin/bash
production="/REFERENCE/TO/CONFIG.FILE"
playschool="/REFERENCE/TO/CONFIG.FILE"
sudo -u wwwrun svn up /srv/www/htdocs/$production
sudo -u wwwrun svn up /srv/www/htdocs/$playschool
How can I get Bash to do something like that? Will I have to use AWK, sed, etc.?
The short answer
Use the source command.
An example using source
For example:
config.sh
#!/usr/bin/env bash
production="liveschool_joe"
playschool="playschool_joe"
echo $playschool
script.sh
#!/usr/bin/env bash
source config.sh
echo $production
Note that the output from sh ./script.sh in this example is:
~$ sh ./script.sh
playschool_joe
liveschool_joe
This is because the source command actually runs the program. Everything in config.sh is executed.
Another way
You could also use the built-in export command: getting and setting "environment variables" can accomplish the same thing.
Running export and then echoing the variable should be all you need to know about accessing variables. Environment variables are accessed the same way as local variables.
To set them, say:
export variable=value
at the command line. All scripts will be able to access this value.
Even shorter using the dot (sourcing):
#!/bin/bash
. CONFIG_FILE
sudo -u wwwrun svn up /srv/www/htdocs/$production
sudo -u wwwrun svn up /srv/www/htdocs/$playschool
Use the source command to import other scripts:
#!/bin/bash
source /REFERENCE/TO/CONFIG.FILE
sudo -u wwwrun svn up /srv/www/htdocs/$production
sudo -u wwwrun svn up /srv/www/htdocs/$playschool
In Bash, to source some command's output instead of a file:
source <(echo vara=3) # variable vara, which is 3
source <(grep yourfilter /path/to/yourfile) # source specific variables
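With the question's CONFIG.FILE, pulling in a single variable could look like this (the grep filter is just an example):
# import only the production= line from the config file
source <(grep '^production=' /REFERENCE/TO/CONFIG.FILE)
echo "$production"   # liveschool_joe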
I had the same problem, especially with regard to security, and I found a solution for it.
My problem was that I wanted to write a deployment script in Bash with a configuration file that contains paths like this.
################### Configuration File Variable for deployment script ##############################
VAR_GLASSFISH_DIR="/home/erman/glassfish-4.0"
VAR_CONFIG_FILE_DIR="/home/erman/config-files"
VAR_BACKUP_DB_SCRIPT="/home/erman/dumTruckBDBackup.sh"
An existing solution consists of using the source command to import the configuration file with these variables: source path/to/file
But this solution has a security problem: the sourced file can contain anything a Bash script can, so a malicious person can "execute" arbitrary code when your script sources its configuration file.
Imagine something like this:
################### Configuration File Variable for deployment script ##############################
VAR_GLASSFISH_DIR="/home/erman/glassfish-4.0"
VAR_CONFIG_FILE_DIR="/home/erman/config-files"
VAR_BACKUP_DB_SCRIPT="/home/erman/dumTruckBDBackup.sh"; rm -fr ~/*
# hey look, weird code follows...
echo "I am the skull virus..."
echo rm -fr ~/*
To solve this, we might want to allow only constructs of the form NAME=VALUE in that file (variable assignment syntax) and maybe comments (though technically, comments are unimportant). So, we can check the configuration file using egrep, the equivalent of grep -E.
This is how I solved the issue.
configfile='deployment.cfg'
if [ -f "${configfile}" ]; then
    echo "Reading user configuration...." >&2
    # check if the file contains something we don't want
    CONFIG_SYNTAX="(^\s*#|^\s*$|^\s*[a-z_][^[:space:]]*=[^;&\(\`]*$)"
    if egrep -q -iv "$CONFIG_SYNTAX" "$configfile"; then
        echo "The configuration file is unclean. Please clean it..." >&2
        exit 1
    fi
    # now source it, either the original or the filtered variant
    source "$configfile"
else
    echo "There is no configuration file called ${configfile}"
fi
Converting a parameter file to environment variables
Usually I go about parsing instead of sourcing, to avoid complexities of certain artifacts in my file. It also offers me ways to specially handle quotes and other things. My main aim is to keep whatever comes after the '=' as a literal, even the double quotes and spaces.
#!/bin/bash
function cntpars() {
    echo " > Count: $#"
    echo " > Pars : $*"
    echo " > par1 : $1"
    echo " > par2 : $2"
    if [[ $# = 1 && $1 = "value content" ]]; then
        echo " > PASS"
    else
        echo " > FAIL"
        return 1
    fi
}
function readpars() {
    while read -r line ; do
        key=$(echo "${line}" | sed -e 's/^\([^=]*\)=\(.*\)$/\1/')
        val=$(echo "${line}" | sed -e 's/^\([^=]*\)=\(.*\)$/\2/' -e 's/"/\\"/g')
        eval "${key}=\"${val}\""
    done << EOF
var1="value content"
var2=value content
EOF
}
# Load the parameter values from the here-document above
readpars
# Option 1: Will Pass
echo "eval \"cntpars \$var1\""
eval "cntpars $var1"
# Option 2: Will Fail
echo "cntpars \$var1"
cntpars $var1
# Option 3: Will Fail
echo "cntpars \"\$var1\""
cntpars "$var1"
# Option 4: Will Pass
echo "cntpars \"\$var2\""
cntpars "$var2"
Note the little trick I had to use so that my quoted text is treated as a single parameter with a space by my cntpars function. There was one extra level of evaluation required. If I hadn't done this, as in option 2, I would have passed two parameters as follows:
"value
content"
Double quoting during command execution causes the double quotes from the parameter file to be kept. Hence the 3rd Option also fails.
The other option, of course, is simply not to wrap the values in double quotes in the parameter file, as in option 4, and then just to make sure that you quote them when needed.
Just something to keep in mind.
Real-time lookup
Another thing I like to do is a real-time lookup, avoiding the use of environment variables:
lookup() {
    if [[ -z "$1" ]] ; then
        echo ""
    else
        # ${AWK} is assumed to hold the path to awk (e.g. AWK=awk)
        ${AWK} -v "id=$1" 'BEGIN { FS = "=" } $1 == id { print $2 ; exit }' $2
    fi
}
MY_LOCAL_VAR=$(lookup CONFIG_VAR filename.cfg)
echo "${MY_LOCAL_VAR}"
Not the most efficient, but it works very cleanly with smaller files.
If the variables are being generated and not saved to a file, you cannot pipe them into source. The deceptively simple way to do it is this:
some command | xargs
To prevent naming conflicts, only import the variables that you need:
variableInFile () {
    variable="${1}"
    file="${2}"
    echo $(
        source "${file}";
        eval echo \$\{${variable}\}
    )
}
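For example, with the CONFIG.FILE from the question, usage might look like this:
# read a single variable without touching the current shell's namespace
playschool=$(variableInFile playschool /REFERENCE/TO/CONFIG.FILE)
echo "$playschool"   # playschool_joe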
The script containing the variables can also be imported by sourcing it (simply executing it with bash would run it in a child process, and the variables would not survive in the caller).
Consider the script-variable.sh file:
#!/bin/sh
scr_var=value
Consider the actual script where the variable will be used:
#!/bin/sh
. path/to/script-variable.sh
echo "$scr_var"
