Substituting a variable inside a string stored in an array - bash

My strings are stored inside an array, and a sample string contains a 'variable'.
While iterating over this array, I want to substitute the 'variable' with a 'value'.
This fails for me. I have tried a lot of variations and googled, but could not figure it out.
# Array of strings (each string is a command)
clean_aa_commands=(
  "sourceanalyzer -b ${FortifyBuildId} -clean"
  "cd ${unifiedbuilddir}/AA/AAUI"
  "mvn clean"
)
# Functions
function check {
  if [ "$?" -ne 0 ]; then
    echo "Operation [$1] Unsuccessful!"
  else
    echo "Operation [$1] Success!"
  fi
}

function runcmds {
  echo "using $FortifyBuildId *************************"
  cmdArr=("${!1}")
  for cmd in "${cmdArr[@]}"
  do
    echo "-->Running [$cmd]"
    eval "$cmd"
    check "$cmd"
    echo ""
    echo ""
  done
}

# main
FortifyBuildId="$1"
echo "FortifyBuildId is $FortifyBuildId"
unifiedbuilddir=`pwd`
runcmds clean_aa_commands[@]

How about this:
cmds='cmds=( /test/$mydir /test/$test )'
mydir=$(pwd)
test="me"
eval "${cmds}"
echo "${cmds[@]}"
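What makes the original version fail is the order of expansion: ${FortifyBuildId} is expanded once, when the array is defined, which happens before main assigns it. The answer above defers expansion by single-quoting the definition and eval-ing it after the variables exist. A minimal sketch of the same idea applied to the command array (echo stands in for the real tools, and the variable values are made up for the demo):

```shell
#!/usr/bin/env bash
# Single quotes keep ${FortifyBuildId} literal at definition time.
clean_aa_commands_def='clean_aa_commands=(
  "echo sourceanalyzer -b ${FortifyBuildId} -clean"
  "echo cd ${unifiedbuilddir}/AA/AAUI"
)'

FortifyBuildId="build42"        # assigned later, as in main
unifiedbuilddir="/tmp/build"

eval "$clean_aa_commands_def"   # expansion happens here, at run time

for cmd in "${clean_aa_commands[@]}"; do
  echo "-->Running [$cmd]"
  eval "$cmd"
done
```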


condition evaluating to "TRUE"

I am trying to check for duplicate records in my database using shell scripting.
For this, I have created a function named "check()" which echoes True or False; its output is captured in the variable "result". But when I evaluate "result" in an if statement, it always takes the "True" branch.
#redundancy check function
check() {
  temp=$(grep -w -c "$1" database.dat)
  echo $temp
  if [ "$temp" != 0 ]
  then
    echo True
  else
    echo False
  fi
}

insert() {
  option="y"
  while [ "$option" == "y" ]
  do
    echo "Rollno: \c"
    read roll
    result="$(check $roll)"
    echo $result
    if [ "$result" == "False" ]
    then
      echo Do something
    else
      echo "ERROR: Duplicate record found...\nEXITING...\n"
      option="n"
    fi
  done
}
If you're using a shell that doesn't support the == extension to test, then your tests will always, unconditionally fail simply on account of invalid syntax. Use = for string comparisons to be portable to all POSIX-compliant implementations.
Moreover, there's no point to storing and then comparing the output from grep at all: Use the exit status of grep -q when your only goal is to check whether the number of matches is zero or more-than-zero; this allows grep to exit immediately when a match is seen, rather than needing to read the rest of the file.
# with -q, this emits no stdout, but exits w/ status 0 (matches exist) or 1 (otherwise)
check() { grep -q -w -e "$1" database.dat; }

insert() {
  option=y
  while [ "$option" = y ]; do
    printf '%b\n' "Rollno: \c"
    read -r roll
    if check "$roll"; then
      printf "ERROR: Duplicate record found...\nEXITING...\n"
      option=n
    else
      echo "Check failed; do something"
    fi
  done
}

Get the output and return value from another bash function

In function_two, I need to get both the output from echo and the return value from function_one
#!/bin/bash

function_one() {
  echo "okay then"
  return 2
}

function_two() {
  local a_function="function_one"
  local string=`${a_function}`
  echo $string # echoes "okay then"
  echo "$?"    # echoes 0 - how do we get the returned value of 2?
}
function_two
When trying echo "$?" I get 0 instead of 2
Update
As Ipor Sircer pointed out, $? above is giving the exit code of the previous command echo $string
So instead I grab the exit code immediately after. And as choroba mentioned, the localization and assignment of the variable needed to be separated.
Here is the working script:
#!/bin/bash

function_one() {
  echo "okay then"
  return 2
}

function_two() {
  local a_function="function_one"
  local string
  string=`${a_function}`
  local exitcode=$?
  echo "string: $string"     # okay then
  echo "exitcode: $exitcode" # 2
}
function_two
0 is the exit status of the last command executed, i.e. echo $string.
If you need to use the exit status later, store it in a variable:
local string
string=`${a_function}`
local status=$?
echo "Output: $string"
echo "Status: $status"
You also need to separate the declaration (local) from the assignment; otherwise $? gives you the exit status of local itself, which is always 0.
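To see concretely why the separation matters, here is a small sketch (hypothetical function names) contrasting the combined and separated forms:

```shell
#!/usr/bin/env bash
fails() { echo "output"; return 2; }

combined() {
  local out=$(fails)    # $? is now the status of 'local' itself
  echo "combined sees status $?"
}

separated() {
  local out
  out=$(fails)          # $? is the status of the command substitution
  echo "separated sees status $?"
}

combined    # prints: combined sees status 0
separated   # prints: separated sees status 2
```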

Bash function arguments not passed as expected

I have an unexpected behaviour in my script:
#!/bin/bash

checkFolders "first folder"

checkFolders() {
  checkEmptyVar $1 "folder to be copied"
}

checkEmptyVar() {
  echo "0: $0 1: $1 2: $2"
  [[ $(isNotEmpty $1) = false ]] && (echo "Specify a path for $2") && exit
  echo "post exit"
}
The function checkEmptyVar echoes the following:
0: ./lcp.sh 1: folder to be copied 2:
I expected "folder-to-be-copied" to have been passed as $1 of checkEmptyVar; what is happening?
You have numerous problems:
$0 is not the first argument to a function; $1 is. $0 is the name of the current script.
You must quote parameter expansions to prevent them from being split into multiple words on embedded whitespace.
Functions must be defined before they are used.
The correct script is
#!/bin/bash

checkFolders() {
  checkEmptyVar "$1" "folder to be copied"
}

checkEmptyVar() {
  [[ $(isNotEmpty "$1") = false ]] && echo "Specify a path for $2" && exit
  echo "post exit"
}

checkFolders "first folder"
Further, it would be better to have isNotEmpty return a non-zero value instead of outputting the string false, so that you could write
checkEmptyVar () {
  if ! isNotEmpty "$1"; then
    echo "Specify a path for $2" >&2 # Use standard error, not standard output
    exit 1
  fi
  echo "post exit"
}
(I suspect you could replace isNotEmpty with [[ -n $1 ]] anyway.)
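To illustrate that suggestion, here is a hypothetical version that drops isNotEmpty entirely in favor of a [[ -z ]] test (return is used here instead of exit so the function can be exercised without ending the shell):

```shell
#!/usr/bin/env bash
# [[ -z $1 ]] is true when $1 is empty or unset: exactly the error case.
checkEmptyVar() {
  if [[ -z $1 ]]; then
    echo "Specify a path for $2" >&2
    return 1
  fi
  echo "post exit"
}

checkEmptyVar "first folder" "folder to be copied"       # prints: post exit
checkEmptyVar "" "folder to be copied" || echo "rejected"
```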
This script isn't doing what you think it is. Your function definitions are happening too late.
When you call checkFolders on the first line you are calling a version of that function from the pre-existing environment and not the one defined later in that script.
If you run command -V checkFolders from the shell you are running this script from I expect you'll get output somewhat like:
checkFolders is a function
checkFolders ()
{
    checkEmptyVar "folder to be copied"
}
though anything is possible there.
You should also always quote variables when you use them to prevent the shell from word-splitting their contents.
Without quoting, calling checkFolders "first folder" results in the nested call checkEmptyVar first folder "folder to be copied" (two words where you meant one), which isn't what you want at all.
Invert the order of your functions and calls and fix your variable quoting and you should see what you expect.
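The word-splitting half of the problem is easy to demonstrate in isolation. This sketch uses a hypothetical count_args helper to show how one unquoted argument becomes two:

```shell
#!/usr/bin/env bash
count_args() { echo "$#"; }

arg="first folder"
count_args $arg     # unquoted: split on the space into 2 words
count_args "$arg"   # quoted: passed through as 1 word
```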

Bash function not returning value properly

I don't know what is wrong with my function; it is not returning value properly.
function validate_directory_isempty {
  retval=""
  NumOfFiles=`ls -l $input | egrep -c "^-"`
  if [ "$NumOfFiles" == "0" ]; then
    retval=true
  else
    retval=false
  fi
  echo $retval
}

retval=$(validate_directory_isempty /opt/InstallationManager)
echo "retval : " $retval
if [ "$retval" == "true" ]; then
  echo ""
  echo "Installing Installation Manager"
  # Install_IM
else
  echo ""
  echo "DIRECTORY is not empty. Please make sure install location $DIRECTORY is empty before installing PRODUCT"
fi
The idiomatic way to have a function report true or false is its exit status, set with the return builtin: 0 means success, any non-zero value means failure. Note also that if return is absent, a function returns the status of the last command it executed.
I would write your function like this
is_dir_empty() {
  shopt -s nullglob
  local -a files=( "$1"/* )
  (( ${#files[@]} == 0 ))
}

directory=/opt/InstallManager
if is_dir_empty "$directory"; then
  echo "$directory" is empty
fi
The first line sets a shell option so that a pattern matching no files expands to nothing, instead of remaining as the literal pattern string.
The second line fills an array with the filenames in the given directory.
The last line tests the number of elements in the array: zero entries means success (empty), anything else means failure (not empty).
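A quick way to exercise both branches is with a throwaway directory from mktemp. One caveat worth knowing: "$1"/* does not match dot-files, so a directory containing only hidden files is reported as empty by this test:

```shell
#!/usr/bin/env bash
is_dir_empty() {
  shopt -s nullglob
  local -a files=( "$1"/* )
  (( ${#files[@]} == 0 ))
}

dir=$(mktemp -d)

is_dir_empty "$dir" && echo "empty"       # prints: empty

touch "$dir/somefile"
is_dir_empty "$dir" || echo "not empty"   # prints: not empty

rm -r "$dir"
```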
I just replaced part of my script as below and it worked.
Removed these commands:
retval=$(validate_directory_isempty /opt/InstallationManager)
echo "retval : " $retval
Added:
input=/opt/InstallationManager
validate_directory_isempty
Thanks again for your valuable inputs.

Can I pass an arbitrary block of commands to a bash function?

I am working on a bash script where I need to conditionally execute some things if a particular file exists. This is happening multiple times, so I abstracted the following function:
function conditional-do {
  if [ -f "$1" ]
  then
    echo "Doing stuff"
    $2
  else
    echo "File doesn't exist!"
  fi
}
Now, when I want to execute this, I do something like:
function exec-stuff {
  echo "do some command"
  echo "do another command"
}

conditional-do /path/to/file exec-stuff
The problem is, I am bothered that I am defining 2 things: the function of a group of commands to execute, and then invoking my first function.
I would like to pass this block of commands (often 2 or more) directly to "conditional-do" in a clean manner, but I have no idea how this is doable (or if it is even possible)... does anyone have any ideas?
Note, I need it to be a readable solution... otherwise I would rather stick with what I have.
This should be readable to most C programmers:
function file_exists {
  if [ -e "$1" ]; then
    echo "Doing stuff"
  else
    echo "File $1 doesn't exist"
    false
  fi
}

file_exists filename && (
  echo "Do your stuff..."
)
or the one-liner
file_exists filename && echo "Do your stuff..."
Now, if you really want the code to be run from the function, this is how you can do that:
function file_exists {
  if [ -e "$1" ]; then
    echo "Doing stuff"
    shift
    $*
  else
    echo "File $1 doesn't exist"
    false
  fi
}

file_exists filename echo "Do your stuff..."
I don't like that solution though, because you will eventually end up doing escaping of the command string.
EDIT: Changed "eval $*" to plain $*. eval is not required, actually. As is common with bash scripts, it was written when I had had a couple of beers ;-)
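One caveat with unquoted $*: every argument gets re-split on whitespace. Here is a sketch of the same pattern using "$@" instead, which preserves each argument exactly as passed (the mktemp file just provides something that exists):

```shell
#!/usr/bin/env bash
file_exists() {
  if [ -e "$1" ]; then
    echo "Doing stuff"
    shift
    "$@"    # run the remaining arguments as one command, words intact
  else
    echo "File $1 doesn't exist" >&2
    false
  fi
}

tmp=$(mktemp)
file_exists "$tmp" echo "Do your stuff..."
rm -f "$tmp"
```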
One (possibly-hack) solution is to store the separate functions as separate scripts altogether.
The canonical answer:
[ -f "$filename" ] && echo "it has worked!"
or you can wrap it up if you really want to:
function file-exists {
  [ "$1" ] && [ -f "$1" ]
}

file-exists "$filename" && echo "It has worked"
