Parsing a function param in bash and passing it to another function - bash

I'm trying to build a bash function that takes two arguments, executes the first in a docker container, then greps for the second argument in the output... but I can't get it to work.
This is what I've come up with so far:
function scriptCheckData() {
    local cmd=eval "$1"
    local RESULT=$(docker exec -i "$dockerName" cmd)
    if ! echo "$RESULT" | grep "$2"; then
        echo grep failed for "$2" in: "$RESULT"
        cleanupAndExit
    fi
}
Function call:
    scriptCheckData 'test.php --option 1' 'expected string'
Any help appreciated!
EDIT:
The solution was to not put my variable in quotes, like this:
function scriptCheckData() {
    local RESULT=$(docker exec -i "$dockerName" $1)
    if ! echo "$RESULT" | grep "$2"; then
        echo "grep failed for [$2] in: [$RESULT]"
        cleanupAndExit
    fi
}

Change this:
    local RESULT=$(docker exec -i "$dockerName" cmd)
to this:
    local RESULT=$(docker exec -i "$dockerName" $cmd)
Then I would suggest using this syntax to improve the output:
    echo "grep failed for [$2] in: [$RESULT]"
Anyhow, just to be sure that the function is really getting the parameters, you could add some debug output to your script, e.g.:
    echo "cmd[$cmd]"
    echo "param1[$1]"
    echo "param2[$2]"
Regards
Claudio
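Note that with the unquoted $1 approach, the command string is split on whitespace and glob-expanded by the shell. A more robust variant passes the command words as separate arguments; a minimal sketch (assuming $dockerName and cleanupAndExit are defined as above, and the call site is changed to match):
    function scriptCheckData() {
        # Last argument is the grep pattern; everything before it is the command.
        local pattern=${!#}
        local result
        result=$(docker exec -i "$dockerName" "${@:1:$#-1}")
        if ! echo "$result" | grep -- "$pattern"; then
            echo "grep failed for [$pattern] in: [$result]"
            cleanupAndExit
        fi
    }

    # Command words are now separate arguments, so spaces never split them unexpectedly:
    scriptCheckData test.php --option 1 'expected string'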

Related

Unable to execute awk command in a function, but working directly in the shell

I want to create a utility function for bash to remove duplicate lines. I am using this function:
function remove_empty_lines() {
    if ! command -v awk &> /dev/null
    then
        echo '[x] ERR: "awk" command not found'
        return
    fi
    if [[ -z "$1" ]]
    then
        echo "usage: remove_empty_lines <file-name> [--replace]"
        echo
        echo "Arguments:"
        echo -e "\t--replace\t (Optional) If not passed, the result will be redirected to stdout"
        return
    fi
    if [[ ! -f "$1" ]]
    then
        echo "[x] ERR: \"$1\" file not found"
        return
    fi
    echo $0
    local CMD="awk '!seen[$0]++' $1"
    if [[ "$2" = '--reload' ]]
    then
        CMD+=" > $1"
    fi
    echo $CMD
}
If I run the main awk command directly, it works. But when I execute the same $CMD in the function, I get this error:
$ remove_empty_lines app.js
/bin/bash
awk '!x[/bin/bash]++' app.js
The original code is broken in several ways:
When used with --reload, it would truncate the output file's contents before awk could ever read those contents (see How can I use a file in a command and redirect output to the same file without truncating it?)
It didn't ever actually run the command, and for the reasons described in BashFAQ #50, storing a shell command in a string is inherently buggy (one can work around some of those issues with eval; BashFAQ #48 describes why doing so introduces security bugs).
It wrote error messages (and other "diagnostic content") to stdout instead of stderr; this means that if your function's output was redirected to a file, you could never see its errors -- they'd end up jumbled into the output.
Error cases were handled with a return even in cases where $? would be zero; this means that return itself would return a zero/successful/truthy status, not revealing to the caller that any error had taken place.
Presumably the reason you were storing your output in CMD was to be able to perform a redirection conditionally, but that can be done other ways: Below, we always create a file descriptor out_fd, but point it to either stdout (when called without --reload), or to a temporary file (if called with --reload); if-and-only-if awk succeeds, we then move the temporary file over the output file, thus replacing it as an atomic operation.
remove_empty_lines() {
    local out_fd rc=0 tempfile=
    command -v awk &>/dev/null || { echo '[x] ERR: "awk" command not found' >&2; return 1; }
    if [[ -z "$1" ]]; then
        printf '%b\n' >&2 \
            'usage: remove_empty_lines <file-name> [--replace]' \
            '' \
            'Arguments:' \
            '\t--replace\t(Optional) If not passed, the result will be redirected to stdout'
        return 1
    fi
    [[ -f "$1" ]] || { echo "[x] ERR: \"$1\" file not found" >&2; return 1; }
    if [ "$2" = --reload ]; then
        tempfile=$(mktemp -t "$1.XXXXXX") || return
        exec {out_fd}>"$tempfile" || { rc=$?; rm -f "$tempfile"; return "$rc"; }
    else
        exec {out_fd}>&1
    fi
    awk '!seen[$0]++' <"$1" >&$out_fd || { rc=$?; rm -f "$tempfile"; return "$rc"; }
    exec {out_fd}>&-  # close our file descriptor
    if [[ $tempfile ]]; then
        mv -- "$tempfile" "$1" || return
    fi
}
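Usage would then look like this (note that this version, like the question's code, tests for --reload even though its usage text says --replace):
    remove_empty_lines app.js            # deduplicated lines to stdout
    remove_empty_lines app.js --reload   # rewrites app.js in place, atomically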
First off, the output from your function call is not an error but rather the output of two echo commands (echo $0 and echo $CMD).
And as Charles Duffy has pointed out ... at no point is the function actually running the $CMD.
As for the inclusion of /bin/bash in your function's echo output ... the main problem is the reference to $0; by definition $0 is the name of the running process, which in the case of a function is the shell under which the function is being called. Consider the following when run from a bash command prompt:
$ echo $0
-bash
As you can see from your output, this expands to /bin/bash in your environment.
On a related note, the reference to $0 within double quotes causes the $0 to be evaluated, so this:
    local CMD="awk '!seen[$0]++' $1"
becomes
    local CMD="awk '!seen[/bin/bash]++' app.js"
I'm thinking what you want is something like:
    echo $1                            # the name of the file to be processed
    local CMD="awk '!seen[\$0]++' $1"  # escape the '$' in '$0'
which then becomes
    local CMD="awk '!seen[$0]++' app.js"
That should fix the issues shown in your function's output; as for the other issues ... you're getting a good bit of feedback in the various comments ...
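As an aside, the quoting headache disappears entirely if the command is run directly instead of being stored in a double-quoted string; a minimal sketch of that approach (leaving out the --replace handling):
    remove_empty_lines() {
        # Inside single quotes, $0 reaches awk untouched by the shell.
        awk '!seen[$0]++' "$1"
    }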

concatenate a function with string and execute it

I want to concatenate a command specified in a function with a string and execute it afterwards.
I will simplify my need with an example that executes "ls -l -a":
#!/bin/bash
echo -e "specify command"
read command # ls
echo -e "specify argument"
read arg # -l
test () {
    $command $arg
}
eval 'test -a'
Except that this doesn't work as expected.
Use an array, like this:
args=()
read -r command
args+=( "$command" )
read -r arg
args+=( "$arg" )
"${args[@]}" -a
If you want a function, then you could do this:
run_with_extra_switch () {
    "$@" -a
}
run_with_extra_switch "${args[@]}"
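The point of the array is that element boundaries survive expansion, unlike whitespace inside a string; a quick sketch of the difference:
    args=(ls '-l -a')   # two elements: "ls" and the single word "-l -a"
    "${args[@]}"        # runs: ls '-l -a'  -> ls rejects the bogus option word

    args=(ls -l -a)     # three elements
    "${args[@]}"        # runs: ls -l -a   -> works as intended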
#!/bin/bash
echo -e "specify command"
read command # ls
echo -e "specify argument"
read arg # -l
# using a variable
fun1 () {
    line="$command $arg"
}
# call the function
fun1
# expansion and word splitting turn the line into a command, which is executed
$line
# or using stdout (overhead)
fun2 () {
    echo "$command $arg"
}
# command substitution executes the function in a sub-shell; its output is expanded into a command and executed
$(fun2)
It will work for the given question; however, to understand how it works, look at shell expansion, and attention must be paid to the risk of executing arbitrary commands.
Before executing the command, it can be prefixed with printf '<%s>\n', for example, to show what will be executed.
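For instance, with the fun1 variant above (after reading ls and -l), the debug output would be:
    printf '<%s>\n' $line
    # prints:
    # <ls>
    # <-l>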

bash script not running properly

When I run this by itself on the command line it seems to work fine, but when I have another script execute it, it doesn't work. Any ideas? I'm guessing it has to do with quotes, but I'm not sure.
#!/bin/sh
#Required csvquote from https://github.com/dbro/csvquote
#TODO: Clean CSV File using CSVFix
#Version 3
echo "File Name: $1"
function quit {
    echo "Quitting Script"
    exit 1
}
function fileExists {
    if [ ! -f "$1" ]
    then
        echo "File $1 does not exists"
        quit
    fi
}
function getInfo {
    #Returns website url like: "http://www.website.com/info"
    #Reads last line of a csv file, and gets the 2nd item.
    RETURN=$(tail -n 1 $1 | csvquote | cut -d ',' -f 2 | csvquote -u)
    echo $RETURN
}
function work {
    CURLURL="http://127.0.0.1:9200/cj/_query"
    URL=$(getInfo)
    echo "URL: $URL"
    CURLDATA='{ "query" : { "match" : { "PROGRAMURL" : '$URL' } } }'
    #URL shows up as blank...???
    echo "Curl Data: $CURLDATA"
    RESPONSE=$(curl -XDELETE "$CURLURL" -d "$CURLDATA" -vn)
    echo $RESPONSE
    echo "Sleeping Allowing Time To Delete"
    sleep 5s
}
fileExists $1
work $1
I can't see why a simpler version won't work: functions are useful, but I think there are too many here, overcomplicating things, if what you are posting is the entirety of your script (in my opinion).
Your script works by a broken-but-lucky pattern: $1 refers to a shell function's own arguments, just as it does in the main script. Think of positional parameters as local variables of a function. So when you call $(getInfo), you are calling that function with no argument, so it actually runs tail -n 1 with no file name, which falls back to reading stdin. You could see this for yourself by putting echo getInfo_arg_1="$1" >&2 inside the function.
Note also that you are not quoting $1 anywhere, so this script is not safe for file names containing whitespace, although this is only likely to be a problem if you have to deal with files sent to you from a Windows computer.
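A standalone sketch of that positional-parameter behaviour:
    #!/bin/bash
    show_first_arg() {
        # $1 here is the function's first argument, not the script's
        echo "function saw: [$1]"
    }
    show_first_arg            # function saw: []
    show_first_arg "$1"       # function saw: [whatever the script was given]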
In the absence of other information, the following 'should' work:
#!/bin/bash
test -z "$1" && { echo "Please specify a file." ; exit 1; }
test -f "$1" || { echo "Can't see file '$1'." ; exit 1; }
FILE="$1"
function getInfo() {
    #Returns website url like: "http://www.website.com/info"
    #Reads last line of a csv file, and gets the 2nd item.
    tail -n 1 "$1" | csvquote | cut -d ',' -f 2 | csvquote -u
}
CURLURL="http://127.0.0.1:9200/cj/_query"
URL=$(getInfo "$FILE")
echo "URL: $URL"
CURLDATA='{ "query" : { "match" : { "PROGRAMURL" : '$URL' } } }'
curl -XDELETE "$CURLURL" -d "$CURLDATA" -vn
echo "Sleeping Allowing Time To Delete"
sleep 5s
If it still fails, you really need to post your error messages.
One other thing: especially if you are calling this from another script, chmod +x the script so you can run it without having to invoke it with bash directly. If you want to turn on debugging, put set -x near the start somewhere.

escaping sh/bash function arguments

I want to submit multiple commands with arguments to shell functions, and thus quote my commands like this:
$ CMD=''\''CMD'\'' '\''ARG1 ARG2 ARG3'\'''
$ echo $CMD
'CMD' 'ARG1 ARG2 ARG3' 'ARG4'
Now when I try to use them in a function like this:
$ function execute { echo "$1"; echo "$2"; echo "$3"; }
I get the result:
$ execute $CMD
'CMD'
'ARG1
ARG2
How can I get to this result:
$ execute $CMD
CMD
ARG1 AGR2 ARG3
Thanks in advance!
PS: I use an unquoting function like:
    function unquote { echo "$1" | xargs echo; }
EDIT:
To make my intentions clearer: I want to gradually build up a command that needs arguments with spaces passed to subfunctions:
$ CMD='HOST '\''HOSTNAME'\'' '\''sh SCRIPTNAME'\'' '\''MOVE '\''\'\'''\''/path/to/DIR1'\''\'\'''\'' '\''\'\'''\''/path/to/DIR2'\''\'\'''\'''\'''
$ function execute { echo "$1 : $2 : $3 : $4"; }
$ execute $CMD
HOST : 'HOSTNAME' : 'sh : SCRIPTNAME'
The third argument breaks unexpectedly at a space; the quoting is ignored. Why?
Use an array and @ in double quotes:
function execute () {
    echo "$1"
    echo "$2"
    echo "$3"
}
CMD=('CMD' 'ARG1 ARG2 ARG3' 'ARG4')
execute "${CMD[@]}"
function execute {
    while [[ $# -gt 0 ]]; do
        cmd=$(cut -d' ' -f1 <<< "$1")
        arg=$(sed 's/[^ ]* //' <<< "$1")
        echo "$cmd receives $arg"
        shift
    done
}
CMD1="CMD1 ARG11 ARG12 ARG13"
CMD2="CMD2 ARG21 ARG22 ARG23"
execute "$CMD1" "$CMD2"
Gives:
CMD1 receives ARG11 ARG12 ARG13
CMD2 receives ARG21 ARG22 ARG23

Bash function argument returns error "command not found"

I have this function in a bash script to create a new jekyll post, but it returns the argument as "command not found". Here's the script:
function new_post () {
    if [ -z "$1" ]
    then
        read -p "Post Title:" TITLE
    else
        TITLE= "$1"
    fi
    FILE=$( echo $TITLE | tr A-Z a-z | tr ' ' _ )
    echo -e '---\nlayout: post\ntitle: '$TITLE'\npublished: false\n---\n' > $(date '+%Y-%m-%d-')"$FILE"'.md'
}
But whenever I try to run it it returns:
$>new_post "Hello World"
-bash: Hello World: command not found
It appears to be trying to run the argument as a command.
I even tried this and got the same result
$>TITLE= "Hello World" && echo -e ---layout: post\ntitle: "$TITLE"\n---
-bash: Hello World: command not found
Can anybody tell me what I'm doing wrong?
It is the space in TITLE= "$1" that causes the error: with the space, bash parses TITLE= as an empty environment assignment for the command that follows, and then tries to execute "$1" (that is, "Hello World") as a command. Try it with TITLE="$1" instead.
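For reference, a sketch of the function with that assignment fixed and the expansions quoted (same logic otherwise):
    function new_post () {
        if [ -z "$1" ]; then
            read -p "Post Title: " TITLE
        else
            TITLE="$1"    # no space after '='
        fi
        FILE=$(echo "$TITLE" | tr 'A-Z' 'a-z' | tr ' ' '_')
        echo -e "---\nlayout: post\ntitle: $TITLE\npublished: false\n---\n" \
            > "$(date '+%Y-%m-%d-')$FILE.md"
    }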
In my case:
    echo "Deploy of `$1` to `$2` project? (Y/N)"
the issue was also present. Backticks inside double quotes are command substitution, so the arguments were being executed as commands; when I removed the backticks, the error was gone. Not sure if you pasted a complete script, but beware of how you quote the arguments.
Similar answer: https://askubuntu.com/questions/180320/bash-script-program-with-parameters-as-a-single-variable-command-not-found
