What's wrong with my bash script? I'm trying to pass positional parameters within a function. My last test, Test 4, works, but it's basically the command I would run on the command line, with no variable substitution.
I would like to call my function. Can someone tell me whether the construction of my first 3 tests is valid and how I can correct them? Thanks!
To execute: ./myscript.sh dev01 tester
#!/bin/bash
set +x
if [[ $# != 2 ]]; then
echo "Usage: ./script.sh <ENV> <COMPONENT>"
exit 1
fi
# Setup VARS
CREDS="-x foobar -a ec2.local_ipv4"
ENVIRONMENT="$1"
ROLES="$2"
function deploy {
knife ssh "$CREDS" "chef_environment:"$ENVIRONMENT" AND roles:*"$ROLES"*" "uname"
}
echo "Test 1"
deploy
echo "Test 2"
DEPLOY=$(knife ssh "$CREDS" "chef_environment:"${ENVIRONMENT}" AND roles:*"${ROLES}"*" "uname")
$DEPLOY
echo "Test 3"
knife ssh "$CREDS" "chef_environment:"$ENVIRONMENT" AND roles:*"$ROLES"*" "uname"
echo "Test 4"
knife ssh -x foobar -a ec2.local_ipv4 "chef_environment:dev01 AND roles:*tester*" "uname"
Again, Test 4 works only.
Your problem is unrelated to using a function; it has to do with how you're storing arguments in a variable and using that variable later:
If you want to store multiple arguments in a (non-array) variable, you cannot reference that variable double-quoted, because the value is then passed as a single argument to the target utility.
An immediate fix would be to use $CREDS unquoted, but that makes the value subject to potentially unwanted shell expansions, so the robust way to pass multiple arguments is to use an array:
# Store args. individually as array elements
CREDS=( '-x' 'foobar' '-a' 'ec2.local_ipv4' )
# ...
# "${CREDS[#]}" passes the elements of the array safely as *individual*
# arguments.
knife ssh "${CREDS[#]}" "chef_environment:$ENVIRONMENT AND roles:*$ROLES*" "uname"
Also note how I've embedded the $ENVIRONMENT and $ROLES variable references directly in the double-quoted string, which also makes the command more robust.
Finally, it's better not to use all-uppercase shell-variable names in order to avoid conflicts with environment variables and special shell variables.
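To see the difference concretely, here is a minimal, knife-free sketch; `count_args` is just an illustrative helper that reports how many arguments actually reach a command:

```shell
#!/usr/bin/env bash
# count_args is a stand-in for any command (such as knife): it simply
# reports how many arguments it received.
count_args() { echo "$#"; }

creds_str='-x foobar -a ec2.local_ipv4'
count_args "$creds_str"        # the whole string arrives as ONE argument -> 1

creds_arr=( -x foobar -a ec2.local_ipv4 )
count_args "${creds_arr[@]}"   # four separate arguments -> 4
```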
Related
Currently I have a script that does some extra processing, but ultimately calls the command the user passed (FYI, this is to run some commands in a docker container, but we'll call it foo.sh):
#!/usr/bin/env bash
# ...
runner "$#"
This works great (e.g. foo.sh echo HI), until the user wants to pass multiple commands to be run:
e.g.: foo.sh echo HI && echo BYE
&& is of course interpreted by the calling shell before it ever reaches the script's arguments.
Is there a workaround or means of escaping && that might work?
An idiom that often comes in handy for this kind of case:
cmds_q='true'
add_command() {
local new_cmd
printf -v new_cmd '%q ' "$@"
cmds_q+=" && $new_cmd"
}
add_command echo HI
add_command echo BYE
runner bash -c "$cmds_q"
The big advantage here is that add_command can be called with arbitrary arguments (after the first one defining the command to run, of course) with no risk of those arguments being parsed as syntax / used in injection attacks, so long as the caller never directly modifies cmds_q.
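As a self-contained sketch (with a plain bash -c standing in for runner, which is assumed to come from the surrounding script), you can watch the chain get built and then run:

```shell
#!/usr/bin/env bash
# Demo of the add_command idiom; bash -c stands in for "runner".
cmds_q='true'
add_command() {
  local new_cmd
  printf -v new_cmd '%q ' "$@"   # %q shell-quotes each argument safely
  cmds_q+=" && $new_cmd"
}

add_command echo 'HI there'      # the embedded space survives intact
add_command echo BYE
echo "$cmds_q"                   # inspect the accumulated command string
bash -c "$cmds_q"                # runs both commands, printing HI there / BYE
```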
I have two shell scripts, master and function-script; the master calls the function-script and tries to pass values to it.
Please note that function-script is an interactive script; i.e. it waits for user's answers to perform according to the answer.
So to pass one value I can write the following:
echo "string" | ./function-script
The problem is that I have to pass several values. Any advice?
Can the "function-script" operate on positional parameters? If so, you'd call it like:
./function-script arg1 "argument 2" arg3
And then "function-script" would use "$1", "$2" and "$3" as required.
If "function-script" only takes input on stdin, do something like this:
printf "%s\n" arg1 "argument 2" arg3 | ./function-script
And "function-script" would do:
IFS= read -r arg1
IFS= read -r arg2
IFS= read -r arg3
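Putting the two halves together as one runnable sketch (read_three is a hypothetical stand-in for the interactive function-script):

```shell
#!/usr/bin/env bash
# Stand-in for function-script: read three answers from stdin, one per line.
read_three() {
  IFS= read -r arg1
  IFS= read -r arg2
  IFS= read -r arg3
  printf '%s|%s|%s\n' "$arg1" "$arg2" "$arg3"
}

# The caller feeds the answers, one value per line:
printf '%s\n' arg1 "argument 2" arg3 | read_three
```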
Simple solution:
Don't try to pass multiple variables.
Just export all the variables within the master script using export a=1 syntax.
Then call the child script from the master like a regular script.
All the variables will be available in the child script.
Use command line arguments.
./function-script "string" "another string"
If you pre-empt standard input by piping data into the function script, you make interactive operation of the function script hard.
You could instead export the variables as environment variables, but just as global variables in regular programming are not a good idea because their use is hidden, so too with environment variables.
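For completeness, a minimal sketch of the environment-variable route, with bash -c standing in for the child script:

```shell
#!/usr/bin/env bash
# Exported variables are inherited by child processes, so no arguments
# or stdin are needed; bash -c stands in for the child script here.
export ENVIRONMENT=dev01
export ROLES=tester
bash -c 'echo "$ENVIRONMENT/$ROLES"'   # prints: dev01/tester
```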
I have a script foo.sh
CMD='export FOO="BAR"'
$CMD
echo $FOO
It works as expected
>./foo.sh
"BAR"
Now I want to change FOO variable to BAR BAR. So I get script
CMD='export FOO="BAR BAR"'
$CMD
echo $FOO
When I run it I expect to get "BAR BAR", but I get
./foo.sh: line 2: export: `BAR"': not a valid identifier
"BAR
How I can deal with that?
You should not use a variable as a command by simply expanding it (like your $CMD). Instead, use eval to evaluate a command stored in a variable. Only then is a true evaluation step, with all of the shell's parsing logic, performed:
eval "$CMD"
(And use double quotes to pass the command to eval.)
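With eval, the stored string goes through a full round of shell parsing, so the embedded quotes are honored; a minimal sketch:

```shell
#!/usr/bin/env bash
CMD='export FOO="BAR BAR"'
eval "$CMD"      # the inner double quotes are parsed as quoting, not data
echo "$FOO"      # prints: BAR BAR
```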
Just don't do that.
And read Bash FAQ #50
I'm trying to save a command so I can run it later without having to repeat it each time
If you want to put a command in a container for later use, use a
function. Variables hold data, functions hold code.
pingMe() {
ping -q -c1 "$HOSTNAME"
}
[...]
if pingMe; then ..
The proper way to do that is to use an array instead:
CMD=(export FOO="BAR BAR")
"${CMD[#]}"
I would like to execute a command which is given by a variable (Variable cmd in this example):
cmd="echo 'First argument'"
$cmd
Expected result would be:
First argument
BUT ... actual result is:
'First argument'
What? I don't understand why I can see single quotes in the output. After all, if the command (=content of variable $cmd) would be issued directly, then no quotes leak into the output, it behaves as desired:
$ echo 'First argument'
First argument
To illustrate what I am trying to achieve in real life: in my deploy script there is a code block like this (strongly simplified, but you get the point):
#!/bin/bash
function execute {
cmd=$1
echo "Executing $cmd ..."
# execute the command:
$cmd
}
VERSION=1.0.2
execute "git tag -a 'release-$VERSION'"
Now, Git would create a tag which contains single quotes:
git tag
'1.0.2'
which is not what I want ...
What to do?
(Bash version: GNU bash 3.1.0)
(I found a very similar issue, here, but the answer would not apply to my problem)
cmd="echo 'First argument'"
$cmd
What happens there is word splitting and the actual resulting command is:
echo "'First" "argument'"
A second round of parsing, in which the embedded single quotes would be honored, never happens.
Also, it's better to use arrays:
#!/bin/bash
function execute {
cmd=("$#") ## $# is already similar to an array and storing it to another is just optional.
echo "Executing ${cmd[*]} ..."
# execute the command:
"${cmd[#]}"
}
VERSION=1.0.2
execute git tag -a "release-$VERSION"
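The same pattern with a harmless command substituted for git (so the sketch is runnable anywhere), showing the argument with the embedded space arriving intact:

```shell
#!/usr/bin/env bash
execute() {
  cmd=("$@")
  echo "Executing ${cmd[*]} ..."
  "${cmd[@]}"
}

# The space in 'First argument' reaches printf as part of a single argument.
execute printf '%s\n' 'First argument'
```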
eval is a risky choice in that kind of situation: you may not only get unexpected parsing results, you may also unexpectedly run dangerous commands.
I think this is what you want:
cmd="echo 'First arg'"
eval "$cmd"
First arg
This is what I am trying to do...
#!/bin/bash
array_local=(1 2 3 4 5)
ssh user@server << EOF
index_remote=1
echo \$index_remote
echo \${array_local[\$index_remote]}
EOF
When I try to run the above script I get the output as 1 and a null value (blank space). I wanted the value of ${array_local[$index_remote]} to be 2 instead of null; I need to access this local array using the remote variable for my further work in the script.
<<EOF results in variable expansion happening on the local machine, but you only defined the variable i on the remote machine. You need to think carefully about where you want the expansion to happen. You haven't explained in your question whether the value of i is defined client-side or server-side, but I'm guessing from your subsequent comments that you want it done server-side. In that case you'll need to pass the array contents over ssh, which requires careful quoting:
ssh hostname@server <<EOF
i=1
eval `typeset -p array_local`
echo \${array_local[\$i]}
EOF
typeset -p array_local will output the string
declare -a array_local='([0]="1" [1]="2" [2]="3" [3]="4" [4]="5")'
Since this is inside backticks, it will get expanded client-side within the EOF-delimited heredoc, and then evaluated server-side by the eval. In other words it's equivalent to:
ssh hostname@server <<'EOF'
i=1
declare -a array_local='([0]="1" [1]="2" [2]="3" [3]="4" [4]="5")'
echo ${array_local[$i]}
EOF
Notice the difference in EOF quoting between the two examples. The first one allows parameter and shell expansion, and the second doesn't. That's why the echo line in the first one needs quoting, to ensure that parameter expansion happens server-side not client-side.
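The serialize-and-eval round trip can be tried locally, with no ssh needed; a sketch:

```shell
#!/usr/bin/env bash
array_local=(1 2 3 4 5)
serialized=$(typeset -p array_local)   # e.g. declare -a array_local=(...)

# Simulate the remote side: forget the array, then rebuild it from the string.
unset array_local
eval "$serialized"

i=1
echo "${array_local[$i]}"              # prints: 2
```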