How to pass spaces in arguments to a command in a variable - bash

I want to put a bash command in a variable and then execute that variable. The command has arguments, and some of those arguments may contain spaces. How can I do that?
What I have tried so far is something like this:
nbargs(){ echo $#; } # define some function
command='nbargs 1 "2 3"' # I want nbargs to receive 2 args, not 3
If I invoke the command directly, it works as expected, but running it indirectly through the variable does not.
nbargs 1 "2 3" # output: 2
echo $command # output: nbargs 1 "2 3"
$command # output: 3 ???
How can I solve my problem and can you explain why executing the variable does not take into account the quotes?

If you want to store a full command line to call your function, you should use a shell array:
cmd=(nbargs 1 "2 3")
and call it as:
"${cmd[#]}"
This will correctly output:
2
Don't use the variable name command, as command is also a shell utility.
Storing a full command line in a string variable is error-prone, especially when whitespace and quotes are involved. When you expand $command unquoted, the shell performs word splitting (and filename expansion) on the result, but it does not re-parse quotes: the embedded double quotes stay literal, so nbargs ends up receiving the three arguments 1, "2 and 3" (hence the output of 3).
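For illustration, here is a minimal end-to-end sketch of the array approach, using the same nbargs function as in the question:
#!/bin/bash
# define the test function
nbargs(){ echo $#; }
# store the command name and its arguments as separate array elements
cmd=(nbargs 1 "2 3")
# expanding the array as "${cmd[@]}" keeps "2 3" as one argument
"${cmd[@]}"    # prints: 2
# for comparison, the string version gets word-split on spaces:
command='nbargs 1 "2 3"'
printf '<%s> ' $command; echo    # prints: <nbargs> <1> <"2> <3">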

Related

A '*' argument results in "value too great for base" error

I was writing a really simple arithmetic script but got this odd error when I tried to pass a * operator.
Syntax: [command] [num1] [num2] [arithmetic operator]
Here is the code:
result=$(($1$3$2))
echo "Calculation result is: $result "
Passing a '*' operator like this:
bash my_script.sh 1 2 *
... returns the following error:
line 7: 3Access: value too great for base (error token is "3Access")
I botched together a fix for it by replacing * with \\\* via a test statement, though I would like to understand why this error occurs.
The only threads I found refer to this error in connection with bash treating numbers with a leading zero as octal, but it's unclear to me why * is being treated as a numeric value at all.
The * has a special meaning in the shell language: it triggers filename expansion and expands to all non-hidden files and directories in the current directory, which are then passed as individual arguments to the script.
Just try this script to see:
#!/bin/bash
# star.sh
echo "$1"
echo "$2"
echo "$3"
# and so on ...
Now run it:
bash star.sh *
It will print the first 3 files in your current directory.
To prevent filename expansion from happening, you need to quote the *, like this:
bash your_script.sh 1 2 '*'
or escape it
bash your_script.sh 1 2 \*
See Bash Manual: Filename Expansion.
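Once the operator reaches the script unexpanded, the arithmetic works as intended. A quick run, assuming the script from the question is saved as my_script.sh:
$ bash my_script.sh 1 2 '*'
Calculation result is: 2
$ bash my_script.sh 1 2 \*
Calculation result is: 2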

shell programming - how to properly pass variables within a function

What's wrong with my bash script? I'm trying to pass positional parameters within a function. My last test, Test 4, works, but it's basically the command I would run on the command line, with no variable substitution.
I would like to call my function. Can someone tell me whether the construction of my first three tests is valid and how I can correct them? Thanks!
To execute: ./myscript.sh dev01 tester
#!/bin/bash
set +x
if [[ $# != 2 ]]; then
echo "Usage: ./script.sh <ENV> <COMPONENT>"
exit 1
fi
# Setup VARS
CREDS="-x foobar -a ec2.local_ipv4"
ENVIRONMENT="$1"
ROLES="$2"
function deploy {
knife ssh "$CREDS" "chef_environment:"$ENVIRONMENT" AND roles:*"$ROLES"*" "uname"
}
echo "Test 1"
deploy
echo "Test 2"
DEPLOY=$(knife ssh "$CREDS" "chef_environment:"${ENVIRONMENT}" AND roles:*"${ROLES}"*" "uname")
$DEPLOY
echo "Test 3"
knife ssh "$CREDS" "chef_environment:"$ENVIRONMENT" AND roles:*"$ROLES"*" "uname"
echo "Test 4"
knife ssh -x foobar -a ec2.local_ipv4 "chef_environment:dev01 AND roles:*tester*" "uname"
Again, only Test 4 works.
Your problem is unrelated to using a function; it has to do with how you're storing arguments in a variable and using that variable later:
If you want to store multiple arguments in a (non-array) variable, you cannot reference that variable double-quoted, because the value is then passed as a single argument to the target utility.
An immediate fix would be to use $CREDS unquoted, but that makes the value subject to potentially unwanted shell expansions, so the robust way to pass multiple arguments is to use an array:
# Store args. individually as array elements
CREDS=( '-x' 'foobar' '-a' 'ec2.local_ipv4' )
# ...
# "${CREDS[#]}" passes the elements of the array safely as *individual*
# arguments.
knife ssh "${CREDS[#]}" "chef_environment:$ENVIRONMENT AND roles:*$ROLES*" "uname"
Also note how I've embedded the $ENVIRONMENT and $ROLES variable references directly in the double-quoted string, which also makes the command more robust.
Finally, it's better not to use all-uppercase shell-variable names in order to avoid conflicts with environment variables and special shell variables.
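Putting these pieces together, a minimal sketch of the corrected script could look like this (the knife ssh command line itself is taken from the question; only the variable handling and names change):
#!/bin/bash

if [[ $# -ne 2 ]]; then
    echo "Usage: ./script.sh <ENV> <COMPONENT>"
    exit 1
fi

# store the credential options as individual array elements
creds=( -x foobar -a ec2.local_ipv4 )
environment="$1"
roles="$2"

deploy() {
    # "${creds[@]}" expands to the individual options; the search query is
    # a single double-quoted argument with the variables embedded in it
    knife ssh "${creds[@]}" "chef_environment:$environment AND roles:*$roles*" "uname"
}

deploy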

How to know how many parameters a bash script accepts when I don't have permission to view/edit the script

I have a bash script to which I have to provide parameters, but I don't know how many parameters should be supplied when executing the script.
I don't have permission to view/edit the script to find out how many parameters it accepts.
Please let me know if there is any way to know how many parameters a script can accept without viewing/editing the script.
Perhaps you can try supplying a varying number of arguments and seeing which count works. To generate the command lines with a varying number of arguments, you could hack up a script like this:
#!/bin/bash
# will print commands with up to 5 command-line arguments
for i in $(seq 5)
do
    # replace newlines with spaces to build the argument list
    j=$(seq "$i" | tr '\n' ' ')
    echo foo $j
done
It will result in something like:
foo 1
foo 1 2
foo 1 2 3
foo 1 2 3 4
foo 1 2 3 4 5
where 'foo' is the name of the program you need to run.
Note that this method may only work if the script does not require any command-line switches.
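If the target script exits with a non-zero status when it is called with the wrong number of arguments (which is not guaranteed), a hypothetical brute-force probe along the same lines could look like this (target.sh is a placeholder for the script you need to run):
#!/bin/bash
# try calling the target script with 1..5 dummy arguments and
# report which argument counts it appears to accept
for i in $(seq 5)
do
    if ./target.sh $(seq "$i") > /dev/null 2>&1; then
        echo "exit status 0 with $i argument(s)"
    fi
done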

retaining quotes in bash correctly

I am trying to pass arguments from a bash script to an executable, and one of them contains spaces. I have been searching for how to solve this, but I cannot find the right way to do it. Here is a minimal example with a script called first and a script called second.
first script:
#!/bin/bash
# first script
ARGS="$#"
./second $ARGS
second script:
#!/bin/bash
# second script
echo "got $# arguments"
Now if I run it like this, I get the following results:
$ ./first abc def
got 2 arguments
$ ./first "abc def"
got 2 arguments
$ ./first 'abc def'
got 2 arguments
How can I make it so, that the second script also only receives one argument?
You can't do it using an intermediate (string) variable: if you quote it, it will always pass a single argument; if you don't, you will lose the original quoting.
However, you can pass the arguments on directly if you skip the variable, like this:
./second "$@"
$ ./first abc def
got 2 arguments
$ ./first "abc def"
got 1 arguments
Alternatively, you can use an array to store the arguments, like this:
#!/bin/bash
# first script
ARGS=("$#")
./second "${ARGS[#]}"
IFS is your friend.
#!/bin/bash
# first script
ARGS="$#"
IFS=$(echo -en "\n\b")
./second $ARGS
IFS stands for Internal Field Separator. Setting it to newline and backspace means the unquoted expansion of $ARGS is no longer split at the spaces inside "abc def".

how to pass a file as an argument to the script

I have a shell script written in bash, and this script should take a file as an argument. Can anyone tell me how to write the script for this? Any ideas on this are appreciated.
Thanks,
You can access the command line arguments passed to your script using positional parameters.
Also, to check whether the right number of arguments has been passed to the script, you can make use of the variable $#, which holds the number of arguments passed.
if [ $# -eq 1 ]; then
    # exactly 1 argument was passed; use it, it's available in $1
    echo "Argument $1"
else
    # either 0 or >1 arguments were passed; error out
    echo "Incorrect number of arguments passed"
    exit 1
fi
Sample run:
$ bash a.sh
Incorrect number of arguments passed
$ bash a.sh foo
Argument foo
$ bash a.sh foo bar
Incorrect number of arguments passed
$
If you need to operate on the file, you can take the name of the file as an argument and just use the file with the specified name.
If you just need to read the contents of the file, you can use redirection to have the script read them on standard input. You can do this using ./script < inputfile
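For example, here is a minimal sketch of a script that takes a filename argument and prints the file line by line (process_file.sh is a hypothetical name):
#!/bin/bash
# process_file.sh - hypothetical example script
if [ $# -ne 1 ]; then
    echo "Usage: $0 <file>"
    exit 1
fi
if [ ! -r "$1" ]; then
    echo "Error: cannot read '$1'"
    exit 1
fi
# read the file given as $1 line by line
while IFS= read -r line; do
    echo "line: $line"
done < "$1"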
