Bash Script to read all inputs on command line - bash

Say I let a user input as many arguments as he wants, how do I read all the inputs?
So if a user typed in asdf asfd asdf, it should say 3 arguments
Right now I have
#!/bin/bash
read $#
echo "There are $# arguments"
but whenever I type in anything it always equals 0

If you want to read all the parameters from a variable in the same style as if they had been passed on the command line, you can do it in a function:
#!/bin/bash
function param_count() {
for arg in "$@"
do
echo " arg: $arg"
done
echo "arg count= $#"
}
read foo
param_count $foo

If you really want to read a line as a sequence of words, your best bet in bash is to read it into an array:
$ read -a words
And now a few words from our sponsor
$ echo ${#words[@]}
8
$ echo "${words[3]}"
few
But that's not the way to pass arguments to a shell script. read just splits lines into words using whitespace (or whatever IFS is set to). It does not:
Handle quotes
Allow the insertion of shell variables
Expand pathname patterns
Passing arguments on the command line lets you do all of the above, making the utility more convenient to use, and does not require any extra work to extract the arguments, making the utility easier to write.
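For comparison, a minimal sketch of the command-line approach (the script name count_args.sh is made up for this example):
#!/bin/bash
# count_args.sh: the shell has already handled quoting and word splitting,
# so "$@" holds each argument intact
echo "There are $# arguments"
for arg in "$@"
do
echo " - $arg"
done
Called as ./count_args.sh "one arg" two three it reports 3 arguments, whereas feeding the same text to read would split it into four words.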

This is the way to get it:
read ARGUMENTS
set -- $ARGUMENTS
echo "There are $# arguments"
Explanation:
read ARGUMENTS
Read the input and save it into ARGUMENTS
set -- $ARGUMENTS
Set the positional parameters to the given input
At this point, you can use that input as if it was given in the command-line.
e.g.:
echo "$1"
echo "$2"
...
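For example, saving the snippet above as a script (call it myscript.sh, a name made up for this example), a session might look like:
$ ./myscript.sh
hello world foo
There are 3 arguments
Keep in mind that $ARGUMENTS is expanded unquoted in the set command on purpose, so the input is split on IFS; glob characters typed by the user would also be expanded.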

Delete the read $# line.
Then launch your script with arguments. Just like this:
/path/to/script arg1 arg2 arg3 4 5 6
The output should be:
There are 6 arguments

Here is an example script that shows you how to get the argument count, access the individual arguments and read in a new variable:
#!/usr/bin/env bash
echo "There are $# arguments, they are:"
for arg in "$@"
do
echo " - $arg"
done
# Access the arguments
echo "The first argument is $1"
echo "The second argument is $2"
echo "The third argument is $3"
echo "All arguments: $#"
# Read a variable
read var
echo "Some var: $var"

Related

Why did the echo fail?

I want to create many new bash scripts with this bash script. Here is my code.
#!/bin/bash
# create bash shell automatically
for file in "$*"
do
if [ ! -e "$file" ]
then
touch $file
chmod u+x $file
echo "#!/bin/bash" >> $file
echo "# " >> $file
echo "create success"
fi
done
if [ $# \> 1 ]
then
echo "$# shell files are created!"
else
echo "$# shell is created!"
fi
When I run this script like this:
./create_shell test1 test2 test3
the terminal said:
"line9:ambiguous redirect"
"line10:ambiguous redirect"
What does that mean?
The problem comes from the use of $*, whether you quote it or not. The correct way to iterate through the positional parameters is "$@"; note the double quotes.
for file in "$@"; do
: ...
done
or even POSIXly:
for file do
: ...
done
You got the error message from bash because, in the redirection, the unquoted $file did not expand to one word; it expanded to three separate words: test1, test2, test3. Try:
a='1 2 3'
echo 1 >> $a
to see what will happen.
Not to mention that leaving $file unquoted spoils the sole reason for using "$@": for this construct to work, you must also put double quotes around the variables derived from it, unless you want the split+glob operators to be invoked. Leaving variables unquoted can lead to many security implications.
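Putting those points together, a corrected sketch of the original script (iterating over "$@" and quoting $file everywhere) might look like this:
#!/bin/bash
# create bash shell scripts automatically (corrected sketch)
for file in "$@"
do
if [ ! -e "$file" ]
then
touch "$file"
chmod u+x "$file"
echo "#!/bin/bash" >> "$file"
echo "# " >> "$file"
echo "create success"
fi
done
if [ "$#" -gt 1 ]
then
echo "$# shell files are created!"
else
echo "$# shell is created!"
fi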
'$*' is a single string, not the parameter array. You can do it like this:
fileArr=($*)
for file in "${fileArr[@]}"
do
# ... do your things with "$file"
done
The quotes around "$*" prevent bash from splitting the passed parameters into individual tokens. The loop will be executed once with all of the parameters joined into a single string (i.e. if you ran test.sh one two three, the redirection would be to "one two three"). Just remove the quotes.
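If you want to see the difference between the two expansions for yourself, a tiny sketch is enough (hypothetical demo.sh, run as ./demo.sh one two three):
#!/bin/bash
printf 'with "$*": <%s>\n' "$*"
printf 'with "$@": <%s>\n' "$@"
The first printf prints one line, <one two three>; the second reuses the format once per argument and prints three lines.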

How to call an application in Bash using an argument with spaces

I have a bash file which is passed arguments which contain spaces. The bash file looks like:
#!/bin/bash
another_app "$1"
However, instead of processing the argument as a single argument, as I believe it should, it processes it as a number of arguments depending on how many spaces. For example, if I call my bash file such:
my_app "A Number Of Words"
Then the "another_app" application gets passed 4 different arguments, instead of one. How can I just pass the single argument through to the second application?
The others are correct: it will depend somewhat on how the second app handles the args. You also have a little control over how the args are passed. You can do this with some quoting, or by using the "$@" variable as mentioned by @steve.
For example app1.sh
#!/bin/bash
echo "Argument with no quotes"
./another_app.sh $1
echo "Argument with quotes"
./another_app.sh "$1"
echo "Argument with \$#"
./another_app.sh "$#"
and another_app.sh
#!/bin/bash
echo "Inside $0"
echo "Number of args passed to me: $#"
for X in "${@}"
do
echo $X
done
echo "Exiting $0"
Call the second application using "$@":
#!/bin/bash
another_app "$@"

Capturing verbatim command line (including quotes!) to call inside script

I'm trying to write a "phone home" script, which will log the exact command line (including any single or double quotes used) into a MySQL database. As a backend, I have a cgi script which wraps the database. The scripts themselves call curl on the cgi script and include as parameters various arguments, including the verbatim command line.
Obviously I have quite a variety of quote escaping to do here and I'm already stuck at the bash stage. At the moment, I can't even get bash to print verbatim the arguments provided:
Desired output:
$ ./caller.sh -f -hello -q "blah"
-f hello -q "blah"
Using echo:
caller.sh:
echo "$#"
gives:
$ ./caller.sh -f -hello -q "blah"
-f -hello -q blah
(I also tried echo $@ and echo $*)
Using printf %q:
caller.sh:
printf %q $@
printf "\n"
gives:
$ ./caller.sh -f hello -q "blah"
-fhello-qblah
(I also tried printf %q "$@")
I would welcome not only help to fix my bash problem, but any more general advice on implementing this "phone home" in a tidier way!
There is no possible way you can write caller.sh to distinguish between these two commands invoked on the shell:
./caller.sh -f -hello -q "blah"
./caller.sh -f -hello -q blah
They are exactly equivalent.
If you want to make sure the command receives special characters, surround the argument with single quotes:
./caller.sh -f -hello -q '"blah"'
Or if you want to pass just one argument to caller.sh:
./caller.sh '-f -hello -q "blah"'
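Inside caller.sh the surviving quote characters are then simply part of the argument text. For the first form above, a quick check:
echo "$4"
prints "blah" with the double quotes included; for the second form, "$1" holds the whole string -f -hello -q "blah".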
You can get this info from the shell history:
function myhack {
line=$(history 1)
line=${line#* }
echo "You wrote: $line"
}
alias myhack='myhack #'
Which works as you describe:
$ myhack --args="stuff" * {1..10} $PATH
You wrote: myhack --args="stuff" * {1..10} $PATH
However, quoting is just the user's way of telling the shell how to construct the program's argument array. Asking to log how the user quotes their arguments is like asking to log how hard the user punched the keys and what they were wearing at the time.
To log a shell command line which unambiguously captures all of the arguments provided, you don't need any interactive shell hacks:
#!/bin/bash
line=$(printf "%q " "$@")
echo "What you wrote would have been indistinguishable from: $line"
I understand you want to capture the arguments given by the caller.
Firstly, the quotes used by the caller protect parts of the command line while the call is being interpreted, but they do not survive as part of the arguments.
An example: if someone calls your script with the single argument "Hello  World!" containing two spaces between Hello and World, then you must ALWAYS quote $1 inside your script so as not to lose that information.
If you want to log all arguments correctly escaped (in case they contain, for example, consecutive spaces...), you HAVE to use "$@" with double quotes. "$@" is equivalent to "$1" "$2" "$3" "$4" etc.
So, to log arguments, I suggest the following at the start of the caller:
i=0
for arg in "$@"; do
let ++i
echo "arg$i=$arg"
done
## Example of calls to the previous script
#caller.sh '1' "2" 3 "4 4" "5 5"
#arg1=1
#arg2=2
#arg3=3
#arg4=4 4
#arg5=5 5
@Flimm is correct: there is no way to distinguish between the arguments "foo" and foo, simply because the quotes are removed by the shell before the program receives them. What you need is "$@" (with the quotes).

Storing ksh input array to variable and passing to another script

I have to modify an existing ksh script which works through the command-line arguments using 'shift', and so empties $@, but I now want to pass the original arguments to a second script afterwards.
In the mainline case I can do this by copying $@ to a variable and passing that to the second script, but I can't get it to work for quoted command-line arguments.
If I have a script called 'printer' like below:
#!/bin/ksh
INPUT=$@
echo "Printing args"
until [[ $# -eq 0 ]];do
echo $1
shift
done
./printer2 $INPUT
and printer2 like below:
#!/bin/ksh
echo "Printing second args"
until [[ $# -eq 0 ]];do
echo $1
shift
done
I would like the output of
./printer first second "third fourth"
to be :
Printing args
first
second
third fourth
Printing second args
first
second
third fourth
I've tried various combinations of quotes around variables (both in the assignment of $INPUT and when passing it to printer2) but can't figure it out. Can anyone help?
Ok I think I've found the solution after an awful lot of trial and error.
Assigning $INPUT like this:
set -A INPUT "$@"
and then passing it like this:
./printer2 "${INPUT[#]}"
produces the output I'm after.
The whole first script is therefore:
#!/bin/ksh
set -A INPUT "$@"
echo "Printing args"
until [[ $# -eq 0 ]];do
echo $1
shift
done
./printer2 "${INPUT[#]}"
and
./printer first second "third fourth"
outputs:
Printing args
first
second
third fourth
Printing second args
first
second
third fourth
If anyone wants to explain the problem with the other things I tried, please do, as I'm still interested!
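For what it's worth, the scalar copy fails because the grouping is already gone once the arguments are joined into one string; roughly:
set -- first second "third fourth"
INPUT=$@            # joined into one string: first second third fourth
./printer2 $INPUT   # unquoted: re-split on whitespace into 4 words
./printer2 "$INPUT" # quoted: passed as one single word
An array keeps each parameter as a separate element, which is why set -A plus "${INPUT[@]}" works.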

Using getopts within user-defined-function in bourne shell

Is it possible to pass command-line arguments into a function from within a Bourne shell script, in order to allow getopts to process them?
The rest of my script is nicely packed into functions, but it's starting to look like I'll have to move the argument processing into the main logic.
The following is how it's written now, but it doesn't work:
processArgs()
{
while getopts j:f: arg
do
echo "${arg} -- ${OPTARG}"
case "${arg}" in
j) if [ -z "${filename}" ]; then
job_number=$OPTARG
else
echo "Filename ${filename} already set."
echo "Job number ${OPTARG} will be ignored.
fi;;
f) if [ -z "${job_number}" ]; then
filename=$OPTARG
else
echo "Job number ${job_number} already set."
echo "Filename ${OPTARG} will be ignored."
fi;;
esac
done
}
doStuff1
processArgs
doStuff2
Is it possible to define the function in a way that it can read the script's args? Can this be done some other way? I like the functionality of getopts, but it looks like in this case I'm going to have to sacrifice the beauty of the code to get it.
You can provide args to getopts after the variable name; the default is "$@", but that is also what shell functions use to represent their own arguments. The solution is to pass "$@", representing all the script's command-line arguments as individual strings, to processArgs:
processArgs "$@"
Adding that to your script (and fixing the quoting in line 11), and trying out some gibberish test args:
$ ./try -j asdf -f fooo -fasdfasdf -j424pyagnasd
j -- asdf
f -- fooo
Job number asdf already set.
Filename fooo will be ignored.
f -- asdfasdf
Job number asdf already set.
Filename asdfasdf will be ignored.
j -- 424pyagnasd
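For reference, a minimal self-contained sketch of the pattern (the script name try.sh and the option handling are just placeholders):
#!/bin/sh
processArgs()
{
while getopts j:f: arg
do
case "${arg}" in
j) echo "job number: ${OPTARG}";;
f) echo "filename: ${OPTARG}";;
esac
done
}
processArgs "$@"
Running ./try.sh -j 42 -f out.txt prints the job number and the filename, because getopts inside the function sees the function's own positional parameters, which are the script's arguments passed through "$@".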

Resources