Storing ksh input array to variable and passing to another script - ksh

I have to modify an existing ksh script which loops over the command-line arguments using 'shift', and so empties them ($@ ends up empty), but I now want to pass the original arguments to a second script afterwards.
In the mainline case I can do this by copying $@ to a variable and passing that to the second script, but I can't get it to work for quoted command-line arguments.
If I have a script called 'printer' like below:
#!/bin/ksh
INPUT=$@
echo "Printing args"
until [[ $# -eq 0 ]];do
echo $1
shift
done
./printer2 $INPUT
and printer2 like below:
#!/bin/ksh
echo "Printing second args"
until [[ $# -eq 0 ]];do
echo $1
shift
done
I would like the output of
./printer first second "third forth"
to be :
Printing args
first
second
third forth
Printing second args
first
second
third forth
I've tried various combinations of quotes around variables (both in the assignment of $INPUT and when passing it to printer2) but can't figure it out. Can anyone help?

Ok I think I've found the solution after an awful lot of trial and error.
Assigning $INPUT like this:
set -A INPUT "$@"
and then passing it like this:
./printer2 "${INPUT[@]}"
produces the output I'm after.
The whole first script is therefore:
#!/bin/ksh
set -A INPUT "$@"
echo "Printing args"
until [[ $# -eq 0 ]];do
echo $1
shift
done
./printer2 "${INPUT[@]}"
and
./printer first second "third fourth"
outputs:
Printing args
first
second
third fourth
Printing second args
first
second
third fourth
If anyone wants to explain the problem with the other things I tried, please do, as I'm still interested!
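Since the answer asks: the likely culprit with the other attempts is that a plain scalar assignment such as INPUT=$@ (or INPUT="$*") joins all the arguments into one space-separated string, so the quoting boundaries are lost before the second script ever sees them; only an array keeps each argument as a separate word. A minimal sketch of the difference, using a helper function in place of the second script (bash/ksh93 array syntax shown; ksh88 spells the assignment `set -A arr "$@"`):

```shell
# Hypothetical demo: report how many arguments actually arrive.
count_args() { echo $#; }

demo() {
  scalar="$*"        # all args joined into a single string
  arr=("$@")         # each arg kept as its own array element

  count_args $scalar        # word-splits the joined string
  count_args "${arr[@]}"    # original argument boundaries preserved
}

demo first second "third fourth"
# prints 4 (scalar was re-split on spaces), then 3 (array kept boundaries)
```

The quoted expansion `"${arr[@]}"` is the key: it re-expands to one word per element, exactly like `"$@"` does for the positional parameters.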

Related

Reduce the output with bash

I am trying to run a basic bash program with two outputs; the program works and the tests pass. The thing is, I want to reduce the output to a single "echo": when there's an input, print "One for <input>", and when it's empty, "One for you". The following works, but I want the output produced by one line.
if (($# == 0))
then
echo "One for you, one for me."
else
echo "One for $1, one for me."
fi
echo "One for ${1:-you}, one for me."
When $1 is null or unset, then place you there.
For reference see bash manual shell parameter expansion.
Alternatively, if you have zero arguments, you could set "default" ones.
if (($# == 0)); then
set -- you
fi
echo "One for $1, one for me."
which is not exactly equivalent, because it will handle an empty argument ($1="") differently.
If you want shorter code, this may help:
other="you"; [[ $# -ne 0 ]] && other="$1"
echo "One for ${other}, one for me."
The first line defaults the other variable to be you and then overwrites it if you have provided a name. The second line just prints it out for you.
That won't help if the user supplies an empty name (as with script.sh "") so you may instead want to just check the variable itself rather than the count:
other="you"; [[ -n "$1" ]] && other="$1"
echo "One for ${other}, one for me."

Command line argument in shell script

Hi, I have written a small shell script and I am not able to understand its behavior. Can anyone help me understand this script?
Script:
#!/bin/bash
if [ -z $1 ]
then
echo "fail"
else
echo "success"
fi
When executing the script:
./test.sh one
it executes the else statement instead of the then statement, even though an argument is being passed.
Can anyone explain this behavior to me?
The -z test in bash is checking if a string is an empty (zero length) value.
Since you're passing an argument to the script $1 is not empty and therefore -z $1 evaluates to false, executing the else portion of your script.
Side note: Since you're working with strings I recommend you to quote variables as follows:
if [ -z "$1" ]; then
echo "String is empty / No argument given"
else
echo "String is not empty / Argument given"
fi
Edit:
As pointed out by user1934428 it's probably better to use [[ instead of [. This, among others, eliminates the need for quoting. See more differences here.
if [[ -z $1 ]]; then
...
However, be aware that this is a bash extension and won't work in sh scripts.
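To see concretely why the quoting matters with the single-bracket form: when $1 is empty, an unquoted `[ -z $1 ]` collapses to `[ -z ]`, which happens to test true only by accident, and when $1 contains spaces it breaks outright with a "too many arguments" error. A small sketch (function names are illustrative):

```shell
check_unquoted() { [ -z $1 ]   && echo empty || echo nonempty; }
check_quoted()   { [ -z "$1" ] && echo empty || echo nonempty; }

check_unquoted ""      # "works" by accident: collapses to [ -z ]
check_quoted ""        # -> empty
check_unquoted "a b"   # error on stderr: [: too many arguments
check_quoted "a b"     # -> nonempty
```

The `[[ -z $1 ]]` form avoids both pitfalls because `[[` is shell syntax, not a command parsing its arguments after word splitting.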

How to call an application in Bash using an argument with spaces

I have a bash file which is passed arguments which contain spaces. The bash file looks like:
#!/bin/bash
another_app "$1"
However, instead of processing the argument as a single argument, as I believe it should, it processes it as a number of arguments depending on how many spaces. For example, if I call my bash file such:
my_app "A Number Of Words"
Then the "another_app" application gets passed 4 different arguments, instead of one. How can I just pass the single argument through to the second application?
The others are correct: it will depend somewhat on how the 2nd app handles the args. You can also have a little control over how the args are passed, with some quoting or by using the "$@" variable as mentioned by @steve.
For example app1.sh
#!/bin/bash
echo "Argument with no quotes"
./another_app.sh $1
echo "Argument with quotes"
./another_app.sh "$1"
echo "Argument with \$@"
./another_app.sh "$@"
and another_app.sh
#!/bin/bash
echo "Inside $0"
echo "Number of args passed to me: $#"
for X in "$@"
do
echo $X
done
echo "Exiting $0"
Call the second application using "$@":
#!/bin/bash
another_app "$@"
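The distinction at work here is between `"$@"` (each positional parameter expands to its own word) and `"$*"` or an unquoted `$1` (arguments joined or re-split on whitespace). A small sketch, using a hypothetical helper in place of another_app:

```shell
show_args() {
  echo "got $# arg(s)"
  for a in "$@"; do echo "[$a]"; done
}

forward() {
  show_args "$@"   # preserves boundaries: each arg stays one word
  show_args "$*"   # joins everything into a single word
  show_args $1     # unquoted: first arg re-split on spaces
}

forward "A Number" "Of Words"
```

Only the first call reproduces the caller's arguments faithfully; the other two either merge or multiply them.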

Bash Script to read all inputs on command line

Say I let a user input as many arguments as he wants, how do read all the inputs?
So if a user typed in asdf asfd asdf, it should say 3 arguments
Right now I have
#!/bin/bash
read $#
echo "There are $# arguments"
but whenever I type in anything it always equals 0
If you want to read all the parameters from a variable using the same style as if you had passed them on the command line, you could do it in a function:
#!/bin/bash
function param_count() {
for arg in "$@"
do
echo " arg: $arg"
done
echo "arg count= $#"
}
read foo
param_count $foo
If you really want to read a line as a sequence of words, your best bet in bash is to read it into an array:
$ read -a words
And now a few words from our sponsor
$ echo ${#words[@]}
8
$ echo "${words[3]}"
few
But that's not the way to pass arguments to a shell script. read just splits lines into words using whitespace (or whatever IFS is set to.) It does not:
Handle quotes
Allow the insertion of shell variables
Expand pathname patterns
Passing arguments on the command line lets you do all of the above, making the utility more convenient to use, and does not require any extra work to extract the arguments, making the utility easier to write.
This is the way to get it:
read ARGUMENTS
set -- $ARGUMENTS
echo "There are $# arguments"
Explanation:
read ARGUMENTS
Read the input and save it into ARGUMENTS
set -- $ARGUMENTS
Set the positional parameters to the given input
At this point, you can use that input as if it was given in the command-line.
e.g.:
echo "$1"
echo "$2"
...
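Putting that answer together, a minimal sketch that feeds a line into `set --` and reports the count (note that `set -- $line` splits purely on whitespace; quotes inside the input line are not parsed, per the caveats above):

```shell
count_line() {
  read -r line
  set -- $line          # word-split on IFS; quoting in the input is NOT honored
  echo "There are $# arguments"
}

echo 'asdf asfd asdf' | count_line
# -> There are 3 arguments
```

An input such as `a "b c"` would still count as 3 words, which is exactly why real command-line arguments are preferable when quoting matters.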
Delete read $# line.
Then launch your script with arguments. Just like this:
/path/to/script arg1 arg2 arg3 4 5 6
The output should be:
There are 6 arguments
Here is an example script that shows you how to get the argument count, access the individual arguments and read in a new variable:
#!/usr/bin/env bash
echo "There are $# arguments, they are:"
for arg in "$@"
do
echo " - $arg"
done
# Access the arguments
echo "The first argument is $1"
echo "The second argument is $2"
echo "The third argument is $3"
echo "All arguments: $@"
# Read a variable
read var
echo "Some var: $var"

Using getopts within user-defined-function in bourne shell

Is it possible to pass command-line arguments into a function from within a Bourne script, in order to allow getopts to process them?
The rest of my script is nicely packed into functions, but it's starting to look like I'll have to move the argument processing into the main logic.
The following is how it's written now, but it doesn't work:
processArgs()
{
while getopts j:f: arg
do
echo "${arg} -- ${OPTARG}"
case "${arg}" in
j) if [ -z "${filename}" ]; then
job_number=$OPTARG
else
echo "Filename ${filename} already set."
echo "Job number ${OPTARG} will be ignored.
fi;;
f) if [ -z "${job_number}" ]; then
filename=$OPTARG
else
echo "Job number ${job_number} already set."
echo "Filename ${OPTARG} will be ignored."
fi;;
esac
done
}
doStuff1
processArgs
doStuff2
Is it possible to maybe define the function in a way that it can read the scripts args? Can this be done some other way? I like the functionality of getopts, but it looks like in this case I'm going to have to sacrifice the beauty of the code to get it.
You can provide args to getopts after the variable name. The default is the positional parameters, but inside a shell function those are the function's own arguments. The solution is to pass "$@" — all the script's command-line arguments as individual strings — to processArgs:
processArgs "$#"
Adding that to your script (and closing the unterminated quote on line 11), and trying out some gibberish test args:
$ ./try -j asdf -f fooo -fasdfasdf -j424pyagnasd
j -- asdf
f -- fooo
Job number asdf already set.
Filename fooo will be ignored.
f -- asdfasdf
Job number asdf already set.
Filename asdfasdf will be ignored.
j -- 424pyagnasd
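One caveat worth adding (a note on bash/ksh93 behavior, not from the original answer): getopts keeps its scan position in the shell-wide OPTIND variable, so if an argument-parsing function is ever called more than once, OPTIND must be reset (or made local, where the shell supports `local`) or the second call will resume where the first left off. A sketch:

```shell
parse() {
  local OPTIND=1 opt    # keep getopts state private to this call (bash/ksh93)
  while getopts j:f: opt "$@"; do
    case "$opt" in
      j) echo "job=$OPTARG" ;;
      f) echo "file=$OPTARG" ;;
    esac
  done
}

parse -j 42           # -> job=42
parse -f report.txt   # -> file=report.txt (works because OPTIND was reset)
```

Without the reset, the second call would see OPTIND already past its arguments and parse nothing. In a strict Bourne sh without `local`, assign `OPTIND=1` at the top of the function instead.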
