Ambiguous args when sourcing another bash file in a bash file - bash

I created a bash file, test.sh. Its contents are as follows:
#!/bin/bash
#source another file
export ICS_START=/rdrive/ics/itools/unx/bin/
source $ICS_START/icssetup.sh
XMAIN=false
MAINLINE=false
STARTDIR=${PWD}
# Get args.
usage() {
    echo "Usage: $0 [-t <timestamp>] [-m] [-x]"
    exit 1
}
parse_args(){
    while getopts "ht:mx" OPT; do
        case $OPT in
            t) DATE=${OPTARG};;
            m) MAINLINE=true;;
            x) XMAIN=true;;
            h) usage;;
            ?) usage;;
        esac
    done
}
echo "$#"
parse_args "$@"
#other commands
myrun -d xxx -p xxx --time xxxx
I run this bash file with ./test.sh -t xxx -m -x
During this process, the source command is affected by the args -t xxx -m -x; it always throws an error:
Ambiguous switch. Please use more characters.
I think icssetup.sh also defines these options, so they conflict with each other. How can I avoid this without changing the option characters?
I checked that the first lines (the export and source) and parse_args each work well separately.
Any help would be appreciated.

This is something that happens with bash but not other shells. The arguments of your script are passed to any sourced script.
A simple example showing this:
test.sh
#!/bin/bash
source source.sh
echo Original Script: $# : $@
source.sh
#!/bin/bash
echo Sourced Script $# : $@
When you run test.sh, you see that even if no arguments are passed to the sourced script, it actually receives the original script arguments:
# ./test.sh a b
Sourced Script 2 : a b
Original Script: 2 : a b
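Conversely, any arguments you do pass on the source line replace the inherited ones while the sourced file runs. A quick check, reusing the same source.sh (the x y z values are just placeholders):
# in test.sh
source source.sh x y z
# ./test.sh a b
Sourced Script 3 : x y z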
If you're attempting to pass no arguments to the sourced script, you might try to force it like:
source $ICS_START/icssetup.sh ""
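If the sourced script objects even to an empty argument, one alternative (a sketch, not from the answer above) is to temporarily clear the positional parameters around the source call and restore them afterwards:
# save the script's arguments, source with a clean argument list,
# then restore them before calling parse_args
saved_args=("$@")
set --                              # clear the positional parameters
source "$ICS_START/icssetup.sh"     # icssetup.sh now sees no arguments
set -- "${saved_args[@]}"           # restore the original arguments
parse_args "$@"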

Related

Why "bash -c" can't receive full list of arguments?

I have the following two scripts:
test.sh:
echo "start"
echo $@
echo "end"
run.sh:
echo "parameters from user:"
echo $@
echo "start call test.sh:"
bash -c "./test.sh $#"
Execute above run.sh:
$ ./run.sh 1 2
parameters from user:
1 2
start call test.sh:
start
1
end
You can see that although I pass 2 arguments to run.sh, test.sh receives only the first one.
But if I change run.sh to the following, which just drops bash -c:
echo "parameters from user:"
echo $@
echo "start call test.sh:"
./test.sh $@
The behavior becomes as expected: test.sh receives 2 arguments:
$ ./run.sh 1 2
parameters from user:
1 2
start call test.sh:
start
1 2
end
Question:
For reasons outside this example, I have to use bash -c in my full scenario. Could you tell me what's wrong here and how I can fix it?
It is because the quoting of the arguments is in the wrong place. When you run a sequence of commands inside bash -c, think of it as a full shell script in itself that needs to have arguments passed to it accordingly. From the bash manual:
If Bash is started with the -c option (see Invoking Bash), then $0 is set to the first argument after the string to be executed, if one is present. Otherwise, it is set to the filename used to invoke Bash, as given by argument zero.
But notice your command below:
bash -c "./test.sh $@"
Your expectation was to pass the arguments to test.sh inside the quoted string, but the $@ inside double quotes is expanded prematurely by the calling shell and split into separate words, so only the first argument value, i.e. the value of $1, stays attached to ./test.sh inside the -c string; the remaining arguments are handed to bash itself.
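To see exactly what the outer shell hands to bash here, you can substitute printf for bash (a throwaway illustration, not part of the original answer):
set -- 1 2
printf '[%s]\n' -c "./test.sh $@"
[-c]
[./test.sh 1]
[2]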
But even if you fix that by using single quotes as below, it still won't work, because the contents passed to -c are evaluated in their own shell context and need arguments passed explicitly:
set -- 1 2
bash -c 'echo $@' # Both the cases still don't work, as the script
bash -c 'echo "$@"' # inside '-c' is still not passed any arguments
To fix the above, you need to pass the arguments explicitly to the contents of -c, as below. The _ (underscore) simply fills the $0 slot; by convention it represents the pathname of the shell invoked to execute the script (in this case bash). More at Bash Variables in the manual:
set -- 1 2
bash -c 'printf "[%s]\n" "$@"' _ "$@"
[1]
[2]
So to fix your script, in run.sh, pass the arguments as
bash -c './test.sh "$@"' _ "$@"
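Applied to run.sh, a sketch of the fixed script (assuming test.sh still just echoes "$@") would be:
echo "parameters from user:"
echo "$@"
echo "start call test.sh:"
# _ fills the $0 slot of the inner shell; "$@" forwards every
# argument as a separate word, so spaces survive intact
bash -c './test.sh "$@"' _ "$@"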
Besides the accepted one, I found another solution just now. If I add -x when calling run.sh, I see the following:
$ bash -x ./run.sh 1 2
+ echo 'parameters from user:'
parameters from user:
+ echo 1 2
1 2
+ echo 'start call test.sh:'
start call test.sh:
+ bash -c './test.sh 1' 2
start
1
end
So it looks like bash -c "./test.sh $@" is interpreted as bash -c './test.sh 1' 2.
Inspired by this, I tried replacing $@ with $*, which inside double quotes joins all the params into a single word, so the whole command is passed to -c as one string; with the following it also works well:
run.sh:
echo "parameters from user:"
echo $*
echo "start call test.sh:"
bash -c "./test.sh $*"
Execution:
$ bash -x ./run.sh 1 2
+ echo 'parameters from user:'
parameters from user:
+ echo 1 2
1 2
+ echo 'start call test.sh:'
start call test.sh:
+ bash -c './test.sh 1 2'
start
1 2
end
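One caveat worth adding (not part of the original answer): with "$*" all the parameters are joined into one string before the inner shell re-splits them, so arguments that contain spaces are broken apart. A quick sketch of where the two forms diverge:
set -- 1 "2 3"
bash -c "./test.sh $*"              # inner shell runs: ./test.sh 1 2 3   (three args)
bash -c './test.sh "$@"' _ "$@"     # inner shell runs: ./test.sh 1 "2 3" (two args)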

Including $@ to pass on all command line arguments when a shell script invokes itself with bash -c

I need a bash script to invoke itself (actually in a different context, inside a Docker container) and I'm using a bash -c command to do so. However, I'm struggling with how to pass on all command line arguments, even after reading lots of related questions here. This is an example script:
#!/bin/bash
# If not in the right context, invoke script in right context and exit
if [ -z ${NESTED+x} ]; then
    NESTED=true bash -c "./test.sh $@"
    exit
fi
echo "$1"
echo "$2"
echo "$3"
If I save this as test.sh and call it with ./test.sh 1 2 "3 4" I'd want to see those arguments echoed, but only the first one is output.
If I use set -x it shows bash inserts some unexpected quoting, so the call becomes NESTED=true bash -c './test.sh 1' 2 3 4. That explains the output, but I haven't been able to figure out the right way to do this.
bash -c should not be used as it cannot handle "3 4" easily:
#!/bin/bash
# If not in the right context, invoke script in right context and exit
if [ -z ${NESTED+x} ]; then
    NESTED=true ./test.sh "$@"
    exit
fi
echo "$1"
echo "$2"
echo "$3"

Redirect copy of stdin to file from within bash script itself

In reference to https://stackoverflow.com/a/11886837/1996022 (from which I also shamelessly stole the title), where the question is how to capture a script's output, I would like to know how I can additionally capture the script's input, mainly so that scripts which take user input produce complete logs.
I tried things like
exec 3< <(tee -ia foo.log <&3)
exec <&3 <(tee -ia foo.log <&3)
But nothing seems to work. I'm probably just missing something.
Maybe it'd be easier to use the script command? You could either have your users run the script with script directly, or do something kind of funky like this:
#!/bin/bash
main() {
    read -r -p "Input string: "
    echo "User input: $REPLY"
}
if [ "$1" = "--log" ]; then
    # If the first argument is "--log", shift the arg
    # out and run main
    shift
    main "$@"
else
    # If run without log, re-run this script within a
    # script command so all script I/O is logged
    script -q -c "$0 --log $*" test.log
fi
Unfortunately, you can't pass a function to script -c which is why the double-call is necessary in this method.
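As a usage sketch (assuming the script above is saved as myscript.sh), a session would be captured like this:
chmod +x myscript.sh
./myscript.sh          # re-runs itself under script; the prompt appears as usual
cat test.log           # the typescript contains both the prompt and what the user typed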
If it's acceptable to have two scripts, you could also have a user-facing script that just calls the non-user-facing script with script:
script_for_users.sh
--------------------
#!/bin/sh
script -q -c "/path/to/real_script.sh" <log path>
real_script.sh
---------------
#!/bin/sh
<Normal business logic>
It's simpler:
#! /bin/bash
tee ~/log | your_script
The wonderful thing is your_script can be a function, command or a {} command block!
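For example, a minimal sketch using a command block in place of your_script (the log path is just an example):
#!/bin/bash
# copy everything typed on stdin into ~/log while the block consumes it
tee -a ~/log | {
    while read -r line; do
        echo "You typed: $line"
    done
}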

How to do named command line arguments in Bash Scripting better way?

This is my sample Bash Script example.sh:
#!/bin/bash
# Reading arguments and mapping to respective variables
while [ $# -gt 0 ]; do
    if [[ $1 == *"--"* ]]; then
        v="${1/--/}"
        declare $v
    fi
    shift
done
# Printing command line arguments through the mapped variables
echo ${arg1}
echo ${arg2}
Now if in terminal I run the bash script as follows:
$ bash ./example.sh "--arg1=value1" "--arg2=value2"
I get the correct output like:
value1
value2
Perfect! Meaning I was able to use the values passed to the arguments --arg1 and --arg2 using the variables ${arg1} and ${arg2} inside the bash script.
I am happy with this solution for now as it serves my purpose, but can anyone suggest a better way to use named command line arguments in bash scripts?
You can just use environment variables:
#!/bin/bash
echo "$arg1"
echo "$arg2"
No parsing needed. From the command line:
$ arg1=foo arg2=bar ./example.sh
foo
bar
There's even a shell option to let you put the assignments anywhere, not just before the command:
$ set -k
$ ./example.sh arg1=hello arg2=world
hello
world
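If you go the environment-variable route, parameter expansion defaults can cover missing values (a small sketch reusing the same names):
#!/bin/bash
# fall back to defaults when the caller does not set the variables
echo "${arg1:-default1}"
echo "${arg2:-default2}"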

Bash script not running in Ubuntu

I'm getting started with bash scripting and made this little script following along with a short guide, but for some reason when I run the script with sh myscript I get
myscript: 5: myscript: 0: not found
I'm running Ubuntu 12.04. Here is my script below; I should at least see the echo message if no args are set:
#!/bin/bash
#will do something
name=$1
username=$2
if (( $# == 0 ))
then
    echo "##############################"
    echo "myscript [arg1] [arg2]"
    echo "arg1 is your name"
    echo "and arg2 is your username"
fi
var1="Your name is ${name} and your username is ${username}"
`echo ${var1} > yourname.txt`
`echo ${var1} > yourname.txt`
Get rid of the backticks.
echo ${var1} > yourname.txt
...for some reason when I run the script with sh myscript...
Don't run it that way. Make the script executable and run it directly
chmod +x myscript
./myscript
(or run with bash myscript explicitly).
It looks like that expression will work in bash but not in sh. As others pointed out, make it executable, make sure your shebang line uses bash, and run it like this:
./myscript
If you want to run it with sh then it is complaining about line 5. Change it to this and it will work in /bin/sh.
if [ $# -eq 0 ]
Check out the man page for test.
Also you don't need the backticks on this line:
echo ${var1} > yourname.txt
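Putting the two fixes together, a sketch of the script that should run under both bash and /bin/sh (same logic, POSIX test, no backticks):
#!/bin/sh
# will do something
name=$1
username=$2
if [ $# -eq 0 ]
then
    echo "##############################"
    echo "myscript [arg1] [arg2]"
    echo "arg1 is your name"
    echo "and arg2 is your username"
fi
var1="Your name is ${name} and your username is ${username}"
echo "${var1}" > yourname.txt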

Resources