Converting string to a list of parameters in Bash [duplicate] - bash

This question already has answers here:
Variable containing multiple args with quotes in Bash
(4 answers)
Closed 2 years ago.
I have a bash script that launches another bash script and needs to pass multiple parameters (which may contain spaces). In the launcher script I am defining the parameters as a single string, escaping any spaces as necessary. However, I can't seem to get the parameters passed through properly.
Here is my test setup to replicate the problem I am having:
test.sh
#!/bin/bash
while [[ $# -gt 0 ]]; do
    echo "${1}"
    shift
done
launcher.sh
#!/bin/bash
args="arg1 arg2 arg\ 3 arg4"
./test.sh ${args}
Running test.sh directly from command line (./test.sh arg1 arg2 arg\ 3 arg4)
arg1
arg2
arg 3
arg4
Running launcher.sh
arg1
arg2
arg\
3
arg4
I've tried multiple variations of double quotes, read, IFS, etc, but I can't seem to get the results I am looking for. Any guidance would be appreciated.

A friendly tip
After reading your entire question, it seems you're trying to reinvent the wheel.
You should have tried help read: it explains how to split user input into an indexed array (the -a option).
Example
read -a args -p 'Input args: '
Full code example
test.sh
#!/bin/bash
for sArg in "$@"; do
    echo "$sArg"
done
launcher.sh
#!/bin/bash
read -a args -p 'Input args: '
./test.sh "${args[#]}"

Use a bash array or xargs in launcher.sh:
#!/bin/bash
args=(arg1 arg2 "arg 3" arg4)
./test.sh "${args[#]}"
echo =======================
args="arg1 arg2 arg\ 3 arg4"
echo $args | xargs ./test.sh
Execution:
$ ./launcher.sh
arg1
arg2
arg 3
arg4
=======================
arg1
arg2
arg 3
arg4
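If the launcher really must keep the arguments in a single escaped string, as in the question, here is a third variant as a minimal sketch: split the string back into an array before the call. It assumes read without -r treats the backslash as an escape so the space stays inside one element; check that on your bash version.
#!/bin/bash
# Hypothetical variant: rebuild an array from the escaped string, then pass
# its elements as separate, properly quoted arguments.
args="arg1 arg2 arg\ 3 arg4"
read -a arg_array <<< "$args"
./test.sh "${arg_array[@]}"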

Related

Invalid behavior for arguments in Bash version 4.4 vs version 5.1? [duplicate]

This question already has an answer here:
Running a command with bash -c vs without
(1 answer)
Closed last year.
I am confused by this behavior. I have the following script:
backup.sh
#!/bin/bash -x
set -e
if [[ $# -eq 0 ]] ; then
    echo 'No arguments passed'
    exit 1
fi
# Get the arguments
for ARGUMENT in "$@"; do
    KEY=$(echo $ARGUMENT | cut -f1 -d=)
    VALUE=$(echo $ARGUMENT | cut -f2 -d=)
    case "$KEY" in
        backup_dir) BACKUP_DIR=${VALUE} ;;
        postgres_dbs) POSTGRES_DBS=${VALUE} ;;
        backup_name) BACKUP_NAME=${VALUE} ;;
        postgres_port) POSTGRES_PORT=${VALUE} ;;
        postgres_host) POSTGRES_HOST=${VALUE} ;;
        *) ;;
    esac
done
And I am executing it using:
1.
/bin/bash -c /usr/bin/backup.sh postgres_dbs=grafana,keycloak backup_name=postgres-component-test-20220210.165630 backup_dir=/backups/postgres postgres_port=5432 postgres_host=postgres.default.svc.cluster.local
2.
/usr/bin/backup.sh postgres_dbs=grafana,keycloak backup_name=postgres-component-test-20220210.165630 backup_dir=/backups/postgres postgres_port=5432 postgres_host=postgres.default.svc.cluster.local
But the output is:
+ set -e
+ [[ 0 -eq 0 ]]
+ echo 'No arguments passed'
No arguments passed
+ exit 1
Environment:
# cat /etc/os-release
NAME="Ubuntu"
VERSION="18.04.3 LTS (Bionic Beaver)"
ID=ubuntu
ID_LIKE=debian
PRETTY_NAME="Ubuntu 18.04.3 LTS"
VERSION_ID="18.04"
HOME_URL="https://www.ubuntu.com/"
SUPPORT_URL="https://help.ubuntu.com/"
BUG_REPORT_URL="https://bugs.launchpad.net/ubuntu/"
PRIVACY_POLICY_URL="https://www.ubuntu.com/legal/terms-and-policies/privacy-policy"
VERSION_CODENAME=bionic
UBUNTU_CODENAME=bionic
Bash version where I can reproduce this issue:
GNU bash, version 4.4.20(1)-release (x86_64-pc-linux-gnu)
However, this is not happening in the Bash version:
GNU bash, version 5.1.8(1)-release (x86_64-apple-darwin20.3.0)
It's not a bug, just a feature!
When you use the bash -c 'code …' style, the first CLI argument after the code string is passed to the inline code as $0, not $1.
Furthermore, if the 'code …' itself invokes an external script such as ./script.sh, then you should not forget to forward the arguments using the "$@" construct.
So you could just write (as pointed out in the comments):
bash -c './script.sh "$@"' bash "first argument"
Or, more simply, just as you mention you had already tried:
bash script.sh "first argument"
Additional notes
As your example was not really "minimal" (it had a very long command line), here is a complete minimal example that you might want to use for debugging purposes:
script.sh
#!/usr/bin/env bash
echo "\$#: $#"
for arg; do printf -- '- %s\n' "$arg"; done
Then you should get a session similar to:
$ chmod a+x script.sh
$ bash -c ./script.sh "arg 1" "arg 2"
$#: 0
$ bash -c './script.sh "$@"' "arg 1" "arg 2"
$#: 1
- arg 2
$ bash -c './script.sh "$@"' bash "arg 1" "arg 2"
$#: 2
- arg 1
- arg 2
$ bash script.sh "arg 1" "arg 2"
$#: 2
- arg 1
- arg 2
$ ./script.sh "arg 1" "arg 2"
$#: 2
- arg 1
- arg 2
You wrote two ways to invoke the script, which boil down to:
bash -c ./script.sh arg1 arg2 arg3
./script.sh arg1 arg2 arg3
The second way is the preferred way to invoke scripts. Running them directly tells Linux to use the interpreter listed in the shebang line. There's no reason I can see for this invocation style to drop arguments.
The first, however, does indeed lose all the arguments. It's because -c doesn't belong there. If you want to invoke an explicit bash shell you should write simply:
bash ./script.sh arg1 arg2 arg3
That will correctly pass all the arguments to the script.
When you add -c it turns ./script.sh from the name of a script into a full-blown command line. What's the difference? Well, now that command line is responsible for forwarding its arguments to the script, if that's what it wants to have happen. With -c you need to explicitly pass them on:
bash -c './script.sh "$@"' bash arg1 arg2 arg3
Yuck! It's encased in single quotes, and there's an ugly "$@" in there. It's needed, though. Without "$@" the arguments are simply dropped on the floor.
-c also takes an extra argument, the value for $0. So not only is "$@" needed, you also have to add an extra bash argument to set $0. (bash is a good choice since that's what $0 is normally set to when running a bash script.)
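A tiny illustration of that extra $0 argument (my own quick check, not from the original answer):
$ bash -c 'echo "$0"; echo "$#"; echo "$@"' bash arg1 arg2 arg3
bash
3
arg1 arg2 arg3
The word bash after the -c string became $0, and the three real arguments landed in $1..$3.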

Why "bash -c" can't receive full list of arguments?

I have the following two scripts:
test.sh:
echo "start"
echo $@
echo "end"
run.sh:
echo "parameters from user:"
echo $@
echo "start call test.sh:"
bash -c "./test.sh $#"
Executing the above run.sh:
$ ./run.sh 1 2
parameters from user:
1 2
start call test.sh:
start
1
end
You can see that although I pass 2 arguments to run.sh, test.sh just receives the first argument.
But if I change run.sh to the following, which just drops bash -c:
echo "parameters from user:"
echo $@
echo "start call test.sh:"
./test.sh $@
The behavior becomes as expected and test.sh receives 2 arguments:
$ ./run.sh 1 2
parameters from user:
1 2
start call test.sh:
start
1 2
end
Question:
For some reason, I have to use bash -c in my full scenario, so could you kindly tell me what's wrong here? How can I fix it?
It is because the quoting of the arguments is in the wrong place. When you run a sequence of commands inside bash -c, think of it as a full shell script in itself, to which you need to pass arguments accordingly. From the bash manual:
If Bash is started with the -c option (see Invoking Bash), then $0 is set to the first argument after the string to be executed, if one is present. Otherwise, it is set to the filename used to invoke Bash, as given by argument zero.
But if one looks at your command below,
bash -c "./test.sh $@"
your expectation was to pass the arguments to test.sh inside the -c string, but the $@ inside double quotes is expanded prematurely by the calling shell; after word splitting, only the first argument value (the value of $1) stays attached to ./test.sh, while the remaining values become separate arguments to bash -c itself.
But even when you fix that by using single quotes as below, it still doesn't work, because the contents passed to -c are evaluated in their own shell context and need arguments passed explicitly:
set -- 1 2
bash -c 'echo $@'   # Both the cases still don't work, as the script
bash -c 'echo "$@"' # inside '-c' is still not passed any arguments
To fix the above, you need to pass the arguments explicitly to the contents inside -c, as below. The _ (underscore) character is a placeholder that becomes $0 of the inline script (conventionally the name or pathname of the shell invoked, in this case bash). More at Bash Variables in the manual.
set -- 1 2
bash -c 'printf "[%s]\n" "$@"' _ "$@"
[1]
[2]
So to fix your script, in run.sh, pass the arguments as
bash -c './test.sh "$@"' _ "$@"
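Putting it together, a fixed run.sh could look like this (a sketch assembled from the lines above, with the rest of the script unchanged):
echo "parameters from user:"
echo $@
echo "start call test.sh:"
bash -c './test.sh "$@"' _ "$@"
With this, test.sh prints 1 2 between start and end, just like the plain ./test.sh $@ call.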
Besides the accepted one, I found another solution just now. If I add -x when calling run.sh, I can see the following:
$ bash -x ./run.sh 1 2
+ echo 'parameters from user:'
parameters from user:
+ echo 1 2
1 2
+ echo 'start call test.sh:'
start call test.sh:
+ bash -c './test.sh 1' 2
start
1
end
So it looks like bash -c "./test.sh $@" is interpreted as bash -c './test.sh 1' 2.
Inspired by this, I tried using $* in place of $@, which joins all the parameters into a single word inside the -c string; with the following it also works well:
run.sh:
echo "parameters from user:"
echo $*
echo "start call test.sh:"
bash -c "./test.sh $*"
Execution:
$ bash -x ./run.sh 1 2
+ echo 'parameters from user:'
parameters from user:
+ echo 1 2
1 2
+ echo 'start call test.sh:'
start call test.sh:
+ bash -c './test.sh 1 2'
start
1 2
end
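One caveat worth adding to this workaround: the $* form only works here because the arguments contain no spaces. $* joins all parameters into one string that the inner shell word-splits again, so an argument that itself contains a space gets broken apart. An illustrative trace (my own example, not from the original post):
$ bash -x ./run.sh 'a b' c
...
+ bash -c './test.sh a b c'
...
The inner shell re-splits that string, so test.sh receives three arguments (a, b, c) instead of two; the "$@" form with the _ placeholder from the accepted answer preserves the original word boundaries.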

Run sh script in Tcl foreach loop

I have a simple bash script, test.sh, that takes four arguments.
#!/bin/bash
echo "1: $1"
echo "2: $2"
echo "3: $3"
echo "4: $4"
I try to call this from a Tcl script, test.tcl
exec bash test.sh arg1 arg2 arg3 arg4
foreach i {1 2 3} {
exec bash test.sh arg1 arg2 arg3 arg4
}
The first call to the script outputs as I expect it to, but the calls from the foreach loop never seem to do anything. In fact, the exec command can be replaced with exec ls to make things even simpler; the call outside the loop works fine but the one from inside the loop doesn't do anything.
EDIT
As pointed out in the comments, it's probably important to mention I am using a Tcl console that is built into a software package (VMD, visual molecular dynamics). From that terminal interface, I call these scripts "interactively," and see output on the terminal from the exec outside the loop, but not from the one inside the loop.
My work is on hold because of this; any ideas?
The apparent "issue" stems from trying to run these scripts "interactively." If I modify the bash script as
#!/bin/bash
echo "1: $1" > $5
echo "2: $2" >>$5
echo "3: $3" >>$5
echo "4: $4" >>$5
and the tcl script as
exec bash test.sh arg1 arg2 arg3 arg4 file1.txt
foreach i {1 2 3} {
exec bash test.sh arg1 arg2 arg3 arg4 file2.txt
}
I see both files, file{1,2}.txt, created properly. So the output is automatically printed to the terminal interface when the script is called outside the loop, but not when it is called inside one: exec returns the script's standard output as its result, and the interactive Tcl console echoes the result of a top-level command, while results produced inside the foreach body are discarded. This is explained more in the comments above.

How to do named command line arguments in Bash scripting in a better way?

This is my sample Bash Script example.sh:
#!/bin/bash
# Reading arguments and mapping to respective variables
while [ $# -gt 0 ]; do
    if [[ $1 == *"--"* ]]; then
        v="${1/--/}"
        declare $v
    fi
    shift
done
# Printing command line arguments through the mapped variables
echo ${arg1}
echo ${arg2}
Now if I run the bash script in the terminal as follows:
$ bash ./example.sh "--arg1=value1" "--arg2=value2"
I get the correct output like:
value1
value2
Perfect! Meaning I was able to use the values passed to the arguments --arg1 and --arg2 using the variables ${arg1} and ${arg2} inside the bash script.
I am happy with this solution for now as it serves my purpose, but can anyone suggest a better way to handle named command line arguments in bash scripts?
You can just use environment variables:
#!/bin/bash
echo "$arg1"
echo "$arg2"
No parsing needed. From the command line:
$ arg1=foo arg2=bar ./example.sh
foo
bar
There's even a shell option to let you put the assignments anywhere, not just before the command:
$ set -k
$ ./example.sh arg1=hello arg2=world
hello
world
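Another common pattern, shown here only as a sketch for comparison (it is not from the original answers), is an explicit while/case loop over long options; unlike the declare approach it accepts only the names you list:
#!/bin/bash
# Illustrative parser for --name=value style options; unknown names are rejected.
arg1="" arg2=""
while [ $# -gt 0 ]; do
    case "$1" in
        --arg1=*) arg1="${1#--arg1=}" ;;
        --arg2=*) arg2="${1#--arg2=}" ;;
        *) echo "Unknown option: $1" >&2; exit 1 ;;
    esac
    shift
done
echo "${arg1}"
echo "${arg2}"
It is called the same way as the original: bash ./example.sh "--arg1=value1" "--arg2=value2".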

Run a function or alias set in .bashrc or .profile through nohup [duplicate]

This question already has answers here:
Call a function using nohup
(6 answers)
Closed 9 years ago.
My question is similar to this one.
Using aliases with nohup
I took a lot of time customizing a function that I included in my .bashrc
I'd like it to run with nohup, because I'd like to run a command several times in this fashion.
for i in `cat mylist`; do nohup myfunction $i 'mycommand' & done
Any tips?
You can do this with functions (not aliases) by nohup-ing a bash -c invocation (which is essentially the same as running an external bash script).
In order for this to work, you need to mark your function as exported:
# define the function
echo_args() {
    printf '<%s> ' "$@"
    printf "\n"
}
# mark it as exported
declare -fx echo_args
# run it with nohup
nohup bash -c 'echo_args "$@"' bash_ "an argument" "another argument"
The bash_ argument provides a "name" for the bash -c subshell; that is, it becomes the value of $0 in the subshell. It will be prepended to error messages (if any), so I try to use something meaningful.
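Applied to the loop from the question, that would look roughly like this (a sketch: myfunction and mylist are the names from the question, and the function body here is just a stand-in):
# myfunction is assumed to be defined in .bashrc, e.g.:
myfunction() { echo "running on $1 with $2"; }
# export it so the bash -c child shell can see it (equivalent to declare -fx)
export -f myfunction
for i in `cat mylist`; do
    nohup bash -c 'myfunction "$@"' bash_ "$i" 'mycommand' &
done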
nohup will not work directly with functions. You need to create a shell script which wraps and executes the function. You can then run the shell script with nohup.
Like this:
test.sh
#!/bin/bash
function hello_world {
    echo "hello $1, $2"
}
# call function
hello_world "$1" "$2"
chmod +x test.sh and then call it in your for loop:
for i in `cat mylist`; do
    nohup ./test.sh $i 'mycommand' &
done
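If the function you want is the one already defined in your .bashrc rather than a copy, the wrapper can source it first. A sketch (note that many stock .bashrc files return early for non-interactive shells, so this may need adjusting):
#!/bin/bash
# Load the function definitions from .bashrc, then forward all arguments.
source "$HOME/.bashrc"
myfunction "$@"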
