I want to pass arguments to a script in the form
./myscript.sh -r [1,4] -p [10,20,30]
where in myscript.sh if I do:
echo $@
But I'm getting the output as
-r 1 4 -p 1 2 3
How do I get output in the form of
-r [1,4] -p [10,20,30]
I'm using Ubuntu 12.04 and bash version 4.2.37
You have files named 1 2 3 & 4 in your working directory.
Use more quotes.
./myscript.sh -r "[1,4]" -p "[10,20,30]"
[1,4] gets expanded by bash to filenames named 1 or , or 4 (whichever are actually present on your system).
Similarly, [10,20,30] gets expanded to filenames named 1 or 0 or , or 2 or 3.
On a similar note, you should also change echo $@ to echo "$@"
On another note, if you really want to distinguish between the arguments, use printf '%s\n' "$@" instead of just echo "$@".
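A minimal sketch of the expansion behaviour, run in a throwaway directory so the matching filenames are controlled (the scratch directory and filenames are assumptions for illustration):

```shell
#!/bin/sh
# Work in a scratch directory so we control which files exist
tmpdir=$(mktemp -d)
cd "$tmpdir" || exit 1

# With no files matching [1,4], the pattern is passed through literally
unquoted_empty=$(echo [1,4])

# Create files that the bracket expression can match
touch 1 2 3 4
unquoted=$(echo [1,4])   # glob now expands to the matching filenames 1 and 4
quoted=$(echo "[1,4]")   # quoting always suppresses filename expansion

printf '%s\n' "$unquoted_empty" "$unquoted" "$quoted"
cd / && rm -rf "$tmpdir"
```

The quoted form is the only one whose result does not depend on what happens to be in the current directory.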
You can turn off filename expansion
set -f
./myscript.sh -r [1,4] -p [10,20,30]
Don't expect other users to want to do this, if you share your script.
The best answer is anishane's: just quote the arguments
./myscript.sh -r "[1,4]" -p "[10,20,30]"
You can also just escape the brackets []. Like this,
./verify.sh -r \[1,4\] -p \[10,20,30\]
You can print this using echo "$@".
I have a bash script that looks like below.
$TOOL is another script which runs twice with different inputs (VAR1 and VAR2).
#Iteration 1
${TOOL} -ip1 ${VAR1} -ip2 ${FINAL_PML}/$1$2.txt -p ${IP} -output_format ${MODE} -o ${FINAL_MODE_DIR1}
rename mods mode_c_ ${FINAL_MODE_DIR1}/*.xml
#Iteration 2
${TOOL} -ip1 ${VAR2} -ip2 ${FINAL_PML}/$1$2.txt -p ${IP} -output_format ${MODE} -o ${FINAL_MODE_DIR2}
rename mods mode_c_ ${FINAL_MODE_DIR2}/*.xml
Can I make these 2 iterations in parallel inside a bash script without submitting it in a queue?
If I read this right, what you want is to run them in the background.
c.f. https://linuxize.com/post/how-to-run-linux-commands-in-background/
More importantly, if you are going to be writing scripts, PLEASE read the following closely:
https://www.gnu.org/software/bash/manual/html_node/index.html#SEC_Contents
https://mywiki.wooledge.org/BashFAQ/001
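A minimal sketch of the background-and-wait pattern, with placeholder functions standing in for the two ${TOOL} invocations (the sleep/echo bodies are assumptions; substitute the real commands):

```shell
#!/bin/bash
# Placeholders for the two ${TOOL} ... invocations from the question
iteration1() { sleep 1; echo "iteration 1 done"; }
iteration2() { sleep 1; echo "iteration 2 done"; }

iteration1 &   # & sends the job to the background immediately
pid1=$!
iteration2 &   # both iterations now run concurrently
pid2=$!

# Block until both finish; only then is it safe to run the rename steps
wait "$pid1" "$pid2"
echo "both iterations finished"
```

Each rename step must stay *after* the wait (or inside the backgrounded job), since it depends on that iteration's output files existing.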
I'm trying to kick off multiple processes to work through some test suites. In my bash script I have the following
printf "%s\0" "${SUITE_ARRAY[@]}" | xargs -P 2 -0 bash -c 'run_test_suite "$@" ${EXTRA_ARG}'
Below is the defined script, cut down to its basics.
SUITE_ARRAY will be a list of one or more suites: {Suite 1, Suite 2, ..., Suite n}
EXTRA_ARG will be like a specific name to store values in another script
#!/bin/bash
run_test_suite(){
suite=$1
someArg=$2
someSaveDir=someArg"/"suite
# some preprocess work happens here, but isn't relevant to running
runSomeScript.sh suite someSaveDir
}
export -f run_test_suite
SUITES=$1
EXTRA_ARG=$2
IFS=','
SUITECOUNT=0
for csuite in ${SUITES}; do
SUITE_ARRAY[$SUITECOUNT]=$csuite
SUITECOUNT=$(($SUITECOUNT+1))
done
unset IFS
printf "%s\0" "${SUITE_ARRAY[@]}" | xargs -P 2 -0 bash -c 'run_test_suite "$@" ${EXTRA_ARG}'
The issue I'm having is how to get the ${EXTRA_ARG} passed into xargs. From how I've come to understand it, xargs will take whatever is piped into it, so the way I have it doesn't seem correct.
Any suggestions on how to correctly pass the values? Thanks in advance
If you want EXTRA_ARG to be available to the subshell, you need to export it. You can do that either explicitly, with the export keyword, or by putting the var=value assignment in the same simple command as xargs itself:
#!/bin/bash
run_test_suite(){
suite=$1
someArg=$2
someSaveDir=someArg"/"suite
# some preprocess work happens here, but isn't relevant to running
runSomeScript.sh suite someSaveDir
}
export -f run_test_suite
# assuming that the "array" in $1 is comma-separated:
IFS=, read -r -a suite_array <<<"$1"
# see the EXTRA_ARG="$2" just before xargs on the same line; this exports the variable
printf "%s\0" "${suite_array[@]}" | \
EXTRA_ARG="$2" xargs -P 2 -0 bash -c 'run_test_suite "$@" "${EXTRA_ARG}"' _
The _ prevents the first argument passed from xargs to bash from becoming $0 and thus being left out of "$@".
Note also that I changed "${suite_array[#]}" to be assigned by splitting $1 on commas. This or something like it (you could use IFS=$'\n' to split on newlines instead, for example) is necessary, as $1 cannot contain a literal array; every shell command-line argument is only a single string.
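The effect of the trailing _ is easy to see in isolation (the letters here are arbitrary placeholder arguments):

```shell
#!/bin/bash
# Without a placeholder, the first argument becomes $0 and is lost from "$@"
without=$(bash -c 'echo "$@"' a b c)

# With _ occupying $0, all arguments survive in "$@"
with=$(bash -c 'echo "$@"' _ a b c)

echo "without placeholder: $without"
echo "with placeholder:    $with"
```

The first command prints only "b c" (the "a" was consumed as $0), while the second prints all of "a b c".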
This is something of a guess:
#!/bin/bash
run_test_suite(){
suite="$1"
someArg="$2"
someSaveDir="${someArg}/${suite}"
# some preprocess work happens here, but isn't relevant to running
runSomeScript.sh "${suite}" "${someSaveDir}"
}
export -f run_test_suite
SUITE_ARRAY="$1"
EXTRA_ARG="$2"
printf "%s\0" "${SUITE_ARRAY[@]}" |
xargs -n 1 -I '{}' -P 2 -0 bash -c 'run_test_suite {} '"${EXTRA_ARG}"
Using GNU Parallel it looks like this:
#!/bin/bash
run_test_suite(){
suite="$1"
someArg="$2"
someSaveDir="$someArg"/"$suite"
# some preprocess work happens here, but isn't relevant to running
echo runSomeScript.sh "$suite" "$someSaveDir"
}
export -f run_test_suite
EXTRA_ARG="$2"
parallel -d, -q run_test_suite {} "$EXTRA_ARG" ::: "$1"
Called as:
mytester 'suite 1,suite 2,suite "three"' 'extra "quoted" args here'
If you have the suites in an array:
parallel -q run_test_suite {} "$EXTRA_ARG" ::: "${SUITE_ARRAY[@]}"
Added bonus: Any output from the jobs will not be mixed, so you will not have to deal with http://mywiki.wooledge.org/BashPitfalls#Using_output_from_xargs_-P
When I run the following script in bash:
#!/bin/bash
names=("ALL" "no_C" "no_R" "no_Q")
for name in $names; do
export name=$name
mkdir -p $name
( echo 'selection' 'System' | gmx cluster -f ${name}_protein_only.trr -s ${name}_protein_only.pdb -n ${name}_index.ndx -g ${name}/cluster.log -cutoff 0.2 -fit -method gromos -o ${name}/cluster.output -dist ${name}/rmsd-dist.xvg -av -cl ${name}/clusters.pdb ) &
done
wait
The for loop won't loop until the subshell has completed, even though I've put it into the background with '&'. If I run this same script in zsh, it runs as expected (4 parallel tasks). Is this a bug or am I missing something?
You need to use a different notation for all the elements of an array in Bash (see Arrays and Shell parameter expansion):
for name in "${names[@]}"; do
When you specify $names and names is an array, Bash treats it as ${names[0]}.
I used this variant on your code to demonstrate:
#!/bin/bash
names=("ALL" "no_C" "no_R" "no_Q")
for name in "${names[@]}"
do
export name=$name
mkdir -p $name
( echo $name 'selection' 'System' | sed s/s/z/g; sleep 3 ) &
done
wait
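As a quick sanity check, the element-zero behaviour can be demonstrated in isolation (a minimal sketch; the array values are taken from the question):

```shell
#!/bin/bash
names=("ALL" "no_C" "no_R" "no_Q")

first=$names        # unsubscripted expansion: element 0 only
nth=${names[2]}     # an explicit subscript picks one element

echo "$first"       # ALL
echo "$nth"         # no_R
```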
I believe what Jonathan Leffler wrote is correct in terms of the definition of a Bash array and how to loop through it.
But if you wanted to define names as a single whitespace-separated string like this:
names="ALL no_C no_R no_Q"
Then you could still loop through it using:
for name in $names; do
.... do something with name
done
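The two styles can be contrasted side by side (a minimal sketch; the names come from the question):

```shell
#!/bin/bash
# As a string: rely on word splitting of the unquoted expansion
names_str="ALL no_C no_R no_Q"
str_count=0
for name in $names_str; do
  str_count=$((str_count + 1))
done

# As an array: quote the [@] expansion; this form also survives
# elements that themselves contain whitespace
names_arr=("ALL" "no_C" "no_R" "no_Q")
arr_count=0
for name in "${names_arr[@]}"; do
  arr_count=$((arr_count + 1))
done

echo "$str_count $arr_count"   # both loops see 4 names
```

Both loops iterate four times here, but only the array form stays correct once a name contains a space.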
This question already has answers here:
Capturing output of find . -print0 into a bash array
(13 answers)
Closed 7 years ago.
I am currently writing a bash script that shall check some data. What I got so far is:
#!/bin/bash
find "./" -mindepth 1 -maxdepth 1 -type d -print0 | while IFS= read -r -d '' file; do
folder=${file##*/}
echo "Checking ${folder} for sanity..."
./makeconfig ${folder} | while read -r line; do
title=`echo $line | awk -F' ' '{print $2}'`
echo $title
done
done
Now what it currently does is: search every directory in ./ and extract the folder's name (thus removing the ./ from the result of find). Then give it to a self-written tool, which will output some lines like this:
-t 1 -a 2
-t 3 -a 5
-t 7 -a 7
-t 9 -a 8
of which I gather the value behind -t via awk. This also works so far; the problem is that the outer while loop stops after the first iteration, thus checking only one folder. My guess is that the two read commands of the inner and outer loop are colliding somehow. The tool makeconfig definitively always returns 0 (no error). I tried to debug it using sh -x script.sh, but it does not show me anything I can deal with.
Can someone point me in the right direction as to what is going wrong here? If you need ANY further information, I can give it to you. I've written a quick program mimicking makeconfig (also a script now, just echoing some stuff) if you want to test the bash script; just make it executable via chmod +x:
echo "-t 3 -a 4"
echo "-t 6 -a 1"
echo "-t 9 -a 5"
Just put this next to the script in a folder and create some subfolders; that should be enough to reproduce the behaviour.
Thanks in advance!
EDIT: This is NOT a duplicate as mentioned. The problem here is more the nested read commands than the -print0 (maybe that also has something to do with it, but not entirely).
IFS= isn't setting the field separator to the null string (\0), but unsetting it entirely, so the entire output of the find command is being read at once. If you run it without the -print0 argument to find it'll be easier to work with in bash. Two other alternatives:
use xargs to run a shell script on each item found with that being the sole argument
use -exec to run the shell script on each item.
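A sketch of the -exec variant (the scratch directory and subfolder names are assumptions; the inline script only echoes, standing in for the real makeconfig processing):

```shell
#!/bin/sh
# Scratch directory with a couple of subfolders, so the example is self-contained
tmpdir=$(mktemp -d)
mkdir "$tmpdir/alpha" "$tmpdir/beta"

# -exec ... {} + batches the found directories into one sh invocation;
# nothing here reads from stdin, so nested-read collisions cannot occur
out=$(find "$tmpdir" -mindepth 1 -maxdepth 1 -type d -exec sh -c '
  for dir in "$@"; do
    printf "Checking %s for sanity...\n" "${dir##*/}"
  done
' _ {} + | sort)

printf '%s\n' "$out"
rm -rf "$tmpdir"
```

The per-directory work runs in a child shell, so the outer script's stdin is never touched.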
I made this script that should receive one or more parameters, all of which are directories, and it has to browse those directories (one by one) and do some operations.
The operations work fine if only one parameter (a single directory) is passed.
How should I modify my script to make it work when more than one parameter is passed,
for example to do the same operations in 2 or 3 directories at the same time?
Thanks
#!/bin/sh
cd $1
for file in ./* # */
do
if [[ -d $file ]]
then
ext=dir
else
ext="${file##*.}"
fi
mv "${file}" "${file}.$ext"
done
First, if you are using bash, use a bash shebang (#! /bin/bash).
Then use
#! /bin/bash
for d in "$#"
do
echo "Do something with $d"
done
to iterate over the command-line arguments (the directories, in your case).
#!/bin/sh
for dir in "$#"; do
for file in "$dir"/*; do
echo "Doing something with '$file'"
done
done