I have a large number of .pbs files that I want to submit to a remote cluster. I want to be able to name the .pbs file something like "param1_123_param2_45.pbs", and then feed them into the ARGS for a Julia code. Below is an example .pbs of what I'm trying to do:
#!/bin/tcsh
#PBS -l mem=10gb,nodes=1:ppn=2,walltime=1:00:00
#PBS -j oe
#PBS -o ./log/julia.${PBS_JOBID}.out
#PBS -t 1-3

module load julia/1.5.1 python/3.8.1

cd path/to/file

julia Example.jl 123 45
Except that 123 and 45 should be replaced by the values given in the name of the .pbs file. Is there an easy way to do this?
You can use AWK. For example, here is the content of a file param1_123_param2_45.pbs:
PARAM1=$( echo ${0%%.pbs} | awk -F "_" '{print $2}' )
PARAM2=$( echo ${0%%.pbs} | awk -F "_" '{print $4}' )
echo "Filename: $0"
echo "Param 1: $PARAM1"
echo "Param 2: $PARAM2"
Running: bash param1_123_param2_45.pbs
Output:
Filename: param1_123_param2_45.pbs
Param 1: 123
Param 2: 45
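Note that $0 here is whatever path the script was invoked with. A slightly more defensive variant (just a sketch; not required for the example above) strips any leading directories first, so underscores in the path cannot shift the fields:
# Strip the directory and the .pbs suffix before splitting on "_",
# so a path like ./runs/param1_123_param2_45.pbs still parses correctly.
name=$(basename "$0" .pbs)
PARAM1=$( echo "$name" | awk -F "_" '{print $2}' )
PARAM2=$( echo "$name" | awk -F "_" '{print $4}' )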
This is an extension of a previous question I asked:
Using name of BASH script as input argument
My goal is to write a BASH script which takes the arguments from the file's name and uses them as inputs for a Julia code I'm writing, and then submit the BASH script to a remote cluster. Using @AndriyMakukha's solution, I was able to write the following script, which I submit through Torque:
#!/bin/bash
#PBS -l mem=10gb,nodes=1:ppn=2,walltime=1:00:00
#PBS -N ES_100_20_100
#PBS -j oe
#PBS -o ./log/julia.${PBS_JOBID}.out
module load julia/1.5.1 python/3.8.1
PARAM1=$( echo ${0%%.pbs} | awk -F "_" '{print $2}' )
PARAM2=$( echo ${0%%.pbs} | awk -F "_" '{print $3}' )
PARAM3=$( echo ${0%%.pbs} | awk -F "_" '{print $4}' )
PARAM4=$( echo ${0%%.pbs} | awk -F "_" '{print $5}' )
PARAM5=$( echo ${0%%.pbs} | awk -F "_" '{print $6}' )
echo "Param 1: $PARAM1"
echo "Param 2: $PARAM2"
echo "Param 3: $PARAM3"
echo "Param 4: $PARAM4"
echo "Param 5: $PARAM5"
cd path/to/code
julia Example.jl $PARAM1 $PARAM2 $PARAM3 $PARAM4 $PARAM5
This script (called "Example_1_2_3_4_5.pbs") prints the different parameters from the filename to the output file, and then runs the Julia code with said parameters as the ARGS. When I run this on the local machine, it works great. When I submit the code to the cluster via qsub, I get the following error in the output file:
Param 1: priv/jobs/3574314-1.orion.cm.cluster.SC
Param 2:
Param 3:
Param 4:
Param 5:
ERROR: LoadError: ArgumentError: invalid base 10 digit 'p' in "priv/jobs/3574314-1.orion.cm.cluster.SC"
Clearly, the code isn't reading the parameters correctly: it picks up the path of the job's spooled copy of the script, not the name of the BASH file itself. You can see this because
echo ${0%%.pbs}
returns
/cm/local/apps/torque/var/spool/mom_priv/jobs/3574314-1.orion.cm.cluster.SC
How can I get the name of the pbs file itself if I submit to cluster, seeing as ${0%%.pbs} doesn't work?
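One workaround, assuming the cluster runs Torque/PBS: inside the job, qsub sets the environment variable PBS_JOBNAME, and when no -N option (or #PBS -N line) overrides it, the job name defaults to the filename of the submitted script. A minimal sketch under that assumption:
# Parse the job name instead of $0, since Torque executes a spooled
# copy of the script. Drop the "#PBS -N ..." line so the default job
# name (the script's filename, without its path) is used.
NAME=${PBS_JOBNAME%%.pbs}
PARAM1=$( echo "$NAME" | awk -F "_" '{print $2}' )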
I have the following line of code:
for h in "${Hosts[@]}" ; do echo "$MyLog" | grep -m 1 -B 3 -A 1 $h >> /LogOutput ; done
My hosts variable is a large array of hosts
Is there a better way to do this that doesn't require me to echo on each loop? Like grep on a variable instead?
No echo, no loop
#!/bin/bash
hosts=(host1 host2 host3)
MyLog="
asf host
sdflkj
sadkjf
sdlkjds
lkasf
sfal
asf host2
sdflkj
sadkjf
"
re="${hosts[@]}"
egrep -m 1 -B 3 -A 1 "${re// /|}" <<< "$MyLog"
The ${re// /|} expansion replaces each space in the joined host list with |, producing an alternation pattern like host1|host2|host3.
Variant with one echo
echo "$MyLog" | egrep -m 1 -B 3 -A 1 "${re// /|}"
Usage
$ ./test
sdlkjds
lkasf
sfal
asf host2
sdflkj
One echo, no loops, and all grepping done in parallel, with GNU Parallel:
echo "$MyLog" | parallel -k --tee --pipe 'grep -m 1 -B 3 -A 1 {}' ::: "${hosts[@]}"
The -k keeps the output in order.
The --tee and the --pipe ensure that the stdin is duplicated to all processes.
The command to run in parallel is enclosed in single quotes.
Could you printf your array one element per line and give those lines to grep as a pattern file (via -f and process substitution)? Something like:
grep -m 1 -B 3 -A 1 -f <(printf '%s\n' "${Hosts[@]}") <<< "$MyLog" >> /LogOutput
Assuming you're on a GNU system; otherwise, see info grep.
From grep --help
grep --help | head -n1
Output
Usage: grep [OPTION]... PATTERN [FILE]...
So, according to that, you can drop the echo by passing the pattern directly and feeding the log via a here-string (since $MyLog holds the log text, not a filename):
for h in "${Hosts[@]}" ; do grep -m 1 -B 3 -A 1 "$h" <<< "$MyLog" >> /LogOutput ; done
I have a script which takes in only one positional parameter which is a list of values, and I'm trying to get the parameter from stdin with xargs.
However, by default xargs passes all the values to my script as separate positional parameters, e.g. when doing
echo 1 2 3 | xargs myScript, it will essentially be myScript 1 2 3, and what I'm looking for is myScript "1 2 3". What is the best way to achieve this?
Change the delimiter.
$ echo 1 2 3 | xargs -d '\n' printf '%s\n'
1 2 3
Not all xargs implementations have -d though.
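A more portable alternative (a sketch) is -I, which POSIX does specify; with -I, each whole input line becomes one item, so unquoted blanks no longer split:
$ echo 1 2 3 | xargs -I {} printf '%s\n' {}
1 2 3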
And, while I'm not sure there is an actual use case for this, you can also resort to spawning another shell instance if you have to. Like
$ echo -e '1 2\n3' | xargs sh -c 'printf '\''%s\n'\'' "$*"' sh
1 2 3
If the input can be altered, you can quote it: xargs's default input parsing honors quotes, so "1 2 3" arrives as a single argument. Not sure if this is what you wanted, though.
echo \"1 2 3\"|xargs ./myScript
Here is the example.
$ cat myScript
#!/bin/bash
echo $1; shift
echo $1; shift
echo $1;
$ echo \"1 2 3\"|xargs ./myScript
1 2 3
$ echo 1 2 3|xargs ./myScript
1
2
3
I got the following error
awk: cmd. line:1: {if(NR%4==2) print length(.)}
awk: cmd. line:1: ^ syntax error
while running sh s.sh on the following script:
#!/bin/bash
#usage: sh afterqc_pbs.sh /work/waterhouse_team/All_RawData/Banana_Each_Cell_Raw/Banana_Illumina/QUT_WT_GN366_4
for r1 in $(find $1 -name "*R1*.gz");
do
base=${r1%_R1*}
echo $base
output=$(basename $(echo $r1 | sed 's/R1//g'))
r2=$(echo $r1 | sed 's/R1/R2/g')
echo $r1
echo $r2
#cat <<EOF
qsub <<EOF
#!/bin/bash -l
#PBS -N $output
#PBS -l walltime=48:00:00
#PBS -j oe
#PBS -l mem=50G
#PBS -l ncpus=1
#PBS -M m.lorenc@qut.edu.au
##PBS -m bea
cd \$PBS_O_WORKDIR
echo "Average\n"
zcat $r1 | awk '{if(NR%4==2) {count++; bases += length} } END{print bases/count}'
echo "Distribution\n"
zcat $r1 | awk '{if(NR%4==2) print length($1)}' | sort -n | uniq -c
EOF
done
How is it possible that awk ends up with . rather than the length of $1? Thank you in advance.
The issue is your use of qsub <<EOF. When you use << (called a "here document" or "heredoc"), any variables inside get expanded by the current shell. The single quotes around the awk programs don't prevent this; heredoc expansion happens first. If you wanted no expansion at all, you could change <<EOF to <<"EOF", but this script still needs $output, $r1 and $r2 expanded, so instead escape only the dollar signs that awk itself should see, the same way \$PBS_O_WORKDIR is already escaped. https://www.gnu.org/software/bash/manual/bashref.html#Here-Documents has more details.
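For example (a sketch of just the affected line; the rest of the heredoc stays as it is):
# Escaping the dollar sign stops the current shell from expanding $1
# before the text reaches qsub, so awk receives length($1) intact.
zcat $r1 | awk '{if(NR%4==2) print length(\$1)}' | sort -n | uniq -c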
I have a command line tool which receives two arguments:
TOOL arg1 -o arg2
I would like to invoke it with the same value provided for arg1 and arg2, and to make that easy for me, I thought I would do:
each <arg1_value> | TOOL $1 -o $1
but that doesn't work: $1 is not replaced; the value is just appended once to the end of the command line.
An explicit example, performing:
cp fileA fileA
returns an error fileA and fileA are identical (not copied)
While performing:
echo fileA | cp $1 $1
returns the following error:
usage: cp [-R [-H | -L | -P]] [-fi | -n] [-apvX] source_file target_file
cp [-R [-H | -L | -P]] [-fi | -n] [-apvX] source_file ... target_directory
Any ideas?
If you want to use xargs, the -I option may help:
-I replace-str
Replace occurrences of replace-str in the initial-arguments with names read from standard input. Also, unquoted blanks do not terminate input items; instead the separator is the newline character. Implies -x and -L 1.
Here is a simple example:
mkdir test && cd test && touch tmp
ls | xargs -I '{}' cp '{}' '{}'
This returns the error: cp: tmp and tmp are the same file
The xargs utility will duplicate its input stream to replace all placeholders in its argument if you use the -I flag:
$ echo hello | xargs -I XXX echo XXX XXX XXX
hello hello hello
The placeholder XXX (which may be any string) is replaced with the entire line of input from the input stream to xargs, so if we give it two lines:
$ printf "hello\nworld\n" | xargs -I XXX echo XXX XXX XXX
hello hello hello
world world world
You may use this with your tool:
$ generate_args | xargs -I XXX TOOL XXX -o XXX
Where generate_args is a script, command or shell function that generates arguments for your tool.
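For instance, with a stand-in generator (hypothetical names, and echo standing in for TOOL so the constructed command lines are just printed):
$ generate_args() { printf '%s\n' fileA fileB; }
$ generate_args | xargs -I XXX echo TOOL XXX -o XXX
TOOL fileA -o fileA
TOOL fileB -o fileB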
The reason
each <arg1_value> | TOOL $1 -o $1
did not work, apart from each not being a command that I recognise, is that $1 expands to the first positional parameter of the current shell or function.
The following would have worked:
set -- "arg1_value"
TOOL "$1" -o "$1"
because that sets the value of $1 before calling your tool.
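If the values arrive on stdin, a plain-shell alternative to xargs (a sketch, assuming one value per line) is to read them in a loop and pass each one twice:
# Call TOOL once per input line, using the same value for both slots.
while IFS= read -r value; do
    TOOL "$value" -o "$value"
done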
You can re-run a shell to perform variable expansion, with sh -c. The -c option takes an argument which is the command to run in a shell, performing expansion. The next arguments of sh are interpreted as $0, $1, and so on, for use in the -c command. For example:
sh -c 'echo $1, i repeat: $1' foo bar baz will execute echo $1, i repeat: $1 with $1 set to bar ($0 is set to foo and $2 to baz), finally printing bar, i repeat: bar
The $1, $2, ..., $N parameters are only meaningful inside a bash script, where they refer to the arguments passed to that script, so they won't work the way you want here. Piping redirects stdout to stdin and is not what you are looking for either.
If you just want a one-liner, use something like
ARG1=hello && tool "$ARG1" "$ARG1"
Using GNU Parallel to use each value from STDIN four times, to print a multiplication table:
seq 5 | parallel 'echo {} \* {} = $(( {} * {} ))'
Output:
1 * 1 = 1
2 * 2 = 4
3 * 3 = 9
4 * 4 = 16
5 * 5 = 25
One could encapsulate the tool using awk:
$ echo arg1 arg2 | awk '{ system("echo TOOL " $1 " -o " $2) }'
TOOL arg1 -o arg2
Remove the echo within the system() call and TOOL should be executed in accordance with requirements:
echo arg1 arg2 | awk '{ system("TOOL " $1 " -o " $2) }'
Double up the data from a pipe with sed p (which prints every input line twice), and feed it to a command two lines at a time with xargs -L 2:
seq 5 | sed p | xargs -L 2 echo
Output:
1 1
2 2
3 3
4 4
5 5