Run a single test case with Maven in parallel [BASH]

I need to run the same test case n times and count how many times the test passes or fails. I wrote the following snippet of code:
CONCATNAME=$CLASSNAME"#"$METHODNAME
PASSTEST=0
FAILTEST=0
TOSEARCH="BUILD SUCCESS"
i=0
for i in $( seq 1 $NROUNDS )
do
echo "exec command: mvn -Dtest="$CONCATNAME" test"
message=$(mvn -Dtest=$CONCATNAME test)
if echo "$message" | grep -q "$TOSEARCH"; then
PASSTEST=$((PASSTEST+1))
else
FAILTEST=$((FAILTEST+1))
fi
done
but the execution takes several minutes to complete. Is it possible to run the mvn -Dtest command in parallel without modifying the POM.xml file?
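One way to speed this up without touching the POM is to launch each mvn run as a background job, send each run's output to its own log, and count passes afterwards. A rough sketch of that idea (the runlogs directory and log naming are my own; whether parallel runs can safely share the same target/ directory depends on your project):
CONCATNAME=$CLASSNAME"#"$METHODNAME
TOSEARCH="BUILD SUCCESS"
mkdir -p runlogs
for i in $( seq 1 $NROUNDS ); do
    # each run goes to the background with its own log file
    mvn -Dtest="$CONCATNAME" test > "runlogs/run-$i.log" 2>&1 &
done
wait   # block until every background mvn run has finished
PASSTEST=$(grep -l "$TOSEARCH" runlogs/run-*.log | wc -l)
FAILTEST=$((NROUNDS - PASSTEST))
echo "passed: $PASSTEST  failed: $FAILTEST"
If NROUNDS is large you may also want to cap how many runs start at the same time, along the lines of the throttling approach shown further below.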

Related

How to wait in bash till a shell script is finished?

Right now I'm using this script for a program:
export FREESURFER_HOME=$HOME/freesurfer
source $FREESURFER_HOME/SetUpFreeSurfer.sh
cd /home/ubuntu/fastsurfer
datadir=/home/ubuntu/moya/data
fastsurferdir=/home/ubuntu/moya/output
mkdir -p $fastsurferdir/logs # create log dir for storing nohup output log (optional)
while read p ; do
echo $p
nohup ./run_fastsurfer.sh --t1 $datadir/$p/orig.nii \
--parallel --threads 16 --sid $p --sd $fastsurferdir > $fastsurferdir/logs/out-${p}.log &
sleep 3600s
done < /home/ubuntu/moya/data/subjects-list.txt
Instead of using sleep 3600s (the program needs around an hour), I'd like to wait until all processes (several PIDs) have finished.
If this is the right way to do it, can you tell me how?
BR Alex
wait will wait for all background processes to finish (see help wait). So all you need is to run wait after creating all of the background processes.
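Applied to the loop from the question, that looks roughly like this (a sketch using the question's own paths; only the sleep is replaced by a single wait after the loop):
while read p ; do
    echo "$p"
    nohup ./run_fastsurfer.sh --t1 $datadir/$p/orig.nii \
        --parallel --threads 16 --sid $p --sd $fastsurferdir > $fastsurferdir/logs/out-${p}.log &
done < /home/ubuntu/moya/data/subjects-list.txt
wait   # returns once all of the background run_fastsurfer.sh processes have finished
Note that this starts every subject at the same time; the next answer adds a cap on how many run concurrently.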
This may be more than you are asking for, but I figured I would provide a method for controlling the number of threads you have running at once. I find that I always want to limit that number for various reasons.
Explanation
The following limits the number of concurrent threads to max_threads at any one time. I am also using a main-style design: main sets everything up and run_jobs handles launching and waiting. All of the subjects are read into an array, which is then traversed as the threads are launched. On each pass the loop either starts another thread (if fewer than max_threads are running) or sleeps for 10 seconds and checks again; once every subject has been launched, it waits for the remaining threads to finish. If you want something more simplistic I can do that as well.
#!/usr/bin/env bash
export FREESURFER_HOME=$HOME/freesurfer
source $FREESURFER_HOME/SetUpFreeSurfer.sh
typeset max_threads=4
typeset subjects_list="/home/ubuntu/moya/data/subjects-list.txt"
typeset subjectsArray
run_jobs() {
    local parent="$$"
    local num_children=0
    local i=0
    while [ "$i" -lt "${#subjectsArray[@]}" ] ; do
        # count the running children; subtract 1 for the $( ) subshell running ps itself
        num_children=$(ps --no-headers -o pid --ppid=$parent | wc -w) ; ((num_children-=1))
        echo "Children: $num_children"
        if [[ ${num_children} -lt ${max_threads} ]] ; then
            # RUN COMMAND HERE &
            ./run_fastsurfer.sh --t1 $datadir/${subjectsArray[$i]}/orig.nii \
                --parallel --threads 16 --sid ${subjectsArray[$i]} --sd $fastsurferdir &
            ((i+=1))
        fi
        sleep 10
    done
    # every subject has been launched; wait for the remaining background jobs to finish
    wait
}
main() {
    cd /home/ubuntu/fastsurfer
    datadir=/home/ubuntu/moya/data
    fastsurferdir=/home/ubuntu/moya/output
    mkdir -p $fastsurferdir/logs # create log dir (optional)
    mapfile -t subjectsArray < ${subjects_list}
    run_jobs
}
main
Note: I did not run this code since you have not provided enough information to actually do so.

Script executes but fails to increment

So I have this shell script that I think should run a given number of times, sleep, then resume, and output the results to a log file:
#!/bin/bash
log=/path/to/file/info.log
a=$(COMMAND1 | cut -d : -f 2)
b=$(COMMAND2 | grep VALUE| cut -c 7,8)
for i in {1..4}
do
echo "Test" $i >> $log
date >> $log
echo $a >> $log
echo "$((-113 + (($b * 2)))) VALUE" >> $log
sleep 60
done
When I run ps -ef | grep scriptname.sh it seems the script does run. It executes once, then the PID is gone as if the run has completed.
I have tested the script and know that it is running and capturing the data I want. But I do not understand why it's not incrementing, and I'm not sure why it's ending earlier than expected.
info.log output sample
Test {1..4}
DATE IN UTC
EXPECTED VALUE OF a
EXPECTED VALUE OF b
Note that the output is literally "Test {1..4}", not "Test 1", "Test 2", "Test 3", and so on, as I would expect.
I have run the script as ./scriptname.sh & and as /path/to/file/scriptname.sh &
I have read that there is a difference between running the script with sh and with bash, though I don't fully understand what effect that would have on the script. I am not a software person at all.
I have tried running the script with nohup to keep it running in the background if I close the terminal. I also thought the & in the command was supposed to keep the script running in the background. Still, it seems the script does not continue to run.
I previously asked this question and it was closed, citing that it was similar to a post about the difference between sh and bash... but that's not my main question.
Also, echo "$BASH_VERSION" returns nothing, just a blank line. echo "$-" returns smi, and I have no idea what that means. But bash --version returns:
BusyBox v1.17.1 (2019-11-26 10:41:00 PST) built-in shell (ash)
Enter 'help' for a list of built-in commands.
So my questions are:
If running the script with sh is done with ./scriptname.sh & and running it with bash is done with /path/to/file/scriptname.sh &, what effect does using sh versus bash have on how the script code is processed? I do not fully understand the difference between the two.
Why does the script not continue to run when I close the terminal? This is my big concern. I would like to run this script hourly for a set period of time, but every time I try something and come back, I get one instance in the log.
Neither brace expansion nor seq is part of the POSIX specification, and your shell is BusyBox ash rather than bash, so {1..4} is passed through literally and the loop body runs exactly once. Use a while loop instead.
log=/path/to/file/info.log
a=$(COMMAND1 | cut -d : -f 2)
b=$(COMMAND2 | grep VALUE| cut -c 7,8)
i=1
while [ "$i" -le 4 ]; do
printf 'Test %s\n' "$i"
date
printf '%s\n' "$a"
printf '%s\n' "$((-113 + (($b * 2)))) VALUE"
sleep 60
i=$((i+1))
done >> "$log"
(I suspect that you want to move the assignments to a and b inside the loop as well; right now, you are simply writing identical values to the log on each iteration.)
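That variant would look roughly like this (a sketch; COMMAND1 and COMMAND2 stand in for the commands from the question):
log=/path/to/file/info.log
i=1
while [ "$i" -le 4 ]; do
    # re-run the commands on every iteration so each log entry has fresh values
    a=$(COMMAND1 | cut -d : -f 2)
    b=$(COMMAND2 | grep VALUE | cut -c 7,8)
    printf 'Test %s\n' "$i"
    date
    printf '%s\n' "$a"
    printf '%s\n' "$((-113 + (($b * 2)))) VALUE"
    sleep 60
    i=$((i+1))
done >> "$log"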

question on using bwait to wait for multiple bsub jobs to finish

I am new to using LSF (been using PBS/Torque all along).
I need to write code/logic to make sure all bsub jobs finish before other commands/jobs can be fired.
Here is what I have done: I have a master shell script which calls multiple other shell scripts via bsub commands. I capture the job IDs from bsub in a log file, and I need to ensure that all jobs have finished before the master script executes its downstream commands.
Master shell script
#!/bin/bash
...Code not shown for brevity..
"Command 1 invoked with multiple bsubs" > log_cmd_1.txt
Need Code logic to use bwait before downstream Commands can be used
"Command 2 will be invoked with multiple bsubs" > log_cmd_2.txt
and so on
The stdout captured from Command 1 within the master shell script is stored in log_cmd_1.txt, which looks like this:
Submitting Sample 101
Job <545> is submitted to .
Submitting Sample 102
Job <546> is submitted to .
Submitting Sample 103
Job <547> is submitted to .
Submitting Sample 104
Job <548> is submitted to .
I have used the code block shown below after Command 1 in the master shell script.
However, it does not seem to work for my situation; it looks like I have gotten something wrong below.
while sleep 30m;
do
#the below gets the JobId from the log_cmd_1.txt and tries bwait
grep '^Job' <path_to>/log_cmd_1.txt | perl -pe 's!.*?<(\d+)>.*!$1!' | while read -r line; do res=$(bwait -w "done($line)"); echo $res; done 1> <path_to>/running.txt;
# the below sed command deletes blank (empty or whitespace-only) lines
sed '/^\s*$/d' running.txt > running2.txt;
# -s file check operator means "file is not zero size"
if [ -s $WORK_DIR/logs/running2.txt ]
then
echo "Jobs still running";
else
echo "Jobs complete";
break;
fi
done
The question: What's the correct way to do this using bwait within the master shell script.
Thanks in advance.
bwait will block until the condition is satisfied, so the loops are probably not necessary. Note that since you're using done, if the job fails then bwait will exit and inform you that the condition can never be satisfied. Make sure to check for that case.
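A loop-free sketch of that idea, reading the job IDs out of the log the same way the question does (<path_to> is the question's placeholder):
# one bwait per job ID; each call blocks until that job is done
for jobid in $(grep '^Job' <path_to>/log_cmd_1.txt | perl -pe 's!.*?<(\d+)>.*!$1!'); do
    if ! bwait -w "done($jobid)"; then
        echo "Job $jobid failed or the done($jobid) condition can never be satisfied"
    fi
done
echo "All Command 1 jobs have finished"
# ... downstream Command 2 bsubs can be fired here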
What you have should work. At least the following test worked for me.
#!/bin/bash
# "Command 1 invoked with multiple bsubs" > log_cmd_1.txt
( bsub sleep 0; bsub sleep 0 ) > log_cmd_1.txt
# Need Code logic to use bwait before downstream Commands can be used
while sleep 1
do
#the below gets the JobId from the log_cmd_1.txt and tries bwait
grep '^Job' log_cmd_1.txt | perl -pe 's!.*?<(\d+)>.*!$1!' | while read -r line; do res=$(bwait -w "done($line)");echo "$res"; done 1> running.txt;
# the below sed command deletes blank (empty or whitespace-only) lines
sed '/^\s*$/d' running.txt > running2.txt;
# -s file check operator means "file is not zero size"
if [ -s running2.txt ]
then
echo "Jobs still running";
else
echo "Jobs complete";
break;
fi
done
Another way to do it, which may be a little cleaner, is to use job arrays and job dependencies. A job array combines several pieces of work so that they can be managed as a single job. So your
"Command 1 invoked with multiple bsubs" > log_cmd_1.txt
could be submitted as a single job array. You'll need a driver script that can launch the individual jobs. Here's an example driver script.
$ cat runbatch1.sh
#!/bin/bash
# $LSB_JOBINDEX goes from 1 to 10
if [ "$LSB_JOBINDEX" -eq 1 ]; then
# do the work for job batch 1, job 1
...
elif [ "$LSB_JOBINDEX" -eq 2 ]; then
# etc
...
fi
Then you can submit the job array like this.
bsub -J 'batch1[1-10]' sh runbatch1.sh
This command will run 10 job array elements. The LSB_JOBINDEX variable in the driver script's environment tells each element which piece of the work it is running. Since the array has a name, batch1, it's easier to manage. You can submit a second job array that won't start until all elements of the first have completed successfully. The second array is submitted with this command:
bsub -w 'done(batch1)' -J 'batch2[1-10]' sh runbatch2.sh
I hope that this helps.

Exit build in Jenkins when script fails

I am running Jenkins as a CI/CD pipeline for a project. To make things easier for myself, I have created a bash script that runs the tests and sends the coverage report. Here is my bash script:
#!/bin/bash
echo $GIT_COMMIT # only needed for debugging
GIT_COMMIT=$(git log | grep -m1 -oE '[^ ]+$')
echo $GIT_COMMIT # only needed for debugging
./cc-test-reporter before-build
yarn test --coverage
./cc-test-reporter after-build -t simplecov --exit-code $? || echo "Skipping Code Climate coverage upload"
And this is how I am running it in Jenkins:
sh "jenkins/scripts/load_env_variables.sh test"
Jenkins runs the script; however, when the script fails, Jenkins does not fail the build, it just continues.
Any help with this, please?
Use "set -e" in script.
-e Exit immediately if a command exits with a non-zero status.
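Applied to the script from the question, a minimal sketch:
#!/bin/bash
set -e   # any command that exits non-zero now aborts the script, so the Jenkins sh step fails
echo $GIT_COMMIT # only needed for debugging
GIT_COMMIT=$(git log | grep -m1 -oE '[^ ]+$')
echo $GIT_COMMIT # only needed for debugging
./cc-test-reporter before-build
yarn test --coverage
./cc-test-reporter after-build -t simplecov --exit-code $? || echo "Skipping Code Climate coverage upload"
One thing to keep in mind: with set -e, a failing yarn test stops the script before the after-build upload runs at all; if you still want the coverage upload to happen on failure, capture the test's exit code explicitly and exit with it at the end instead.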

Running ant in my loop causes the loop to exit after one pass

I had such success with my last question, I decided I'd try again. The bash loop below is set up to iterate through a file and send messages to ant over and over again until it runs through the end of the file. When I change the command that runs ant to simply echo the command (for testing), it runs fine. When I remove the "echo" and the quotes around the command, it only runs through the loop once and exits gracefully. It would seem obvious that this has something to do with ant and perhaps an exit status, but I don't see why that would make it exit the loop instead of returning. The return code is always zero, by the way.
echo "Looping through database results and sending to ant..."
# This while loop runs through pendingtxs.result and funnels them to ant
while IFS=, read txid courseid instructorid
do
echo "Beginning substitution of $1 into file..."
sed -e "s/XXXXXXXXXX/$txid/" -e "s/YYYYYYYYYY/$courseid/" -e "s/ZZZZZZZZZZ/$instructorid/" createcourse_notif.template.xml >temp.xml
echo "Substitution complete."
echo "Sending the temp.xml to ant..."
/xncpkgs/ant/bin/ant sendMessage -Dsend.destination=SmsQueue -Dmessage.file=temp.xml
antReturnCode=$?
echo "ANT: Return code is: \""$antReturnCode"\""
echo "Ant is done"
echo "Adding the xml to log.txt for later analysis"
cat temp.xml >> log.txt
echo "Removing temp.xml"
rm temp.xml
echo "Submission of $txid complete."
done < pendingtxs.result
Cheers,
Stefano
ant is probably consuming stdin. Try running ant … < /dev/null
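In the loop above, that means giving the ant call an empty stdin so it cannot swallow the rest of pendingtxs.result (a sketch of just the changed line):
/xncpkgs/ant/bin/ant sendMessage -Dsend.destination=SmsQueue -Dmessage.file=temp.xml < /dev/null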
