Cancel all waiting jobs with qsub - cluster-computing

I have submitted a lot of jobs to qsub and I want to cancel all those not currently running. Is there a way to do this without knowing all the jobIDs?
The answer in this question prompted me to try
qselect -u username -s qw | xargs qdel
but this did not work, and I do not want to accidentally delete my currently running jobs.

The combination below worked for me; the output depends on the state of the job.
qstat -u username | grep "state of the job" | awk '{ print $1 }' | tr '\n' ' ' | xargs qdel
qstat -u username : displays all the jobs belonging to the given user.
grep "state of the job" : returns the lines containing the given job state (r, qw and so on).
awk '{ print $1 }' : prints the first column (in my case the first column is the job ID).
tr '\n' ' ' : replaces each newline character with a space.
xargs qdel : deletes the jobs.
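The pipeline above can be made safer by matching the state column exactly instead of grepping the whole line. A minimal sketch, run against an illustrative qstat listing (real qstat columns vary between schedulers, so check which column holds the state on your system first):

```shell
# Sketch: pull the IDs of waiting ("qw") jobs out of sample qstat output.
# The listing below is illustrative, not real scheduler output.
qstat_output='job-ID prior name user state submit-time
101 0.5 job_a alice r 10/24/2018
102 0.5 job_b alice qw 10/24/2018
103 0.5 job_c alice qw 10/24/2018'

# Compare the state column ($5 here) exactly rather than grepping the
# whole line, so a running job whose name contains "qw" is left alone.
waiting_ids=$(printf '%s\n' "$qstat_output" | awk '$5 == "qw" { print $1 }')
printf '%s\n' "$waiting_ids"
# To actually delete them: printf '%s\n' "$waiting_ids" | xargs -r qdel
```

The `xargs -r` flag (GNU xargs) skips running qdel entirely when no waiting jobs are found.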

How to pass a result from a Shell Script to the next step inside Automator

I am using Automator to create a Finder service that gets the total length of the selected videos and displays it in a dialog.
So, the service will work like this:
I select a bunch of videos
I right click and run the service.
I have found this Bash script on the web that works perfectly.
times=()
for f in *.mov; do
_t=$(ffmpeg -i "$f" 2>&1 | grep "Duration" | grep -o " [0-9:.]*, " | head -n1 | tr ',' ' ' | awk -F: '{ print ($1 * 3600) + ($2 * 60) + $3 }')
times+=("$_t")
done
echo "${times[@]}" | sed 's/ /+/g' | bc
I am trying to adapt that for automator. So, my service so far is equal to this:
I have a first step that receives the movie files from Finder and passes to this Run Shell Script
times=()
for f in "$@"; do
_t=$(ffmpeg -i "$f" 2>&1 | grep "Duration" | grep -o " [0-9:.]*, " | head -n1 | tr ',' ' ' | awk -F: '{ print ($1 * 3600) + ($2 * 60) + $3 }')
times+=("$_t")
done
total="${times[@]}" | sed 's/ /+/g' | bc
I was forced to change the for loop to this
for f in "$@"; do
I understand this is how automator enumerates all files received. Files are received as arguments.
I have changed the last line to
total="${times[@]}" | sed 's/ /+/g' | bc
To create a variable called total that can hold the total number of seconds of all videos.
Now I need to pass this variable to the next step and display it on a dialog.
Two questions:
how do I do that?
are the changes I did correct?
thanks
Yes, changing the for loop in your shell script from:
for f in *.mov; do
to
for f in "$@"; do
is correct. The "$@" expands to all the parameters passed to the shell script, which in your scenario will be the pathname of each selected movie file.
Now I need to pass this variable to the next step and display it on a dialog
To achieve this you need to:
echo the total at the end of the shell script. So change the last line in your second example shell script to the following:
times=()
for f in "$@"; do
_t=$(ffmpeg -i "$f" 2>&1 | grep "Duration" | grep -o " [0-9:.]*, " | \
head -n1 | tr ',' ' ' | awk -F: '{ print ($1 * 3600) + ($2 * 60) + $3 }')
times+=("$_t")
done
echo "${times[@]}" | sed 's/ /+/g' | bc # <-- change last line to this
Next in Automator add a Run AppleScript action after your current Run Shell Script action. To find the Run AppleScript action in Automator you can:
Select Library at the top of the panel/column on the left:
In the search field type: Run AppleScript and drag the Run AppleScript action into the canvas area below your current Run Shell Script action.
Enter the following AppleScript into the newly added Run AppleScript action:
on run {totalDuration}
set dialogText to (totalDuration as text) & " seconds"
tell application "Finder" to display dialog dialogText with title ¬
"Total Duration" buttons {"OK"} default button 1 with icon 1
end run
Example Automator Workflow:
The completed canvas area of your Automator Service/Workflow should now appear something like this:
Note:
I don't have the ffmpeg utility available on the Mac I'm currently using, so the shell script shown in the screenshot above uses the built-in mdls utility to obtain the duration of each movie instead.
Here is that code:
total_duration=0
for f in "$@"; do
duration=$(mdls -name kMDItemDurationSeconds -raw -nullMarker 0 "$f")
total_duration=$(echo "$total_duration" + "$duration" | bc)
done
echo "$total_duration"
The other minor difference in that screenshot is the code shown in the Run AppleScript action; it just does some rounding, which is probably not necessary with the shell script you want to use. The AppleScript shown in point 3 above should be fine.
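The `echo ... | sed 's/ /+/g' | bc` summation used above can also be done with awk alone, which avoids the bc dependency. A minimal sketch with made-up durations standing in for the ffmpeg values:

```shell
# Hypothetical durations in seconds, standing in for the ffmpeg output
times=(12.5 30.25 7)

# One value per line into awk, which handles the floating-point sum
total=$(printf '%s\n' "${times[@]}" | awk '{ s += $1 } END { print s }')
echo "$total"
```

Using `printf '%s\n'` instead of `echo` keeps each duration on its own line, so no sed substitution is needed to build an expression.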

Parsing named parameter values using AWK

I am trying to come up with a script/alias that would quickly give me the list of processes run by an application. The parameters used to initiate the process by the application are named parameters and not positional
I need to extract the parameter values of -u, -s and -svn
$ ps -ef | grep pmdtm | grep -v grep
infa_adm 24581 31146 0 Oct24 ? 00:09:28 pmdtm -PR -gmh dhvifoapp03
-gmp 6015 -guid ddcbd7ab-2ed0-4696-aea3-01573968b1bc -rst 300
-s Address_Validator:wf_AddressValidator.s_m_AddressValidatorS
-dn Session task instance [s_m_AddressValidatorS] -c pmserver.cfg
-run 68_4262_654212_4_0_0_0_3263_77_2_2018_10_24___13_32_47_182
-u Administrator -uns Native -crd rlVuBI4mUFi1V/7/jyrD6f9dMurwD9Yxddio6KDy/
zwlzM5rRDMeV766VoSBqb3Snjlvu849sTXlWpJ8WjzPomNOF4U87H7x5oy
JKbtxVg/vjR6gPwWwVSdEHvPjlpwSKPcuDx6glCbB1ksrvKCAzRsW1BTlP
GOfQbnd1ptnkO83iY14k4LUpJlx8+upBhwSxk9a0TPD44byO+/4Qhe7Mg==
-svn Int01_dev -dmn Domain_dev
-nma https://DHVIFOAPP03.RENTERS-CHOICE-INC.COM:6005
-nmc w/Yt3IIMbmBQf+NnN1CAKmq5ab01nxZTJEA/YCf96Pb5zT9K9VFBO4+Nvqt
FuF8gzvqf/qHbw2tcXk4DnNP4m5vJvuEhxe9vQCN8pmpJytiZKV9Np7rBbapVzra
9TEOQVm9webRg8JZB70MQryVjQlGkJDpRs9cdOCXAu1aFhNE6LNF+
c5qhLdOz/vWCI3I2 -sid 3
-hhn dhvifoapp04.renters-choice-inc.com -hpn 15555
-hto 60 -rac 0 -SSL
-rsn RAC_dev ServiceResilienceTimeout=300
I am able to extract it for a single field using the following command, but how do I get multiple values?
$ echo "List of running jobs ==> "; ps -ef | grep pmdtm | grep -v grep | awk -F"-s " "{print \$2}"|awk -F" " "{print \$1}"
List of running jobs ==>
Address_Validator:wf_AddressValidator.s_m_AddressValidatorS
Desired output =
List of running jobs ==>
Address_Validator:wf_AddressValidator.s_m_AddressValidatorS | Administrator | Int01_dev
You can do multiple "OR" expressions in grep with something like this:
grep -E "^-s|^-u|^-svn" < file.txt
The above will only print out the lines that start out with -s, -u or -svn. Based on that, the following command does exactly what you want:
echo "List of running jobs ==> " $(ps -ef | grep pmdtm | grep -v grep | grep -E "^-s|^-u|^-svn" | awk '{ print $2 " |" }')
Running the contents of your post through the above command, I get this output:
List of running jobs ==> Address_Validator:wf_AddressValidator.s_m_AddressValidatorS | Administrator | Int01_dev |
You get a trailing | at the end, but you can trim that out separately.
Updated:
After your comment below, I updated the command to do exactly what you need.
echo -e "List of running jobs ==> \n " $(ps -ef | grep pmdtm | grep -v grep | awk 'BEGIN { RS = " -"} $1 ~ /^s$|^u$|^svn$/ { print $2,"|"}')
It does assume a couple of things:
All the named parameters have non-empty values; otherwise it will simply output a blank.
All the named parameters start with a -, immediately followed by the parameter name itself.
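An alternative that does not depend on how ps wraps its output is to walk the fields pairwise and take the value following each wanted flag. A sketch against a shortened, made-up version of the pmdtm command line:

```shell
# Shortened stand-in for the pmdtm command line (values are made up)
line='pmdtm -PR -s Address_Validator:wf.s_m_AV -u Administrator -uns Native -svn Int01_dev -dmn Domain_dev'

# Walk the fields; when one is exactly -s, -u or -svn, grab the next field.
# "-uns" is skipped because the comparison is against the whole field.
jobs=$(printf '%s\n' "$line" |
  awk '{ sep = ""
         for (i = 1; i < NF; i++)
           if ($i == "-s" || $i == "-u" || $i == "-svn") {
             out = out sep $(i + 1); sep = " | "
           }
         print out }')
echo "$jobs"
```

Because each flag is compared as a whole field, this also avoids the false match that a regex like `-s` would get on `-svn` or `-sid`.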

Oneline file-monitoring

I have a logfile continuously filling with stuff.
I wish to monitor this file, grep for a specific line and then extract and use parts of that line in a curl command.
I had a look at How to grep and execute a command (for every match)
This would work in a script but I wonder if it is possible to achieve this with the oneliner below using xargs or something else?
Example:
Tue May 01|23:59:11.012|I|22|Event to process : [imsi=242010800195809, eventId = 242010800195809112112, msisdn=4798818181, inbound=false, homeMCC=242, homeMNC=01, visitedMCC=238, visitedMNC=01, timestamp=Tue May 12 11:21:12 CEST 2015,hlr=null,vlr=4540150021, msc=4540150021 eventtype=S, currentMCC=null, currentMNC=null teleSvcInfo=null camelPhases=null serviceKey=null gprsenabled= false APNlist: null SGSN: null]|com.uws.wsms2.EventProcessor|processEvent|139
Extract the fields I want and semi-colon separate them:
tail -f file.log | grep "Event to process" | awk -F'=' '{print $2";"$4";"$12}' | tr -cd '[[:digit:].\n.;]'
Curl command, e.g. something like:
http://user:pass@www.some-url.com/services/myservice?msisdn=...&imsi=...&vlr=...
Thanks!
Try this:
tail -f file.log | grep "Event to process" | awk -F'=' '{print $2" "$4" "$12}' | tr -cd '[:digit:]. \n' | while read -r imsi msisdn vlr ; do curl "http://user:pass@www.some-url.com/services/myservice?msisdn=$msisdn&imsi=$imsi&vlr=$vlr" ; done
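The extraction step can be checked on a static copy of the log line before wiring it to tail -f. Note that splitting on "=" puts the imsi in $2, the msisdn in $4 and the vlr in $12, so the order of the variables in `read` has to match. A sketch (the line is truncated to the fields used here):

```shell
# A static copy of the log line, so the extraction can be tested
# without a live tail -f (truncated to the fields used here)
logline='Tue May 01|23:59:11.012|I|22|Event to process : [imsi=242010800195809, eventId = 242010800195809112112, msisdn=4798818181, inbound=false, homeMCC=242, homeMNC=01, visitedMCC=238, visitedMNC=01, timestamp=Tue May 12 11:21:12 CEST 2015,hlr=null,vlr=4540150021, msc=4540150021 eventtype=S]'

# Splitting on "=" puts the imsi in $2, the msisdn in $4 and the vlr
# in $12; tr then strips everything except digits, spaces and newlines.
result=$(printf '%s\n' "$logline" |
  awk -F'=' '{ print $2, $4, $12 }' |
  tr -cd '[:digit:] \n' |
  while read -r imsi msisdn vlr; do
    echo "msisdn=$msisdn imsi=$imsi vlr=$vlr"
  done)
echo "$result"
```

When the input really is a live pipe from tail -f, GNU grep may also need `--line-buffered` so that matches reach the loop as soon as they are logged rather than in 4 KB chunks.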

Create a histogram or frequency list of most popular commands used in bash session

Is there an easy way to collect the most frequently used commands used in a bash session?
If not, by what means can I start to write a script or run a background process to achieve this?
For example I would have a report I could generate in a session that would look like
cd 25%
ls 40%
cat 35%
This one produces output close to the intended format:
history | awk '($2 ~ /^[[:alnum:]]+$/) { ++a[$2]; t = length($2); if (t > l) l = t; } END { for (i in a) printf("%s%" (l - length(i) + 1) "s%5.2f%%\n", i, " ", (a[i] * 100 / NR)); }'
Example output:
...
cd 6.00%
ls 12.00%
cat 1.60%
...
You could also sort it with ... | sort -n -k2 or ... | sort -n -k2 -r.
You can try something like the following
history | awk '{print $2}' | sort | uniq -c | sort -n
If you frequently use pipes like the one above, you would need a more elaborate parser to also count the commands that appear after the pipes, not just the first one on each line.
You can parse the output of the history command, and count occurrences of the command field.
history | awk '{print $2}' | sort | uniq -c | sort -n
(Use $4 instead of $2 if HISTTIMEFORMAT is set and history prints date and time columns before the command.) This will print a list of executed commands and the number of times each was executed. Then, you can fetch the total number of commands executed with history | wc -l, and perform the calculations.
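The count-then-percentage step can be checked without touching your real history by feeding a fixed command list through the same pipeline. A sketch (the list stands in for the command column of history output):

```shell
# Stand-in for the command column extracted from `history` output
commands='cd
ls
ls
cat
ls
cd'

# Count each command, then turn the counts into rounded percentages
report=$(printf '%s\n' "$commands" |
  sort | uniq -c |
  awk '{ counts[$2] = $1; total += $1 }
       END { for (c in counts)
               printf "%s %.0f%%\n", c, counts[c] * 100 / total }' |
  sort)
echo "$report"
```

The final `sort` is there because awk's `for (c in counts)` iterates in an unspecified order.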

sed, environment variable and date problem

I want to add a timestamp to server events and store the result in a log.
My first idea was :
( ./runServer.sh ) | sed "s/.*/`date +%s` & /" | xargs -0 >Server.log 2>&1 &
But it seems sed never reevaluates the date, so all events get the same timestamp.
Now I'm trying to get around that using environment variable but I can't find a proper way to do it.
I have this obviously wrong line below :
( ./runServer.sh ) | xargs -0 'export mydate=`date +%s` ; sed "s/.*/$mydate & /"' >Server.log 2>&1 &
Any hints? Thanks.
Try this:
<command> | awk '{ print strftime("%Y-%m-%d %H:%M:%S"), $0; }'
Source: Is there a Unix utility to prepend timestamps to lines of text?
Do it step by step, and use $() instead of backticks. Try this (not tested):
timestamp=$(date +%s)
./runServer.sh | sed "s/.*/$timestamp & /" | .......
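The reason the sed versions stamp every line with the same time is that the backtick (or $()) substitution runs once, before the pipeline starts. A plain read loop evaluates date once per line instead. A sketch, shown on static input standing in for the server:

```shell
# A read loop evaluates date once per incoming line, unlike the sed
# versions where the substitution runs once before the pipe starts.
# printf '%s\n' stands in for ./runServer.sh here.
stamped=$(printf '%s\n' "server started" "client connected" |
  while IFS= read -r line; do
    printf '%s %s\n' "$(date +%s)" "$line"
  done)
echo "$stamped"
```

In the real setup this would be `./runServer.sh 2>&1 | while IFS= read -r line; do ... done > Server.log`.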
