qsub job submission with a script that requires a flag argument

I have a script called proc.sh that has a flag option that requires an ID. To run it, I do this:
./proc.sh -s id
I am working with the TORQUE resource manager (based on OpenPBS), so I initially tried to submit this as a job with the following command, which didn't work:
qsub -V -k o -l nodes=1:ppn=2,walltime=10:00:00 proc.sh -F id
I've been told that I can use the -v option, but I'm not sure how to use it properly in this case. Would this be the proper way?
qsub -V -k o -l nodes=1:ppn=2,walltime=10:00:00 -v "s=id" proc.sh
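For context on the difference between the two options (a sketch, assuming a TORQUE version whose qsub supports -F for script arguments): -v injects environment variables into the job rather than passing flags, so the script would have to read $s itself, while -F passes real command-line arguments and goes before the script name.

# -v sets an environment variable; proc.sh would need to read "$s" instead of parsing -s
qsub -V -k o -l nodes=1:ppn=2,walltime=10:00:00 -v "s=id" proc.sh

# -F passes actual arguments to the script, quoted as a single string
qsub -V -k o -l nodes=1:ppn=2,walltime=10:00:00 -F "-s id" proc.sh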

How to pass parameter expansions into qsub?

I'm trying to use qsub to submit multiple parallel jobs, but I'm running into trouble with passing parameter substitutions into qsub. I'm using the -V option, but it doesn't seem to recognize what ${variable} is. Here's some code I tried running:
qsub -cwd -V -pe shared 4 -l h_data=8G,h_rt=00:10:00,highp -N bt2align3 -b y "projPath="$SCRATCH/CUTnTag/data_kayaokur2020"; sample="K4m3_rep1"; cores=8;
bowtie2 --end-to-end --very-sensitive --no-mixed --no-discordant --phred33 -I 10 -X 700
-p ${cores}
-x ${projPath}/bowtie2_index/GRCh38_noalt_analysis/GRCh38_noalt_as
-1 ${projPath}/raw_fastq/${sample}_R1.fastq.gz
-2 ${projPath}/raw_fastq/${sample}_R2.fastq.gz
-S ${projPath}/alignment/sam/${sample}_bowtie2.sam &> ${projPath}/alignment/sam/bowtie2_summary/${sample}_bowtie2.txt"
I just get an error that says "Invalid null command."
Is qsub not able to recognize parameter expansions? Is there a different syntax I should be using? Thanks.
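One common fix, sketched below (the -cwd/-pe flags suggest Grid Engine qsub, and align.sh is a hypothetical wrapper name): the double quotes around the projPath value terminate the outer quoted string early, which is the kind of breakage that can produce shell errors like "Invalid null command". Moving the pipeline into its own script sidesteps the nested quoting, and -V still carries $SCRATCH into the job:

#!/bin/bash
# align.sh -- hypothetical wrapper around the bowtie2 call
projPath="$SCRATCH/CUTnTag/data_kayaokur2020"
sample="K4m3_rep1"
cores=8
bowtie2 --end-to-end --very-sensitive --no-mixed --no-discordant --phred33 -I 10 -X 700 \
  -p ${cores} \
  -x ${projPath}/bowtie2_index/GRCh38_noalt_analysis/GRCh38_noalt_as \
  -1 ${projPath}/raw_fastq/${sample}_R1.fastq.gz \
  -2 ${projPath}/raw_fastq/${sample}_R2.fastq.gz \
  -S ${projPath}/alignment/sam/${sample}_bowtie2.sam &> ${projPath}/alignment/sam/bowtie2_summary/${sample}_bowtie2.txt

Submitted without -b y:
qsub -cwd -V -pe shared 4 -l h_data=8G,h_rt=00:10:00,highp -N bt2align3 align.sh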

How to ssh to a server and get CPU and memory details?

I am writing a shell script where I want to ssh to a server and get its CPU and memory details displayed as a result. I'm using the top command here.
Script line:
ssh -q user@host -n "cd; top -n 1 | egrep 'Cpu|Mem|Swap'"
But the result is
TERM environment variable is not set.
I checked the same on the server by entering set | grep TERM and got TERM=xterm as the result.
Can someone please help me with this? Many thanks.
Try using the top -b flag:
ssh -q user@host -n "cd; top -bn 1 | egrep 'Cpu|Mem|Swap'"
This tells top to run non-interactively, and is intended for this sort of use.
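As a follow-up, a minimal sketch of capturing that output in the calling script (the host name and labels are assumptions, and the exact line formats depend on the remote top version):

#!/bin/bash
# Run top once in batch mode on the remote host and keep the interesting lines
stats=$(ssh -q user@host -n "top -bn 1 | egrep 'Cpu|Mem|Swap'")
echo "CPU:    $(echo "$stats" | grep 'Cpu')"
echo "Memory: $(echo "$stats" | grep 'Mem')"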
top needs a terminal. You have to add the -t parameter to get the result:
ssh -t user@host -n "top -n 1 | egrep 'Cpu|Mem|Swap'"
Got it..!! A small modification is needed to the script line below:
ssh -t user@host -n "top -n 1 | egrep 'Cpu|Mem|Swap'"
Instead of -t we need to give -tt. It worked for me.
top requires a tty to run after ssh'ing in, and -tt forces pseudo-tty allocation even though ssh is running a command rather than an interactive session.
Thanks stony for providing me a close enough answer!! :)

Run cassandra queries from command line

I want to execute CQL queries from the bash command line.
[cqlsh 3.1.8 | Cassandra 1.2.19 | CQL spec 3.0.5 | Thrift protocol 19.36.2]
[root@hostname ~]# /opt/apache-cassandra-1.2.19/bin/cqlsh -k "some_keyspace" -e "SELECT column FROM Users where key=value"
I got:
cqlsh: error: no such option: -e
Options:
--version show program's version number and exit
-h, --help show this help message and exit
-C, --color Always use color output
--no-color Never use color output
-u USERNAME, --username=USERNAME
Authenticate as user.
-p PASSWORD, --password=PASSWORD
Authenticate using password.
-k KEYSPACE, --keyspace=KEYSPACE
Authenticate to the given keyspace.
-f FILE, --file=FILE Execute commands from FILE, then exit
-t TRANSPORT_FACTORY, --transport-factory=TRANSPORT_FACTORY
Use the provided Thrift transport factory function.
--debug Show additional debugging information
--cqlversion=CQLVERSION
Specify a particular CQL version (default: 3.0.5).
Examples: "2", "3.0.0-beta1"
-2, --cql2 Shortcut notation for --cqlversion=2
-3, --cql3 Shortcut notation for --cqlversion=3
Any suggestions?
First of all, you should seriously consider upgrading. You are missing out on a lot of new features and bug fixes.
Secondly, with cqlsh in Cassandra 1.2 you can use the -f flag to specify a file containing cql statements:
$ echo "use system_auth; SELECT role,is_superuser FROM roles WHERE role='cassandra';" > userQuery.cql
$ bin/cqlsh -u aploetz -p reindeerFlotilla -f userQuery.cql
 role      | is_superuser
-----------+--------------
 cassandra |         True

(1 rows)
You can use -f to execute from a file or SOURCE once you start CQLSH. I don't think -e is a valid option with that version.
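For the SOURCE route, a quick sketch (reusing the file from the example above):

$ bin/cqlsh -k "some_keyspace"
cqlsh:some_keyspace> SOURCE 'userQuery.cql';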
It's a bit dirty and unstable, but here is the answer:
/opt/apache-cassandra-1.2.19/bin/cqlsh -k "keyspace" -f /path/to/file.cql > /path/to/output.txt
tail -2 /path/to/output.txt | head -1 > /path/to/output-value.txt
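The tail -2 | head -1 step is meant to pull the data row from just above the "(1 rows)" footer; since cqlsh's exact spacing varies, that offset may need adjusting, which is why this approach is called unstable.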

bsub option confused with job arguments

I want to submit a job to LSF using the bsub command. One of the job's arguments is "-P argument_1", so the overall command looks like
bsub -P project_name -n 4 -W 10:00 my_job -P argument_1
But bsub treats -P argument_1 as the project_name instead of as an argument to my_job.
Is there anyway to resolve this issue?
What version of LSF are you using? You can check by running lsid. Try quoting your command and see if that helps:
bsub -P project_name -n 4 -W 10:00 "my_job -P argument_1"
Use a submission script, script.sh, that includes my_job -P placeholder_arg1. Then use
sed 's/placeholder_arg1/argument_1/g' < script.sh | bsub
to replace the command-line argument on the fly before submitting the job.
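A sketch of what that script.sh could contain (the #BSUB embedded-directive form is how bsub reads options from a script on stdin; the resource values here just mirror the original command):

#!/bin/sh
#BSUB -P project_name
#BSUB -n 4
#BSUB -W 10:00
my_job -P placeholder_arg1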

How to combine 3 commands into a single process for runit to monitor?

I wrote a script that grabs a set of parameters from two sources using wget commands, stores them in variables, and then executes a video transcoding process based on the retrieved parameters. Runit was installed to monitor the process.
The problem is that when I try to stop the process, runit doesn't know that only the last transcoding process needs to be stopped, and therefore it fails to stop it.
How can I combine all the commands in bash script to act as a single process/app?
The commands are something like the following:
wget address/id.html
res=$(cat res_id | grep id.html)
wget address/length.html
time=$(cat length_id | grep length.html)
/root/bin -i video1.mp4 -s $res.....................
Try wrapping them in a shell:
sh -c '
wget address/id.html
res=$(grep id.html res_id)
wget address/length.html
time=$(grep length.html length_id)
/root/bin -i video1.mp4 -s $res.....................
'
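If this is going into a runit service, a related sketch (the service path is hypothetical, and the trailing ... stands in for the elided options above): ending the run script with exec replaces the shell with the transcoder itself, so runit supervises and signals the right process:

#!/bin/sh
# hypothetical /etc/sv/transcode/run
wget address/id.html
res=$(grep id.html res_id)
wget address/length.html
time=$(grep length.html length_id)
# exec makes the transcoder the supervised process, so sv stop reaches it
exec /root/bin -i video1.mp4 -s "$res" ...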
