I have a batch file which looks like this, executing the scripts in parallel:
start sqlplus user/pwd@tnsid @myscript1.sql &
start sqlplus user/pwd@tnsid @myscript2.sql &
start sqlplus user/pwd@tnsid @myscript3.sql
What would be the way to wait until the three of them are finished and then execute yet another command:
sqlplus user/pwd@tnsid @myscript4.sql
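In a Unix shell (a sketch assuming the same connection string and script names as above, rather than a Windows batch file), you can background all three sessions and use the shell built-in wait before launching the fourth:
#!/bin/bash
# run the first three scripts in parallel
sqlplus user/pwd@tnsid @myscript1.sql &
sqlplus user/pwd@tnsid @myscript2.sql &
sqlplus user/pwd@tnsid @myscript3.sql &
# block until all background jobs have finished
wait
# only now run the fourth script
sqlplus user/pwd@tnsid @myscript4.sql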
Related
I'm calling PL/SQL code from a shell script, and it takes a while to finish. But after the PL/SQL code completes, the shell script session is not terminated and still shows as active when I type ps -aef | grep 'SQL*PLUS'. Any suggestion as to why the shell script session is not terminating?
Note: the PL/SQL code itself has no issues and runs successfully. I use a cron job to run the shell script.
The shell script does terminate when there is less data to process.
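One thing worth ruling out, purely as a guess, is that the SQL script never issues EXIT, in which case SQL*Plus may sit waiting for more input instead of closing the session. A minimal sketch (the script name long_plsql_job.sql is hypothetical) that forces the session to end as soon as the job completes:
#!/bin/bash
# feed the script via a here-document and finish with an explicit EXIT,
# so SQL*Plus terminates as soon as the PL/SQL block completes
sqlplus -s user/pwd@tnsid <<EOF
@long_plsql_job.sql
EXIT;
EOF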
On a Unix machine (Solaris), after I switched to user oracle like so:
su - oracle
I started a program from the command line that does the following:
for i in $(seq 1 900); do while true ; do printf "select * from dual;\n" | sqlplus <user>/<password>; done &>/dev/null & done
I closed the terminal and the program keeps running in the background, as expected.
How do I terminate the program if I don't have its PID?
You can get the PID of that job with $!, write it out to a file and read it from there later.
To kill all processes with a command of sqlplus, try:
pkill sqlplus
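A sketch of the $!-to-a-file approach applied to that loop (the path /tmp/sqlplus_loops.pids is just an illustration):
#!/bin/bash
for i in $(seq 1 900); do
    # start one endless sqlplus loop in the background, as in the original command
    while true; do printf "select * from dual;\n" | sqlplus <user>/<password>; done &>/dev/null &
    # append the PID of that background loop to a file for later cleanup
    echo $! >> /tmp/sqlplus_loops.pids
done
# later, stop every loop recorded in the file
kill $(cat /tmp/sqlplus_loops.pids)
Killing those PIDs stops the loops; any SQL*Plus process still running at that moment can then be cleaned up with pkill sqlplus as above.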
In Jenkins, I have created a job which runs many shell script commands:
command1
command2
...etc
command1 is an ssh command which calls a shell script file on another server. I have to wait until it has finished, and only AFTER that should command2 run.
So, how can I make sure that the script file on the other machine, started by command1, has already finished its work when the Jenkins job starts the next command (command2)?
Or, alternatively, how can I make sure that command2 won't be started until the shell script on the other machine (started by command1) has already finished?
You can check out "How to send many commands to shell and wait for the command behind ends" in order to chain commands and wait for their completion.
When you execute a command through an ssh session, you might have to wrap that command in a script that can wait for the command's completion.
See an example in "How can I make ssh wait until the command exits?".
Or (a simpler wrapper): How do I know when a command run over ssh has finished?
#!/bin/bash
# run the command passed as arguments, then print a marker line
"$@"
echo "==== Command Output Finished ===="
Look for the string ==== Command Output Finished ==== in your I/O routines to determine where the boundaries between command outputs are.
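As a rough usage sketch (the host name and paths are placeholders), the Jenkins build step could then look like this; ssh itself blocks until the remote command exits, so command2 only starts afterwards:
# run the remote script through the wrapper; ssh returns when it finishes
ssh user@remote-server '/path/to/wrapper.sh /path/to/long_script.sh'
command2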
Or you can try isolating those commands in their own Jenkins shell build step.
(Not a different job, just a different build step within the same job)
If I have a UNIX shell script which has some program on each line that needs to be run, like
#!/bin/bash
command1
command2
command3
command4
will command2 execute only after command1 finishes, or are they run in parallel without waiting for the previous command to finish, since each command is a separate process?
The commands are run serially. To run them in parallel, append & to each line, then use wait to block until they have all finished:
#!/bin/bash
command1&
command2&
command3&
command4&
wait
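If you also need each command's exit status, wait accepts a PID and returns that job's status; a short sketch (the variable names are just illustrative):
#!/bin/bash
command1 & pid1=$!
command2 & pid2=$!
# waiting on a specific PID makes $? the exit status of that command
wait "$pid1"; status1=$?
wait "$pid2"; status2=$?
echo "command1 exited with $status1, command2 exited with $status2"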
I have the following ksh script:
sqlplus usr1/pw1@DB1 @$DIR/a.sql $1 &
sqlplus usr2/pw2@DB2 @$DIR/b.sql $1 &
wait
echo "Done!"
Where $DIR is a variable with the absolute path where a.sql and b.sql are.
For some time, I've been running this script daily and it works fine.
The intention is that both SQL*Plus sessions go to the background and
execute in parallel, and when they finish I can continue with the
following steps of the application.
Since it's not a test version anymore, I scheduled it on the crontab
to execute daily. The problem I have now is that it won't pause on
"wait" and let the sqlplus sessions finish, but it directly outputs
"Done!". In the real app, that "echo Done!" is actually a call to
another program to do some processing on a.sql's and b.sql's output.
But since it's not waiting for both sql scripts to actually finish,
the processing cannot be done.
It works absolutely perfectly when I run it myself, whether I do it from the local directory or from the root (as crontab would do it). But when it is executed automatically by crontab, it doesn't stop at wait and screws the whole thing up.
Any ideas on what might be happening? Thanks!
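One way to narrow this down is to log what actually happens when cron runs the script: cron uses a much smaller environment than an interactive login (different PATH, usually no ORACLE_HOME), so it is worth checking whether the sqlplus commands start at all and what they exit with. A diagnostic sketch (the log path is only illustrative):
#!/bin/ksh
exec >> /tmp/parallel_sql.log 2>&1   # capture everything when run from cron
date
env | sort                           # see what environment cron actually provides
sqlplus usr1/pw1@DB1 @$DIR/a.sql $1 & pid_a=$!
sqlplus usr2/pw2@DB2 @$DIR/b.sql $1 & pid_b=$!
wait $pid_a; echo "a.sql finished with status $?"
wait $pid_b; echo "b.sql finished with status $?"
echo "Done!"
If sqlplus fails immediately under cron (for example because it is not on cron's PATH), both background jobs exit right away and wait has nothing left to wait for, which would look exactly like the script skipping the wait.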