How to avoid a series of commands running under SQL*Plus using a shell script - shell

I have created a script which runs a series of jobs one after another in a shell while loop. The problem is that one of the jobs connects to SQL*Plus, and the remaining jobs end up running as commands inside SQL*Plus.
For example, my input file job.txt contains:
job1
job2
job3
Now, using the script below, I call the jobs one by one, so the next job won't start until the present one is finished. The catch comes when a job connects to SQL*Plus: SQL*Plus inherits the loop's standard input (job.txt), so the remaining job names are read by SQL*Plus as SQL statements in the Unix environment, and the loop finds the file exhausted.
while read -r line; do
    "$line"        # run the job named on this line
done < job.txt
This is the error message I get in SQL*Plus after the current instance exits:
SP2-0042: unknown command
How can I keep the remaining jobs from being pulled into SQL*Plus?
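A common fix (a minimal sketch, not from the original thread) is to detach each job's standard input from the loop's input, either by giving each job an empty stdin or by reading the job list on a separate file descriptor:

# Option 1: redirect each job's stdin to /dev/null so SQL*Plus cannot read job.txt
while read -r line; do
    "$line" < /dev/null
done < job.txt

# Option 2: read the job list on file descriptor 3, leaving stdin untouched
while read -r line <&3; do
    "$line"
done 3< job.txt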

Related

Shell Script executing Oracle PL/SQL code

I'm calling PL/SQL code from a shell script. The PL/SQL takes some time to finish, but after the PL/SQL code completes, the shell script session is not terminated and still shows as active when I type ps -aef | grep 'SQL*PLUS'. Any suggestion as to why the shell script session is not terminated?
Note: the PL/SQL code has no issues and runs successfully. I have used a cron job to run the shell script.
The shell script does terminate when there is less data to process.
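One possible cause (an assumption, not confirmed in the thread) is that the SQL*Plus session is never told to exit, so it keeps waiting for input after the PL/SQL block finishes. A minimal sketch of a wrapper that exits cleanly, where user/password@db and my_long_running_proc are placeholders:

#!/bin/sh
# Hypothetical wrapper; credentials and procedure name are placeholders.
sqlplus -s user/password@db <<'EOF'
WHENEVER SQLERROR EXIT SQL.SQLCODE
BEGIN
  my_long_running_proc;
END;
/
EXIT
EOF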

HBase export task mysteriously stopped logging to output file

I recently attempted to export a table from an HBase instance using a 10-data-node Hadoop cluster. The command line looked like the following:
nohup hbase org.apache.hadoop.hbase.mapreduce.Export documents /export/documents 10 > ~/documents_export.out &
As you can see, I ran the process under nohup so it wouldn't die prematurely when my SSH session closed, and I put the whole thing in the background. To capture the output, I redirected it to a file.
As expected, the process started to run, and in fact ran for several hours before the output mysteriously stopped appearing in the file. It stopped at about 31% through the map phase of the MapReduce job. However, according to Hadoop, the MapReduce job itself was still going and in fact ran to completion by the next morning.
So, my question is: why did output stop going to my log file? My best guess is that the parent HBase process I invoked exited normally once it was done with the initial setup for the MapReduce job involved in the export.
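One thing worth checking (an assumption; the thread doesn't confirm it) is that the redirection above captures only stdout, while the Hadoop client usually writes its progress messages to stderr. Redirecting both streams keeps everything in the file:

nohup hbase org.apache.hadoop.hbase.mapreduce.Export documents /export/documents 10 > ~/documents_export.out 2>&1 &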

Why does scheduling Spark jobs through cron fail (while the same command works when executed on terminal)?

I am trying to schedule a Spark job using cron.
I have written a shell script and it executes fine from the terminal.
However, when I execute the script using cron, it gives me an "insufficient memory to start JVM thread" error.
Every time I start the script from the terminal there is no issue; the issue only appears when the script is started by cron.
Any suggestions would be appreciated.
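A frequent cause (an assumption; the thread doesn't state the fix) is that cron starts jobs with a minimal environment, so JAVA_HOME, SPARK_HOME, PATH, and ulimit settings differ from an interactive shell. A minimal sketch of a wrapper that restores the interactive environment before submitting; all paths and app.jar are placeholders:

#!/bin/bash
# Hypothetical wrapper invoked from cron; paths below are placeholders.
source /home/sparkuser/.bash_profile    # pull in JAVA_HOME, SPARK_HOME, ulimits, etc.

/opt/spark/bin/spark-submit \
  --master yarn \
  --driver-memory 2g \
  --executor-memory 2g \
  /home/sparkuser/app.jar >> /home/sparkuser/spark_cron.log 2>&1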

In a Linux script, is it possible to execute multiple commands in the same process?

I have a script that contains:
db2 connect to user01
db2 describe indexes for table table_desc
What I figure is happening is that the process executing the first line is different from the process running the second line. This means the process running the first line gets the connection, while the process running the second line has no connection at all. This is confirmed by the error I get at the second line, saying that no database connection exists.
Is it possible to have the same process run both commands? Or at least a way to "join" the first process to the second?
If you want both instructions to run in the same process, you need to write them to a script:
$ cat foo.db2
connect to user01
describe indexes for table table_desc
and run that script in the db2 interpreter:
db2 -f foo.db2
A Here Document might work as well:
db2 <<EOF
connect to user01
describe indexes for table table_desc
EOF
I can't test that, though, since I currently don't have DB2 on Linux at hand.

DATASTAGE: how to run more instance jobs in parallel using DSJOB

I have a question.
I want to run multiple instances of the same job in parallel from within a script: I have a loop in which I invoke the jobs with dsjob, without the -wait and -jobstatus options.
I want the jobs to complete before the script terminates, but I don't know how to verify whether a job instance has finished.
I thought of using the wait command, but it is not appropriate here, since the jobs are not child processes of the shell.
Thanks in advance
First, you should make sure the job compile option "Allow Multiple Instance" is selected.
Second:
#!/bin/bash
# Load the DataStage environment (DSHOME, PATH, etc.)
. /home/dsadm/.bash_profile
INVOCATION=(1 2 3 4 5)
cd $DSHOME/bin
for id in "${INVOCATION[@]}"
do
./dsjob -run -mode NORMAL -wait test demo.$id
done
Here, test is the project, demo is the job, and $id is the invocation id. The first two lines of the shell script guarantee that the environment is set up so dsjob can run.
Run the jobs as you say, without the -wait, and then loop around running dsjob -jobinfo and parsing the output for a job status of 1 or 2. When all jobs return such a status, they are all finished.
You might find, though, that you check the status of a job before it actually starts running and pick up an old status. You might be able to fix this by first resetting the job instance and waiting for a status of "Not running" before running the job.
Invoke the jobs in a loop without the -wait or -jobstatus options.
After your loop, check the jobs' status with the dsjob command.
Example: dsjob -jobinfo projectname jobname.invocationid
You can code one more loop for this as well, with a sleep between checks (see the sketch below), and then write your further logic based on the status of the jobs.
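A minimal sketch of such a polling loop, reusing the hypothetical project test and job demo from above; the exact "Job Status" text printed by dsjob -jobinfo can vary between versions, so the grep pattern is an assumption:

#!/bin/bash
PROJECT=test
INVOCATION=(1 2 3 4 5)

# Start all instances without -wait so they run in parallel
for id in "${INVOCATION[@]}"
do
dsjob -run -mode NORMAL $PROJECT demo.$id
done

# Poll each instance until it no longer reports a RUNNING status
for id in "${INVOCATION[@]}"
do
while dsjob -jobinfo $PROJECT demo.$id | grep -q RUNNING
do
sleep 30
done
done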
But it is better to create a job sequence to invoke this multi-instance job simultaneously with different invocation ids.
Create one sequence job if these belong to the same process; otherwise create different sequences, or directly create different scripts to trigger these jobs simultaneously with their invocation ids and schedule them at the same time.
The best option is to create a standard, generalized script where everything is created or gets its value from command-line parameters.
Example: log files named on the basis of jobname + invocation-id.
Then schedule the same script for different parameters or invocations (a sketch follows).
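A minimal sketch of such a generalized wrapper; the project, job, and invocation id come in as command-line parameters, and the log directory is a placeholder:

#!/bin/bash
# Usage: run_job.sh <project> <job> <invocationid>
PROJECT=$1
JOB=$2
INVID=$3
LOGFILE=/var/log/datastage/${JOB}.${INVID}.log   # log named jobname + invocation-id

dsjob -run -mode NORMAL -wait "$PROJECT" "${JOB}.${INVID}" > "$LOGFILE" 2>&1
dsjob -jobinfo "$PROJECT" "${JOB}.${INVID}" >> "$LOGFILE"

Scheduling the same script with different parameters, for example run_job.sh test demo 1 and run_job.sh test demo 2 at the same time in cron, then runs the instances in parallel.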
