Linux script command terminates as soon as it starts - shell

I am using the script command to execute a binary. As soon as the execution starts, it ends the next moment. Please see the logs below:
08:46:01 + script -c '/home/jenkins/workspace/slack-bot-test/s3-storage-e2e-test -kubeconfig /clusters/s3-e2e-test/kube-config-s3-e2e-test.yml -host https://xxx:23530' log.txt
08:46:01 Script started, file is log.txt
08:46:01 Script done, file is log.txt

Related

jobs command result is empty when process is run through script

I need to run rsync in the background through a shell script, and once it has started I need to monitor the status of those jobs from the shell.
The jobs command returns nothing when it is run in the shell after the script exits. ps -ef | grep rsync shows that the rsync is still running.
I could check the status from within the script, but I need to run the script multiple times, each time with a different ip.txt file to push, so I can't keep the script running just to check the jobs' status.
Here is the script:
for i in $(cat "$ip.txt"); do
    rsync -avzh "$directory"/ user@"$i":/cygdrive/c/test/"$directory" > /dev/null 2>&1 &
done
jobs  # shows the jobs' status while still inside the shell script
exit 1
Output of jobs command is empty after the shell script exits:
root@host001:~# jobs
root@host001:~#
What could be the reason, and how can I get the status of the jobs while rsync is running in the background? I can't find anything online about this.
Since your shell (the one from which you run jobs) did not start rsync, it doesn't know anything about it. There are different approaches to fixing that, but it boils down to starting the background processes from your shell. For example, you can start your script with the Bash source builtin instead of executing it in a separate process, as sketched below. Of course, you'd have to remove the exit 1 at the end, because otherwise that exits your shell.
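A minimal sketch of that approach, assuming the loop above is saved as start_rsyncs.sh (a hypothetical name) and the trailing exit 1 has been removed:

source ./start_rsyncs.sh   # or equivalently: . ./start_rsyncs.sh
jobs                       # now lists the background rsync jobs
wait                       # optional: block until they all finish

Because the current shell started the rsync processes itself, its job table knows about them.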

Bash script to run a detached loop that sequentially starts background processes

I am trying to run a series of tests on a remote Linux server to which I am connecting via ssh.
I don't want to have to stay logged in to the ssh session during the runs -> nohup(?)
I don't want to have to keep checking whether one run is done -> for loop(?)
Because of licensing issues, I can only run a single testing process at a time -> sequential
I want to keep working while the test set is being processed -> background
Here's what I tried:
#!/usr/bin/env bash
# Assembling a list of commands to be executed sequentially
TESTRUNS="";
for i in `ls ../testSet/*`;
do
    MSG="running test problem ${i##*/}";
    RUN="mySequentialCommand $i > results/${i##*/} 2> /dev/null;";
    TESTRUNS=$TESTRUNS"echo $MSG; $RUN";
done
#run commands with nohup to be able to log out of ssh session
nohup eval $TESTRUNS &
But it looks like nohup doesn't fare too well with eval.
Any thoughts?
nohup is needed if you want your script to keep running even after the shell is closed. So yes.
And the & is not necessary inside RUN, since you already launch the whole command line with &.
Now, your script builds the command in the for loop but doesn't execute it, which means you'll only have the last file running. If you want to run all of the files, you need to execute the nohup command as part of your loop. BUT you can't run the commands with &, because that would run each command in the background and return to the script, which would then execute the next item in the loop; eventually this would run all files in parallel.
Move the nohup eval $TESTRUNS inside the for loop, but again, you can't run it with &. What you need to do is run the script itself with nohup, and have the script loop through all files one at a time, in the background, even after the shell is closed, as sketched below.
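A minimal sketch of that invocation, assuming the sequential loop is saved as runtests.sh (a hypothetical name; the loop itself would look like the script at the end of the next answer):

nohup ./runtests.sh > runtests.log 2>&1 &   # detaches; survives closing the ssh session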
You could take a look at screen, an alternative to nohup with additional features. I will replace your test script with while [ 1 ]; do printf "."; sleep 5; done for testing the screen solution.
The screen -ls commands are optional, just showing what is going on.
prompt> screen -ls
No Sockets found in /var/run/uscreens/S-notroot.
prompt> screen
prompt> screen -ls
prompt> while [ 1 ]; do printf "."; sleep 5; done
# You don't get a prompt. Use "CTRL-a d" to detach from your current screen
prompt> screen -ls
# do some work
# connect to screen with batch running
prompt> screen -r
# Press ^C to terminate the batch (script printing dots)
prompt> screen -ls
prompt> exit
prompt> screen -ls
Google for screenrc to see how you can customize the interface.
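For instance, a couple of common ~/.screenrc settings (illustrative additions, not from the original answer):

# ~/.screenrc
startup_message off                        # skip the splash screen on start
hardstatus alwayslastline "%H %d/%m %c"    # show host, date and clock in the last line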
You can change your script into something like
#!/usr/bin/env bash
# Run each test problem sequentially
for i in ../testSet/*; do
    echo "Running test problem ${i##*/}"
    mySequentialCommand $i > results/${i##*/} 2> /dev/null
done
The above script can be started with nohup scriptname & when you are not using screen, or simply with scriptname inside a screen session.

How to run "script -a xxx.txt" properly in a shell script?

I have a shell script, and I want the session text to be saved automatically every time the script runs, so I included the command "script -a output.txt" at the beginning of my script. However, the script stops running after this line: it just displays a "bash-3.2$" prompt on the screen and won't go on. Any ideas?
Thanks in advance!
The problem is that script starts a separate sub-shell from the one that is running the actual script. To club them together, use the -c flag of script:
-c, --command command
        Run the command rather than an interactive shell.  This makes
        it easy for a script to capture the output of a program that
        behaves differently when its stdout is not a tty.
Just do,
script -c 'bash yourScript.sh' -a output.txt
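If the logging has to live inside the script itself, as in the question, one possible sketch is to have the script re-exec itself under script exactly once; UNDER_SCRIPT is a hypothetical guard variable, and the flags assume the util-linux version of script:

#!/usr/bin/env bash
# Re-exec this script under `script` once, so the whole session is logged.
if [ -z "$UNDER_SCRIPT" ]; then
    # UNDER_SCRIPT (hypothetical) prevents an endless re-exec loop
    UNDER_SCRIPT=1 exec script -c "bash '$0'" -a output.txt
fi
echo "everything from here on is captured in output.txt"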

Scripts with nohup inside don't exit correctly

We have a script which does some processing and triggers a job in the background using nohup. When we schedule this script from Oracle OEM (or any scheduler), I see the following error and the status shows as failed, but the script actually finished without issue. How do I exit the script correctly when the background job is started with nohup?
Remote operation finished but process did not close its stdout/stderr
file: test.sh
#!/bin/bash
# do some processing
...
nohup ./start.sh 2000 &
# end of the script
By executing start.sh in this manner you are allowing it to claim partial ownership of test.sh's output file descriptors (stdout/stderr). So whereas when most bash scripts exit, their file descriptors are closed for them (by the operating system), test.sh's file descriptors cannot be closed because start.sh still has a claim to them.
The solution is to not let start.sh claim the same output file descriptors as test.sh is using. If you don't care about its output, you can launch it like this:
nohup ./start.sh 2000 1>/dev/null 2>/dev/null &
which tells the new process to send both its stdout and stderr to /dev/null. If you do care about its output, then just capture it somewhere more meaningful:
nohup ./start.sh 2000 1>/path/to/stdout.txt 2>/path/to/stderr.txt &
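Putting it together, a sketch of the corrected test.sh (the log paths are placeholders):

#!/bin/bash
# do some processing
...

# start.sh writes to its own files, so it no longer holds
# test.sh's stdout/stderr open after test.sh exits
nohup ./start.sh 2000 1>/path/to/stdout.txt 2>/path/to/stderr.txt &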

Nohup command to run a script and get control immediately for next script

I have used the nohup command to execute a Unix shell script in the background, but I want to execute the next command immediately, before the previous shell script completes; I do not want to wait until the shell script finishes.
Is there any way? I tried with nohup but I am getting this:
nohup: appending output to `nohup.out'
and not getting control to run the next command. Is there any way to return immediately after calling a shell script, let it run in the background, and execute the next command, without using CTRL+C or a forced shutdown?
I have used the below command:
$ nohup sh dataload.sh &
[1] 14472
$ nohup: appending output to `nohup.out'
Here I am not able to get control to execute the next command.
Just put it in the background: nohup your_command &
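Note that in the transcript above the shell did return ([1] 14472 is the job number and PID); the nohup notice merely printed over the prompt. Redirecting the output explicitly keeps the prompt clean, for example:

nohup sh dataload.sh > dataload.log 2>&1 &   # no notice printed; prompt returns immediately
echo "next command runs right away"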