How to mark "executed command" for the Custom Script Extension in a shell script

How do I mark a configured shell script as "command executed" in the Azure VM Custom Script Extension (CSE)?
The CSE runs the configured shell commands, but the script never finishes because one of my commands starts a small HTTP-listener-style server and keeps running.
If I add "exit 0", will that mark the CSE command as executed, so that the PowerShell script from which I invoke this CSE can continue?

If my understanding is right, you could run the HTTP listener in the background and exit your script. Just use:
nohup <command> &
From man nohup: "nohup - run a command immune to hangups, with output to a non-tty". Appending & to the command line runs it in the background.
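For example, a minimal sketch of a CSE entry script; /opt/app/my_listener.sh is a hypothetical stand-in for whatever command serves HTTP:
#!/bin/bash
# Start the listener detached from this script's terminal and lifetime.
nohup /opt/app/my_listener.sh > /var/log/my_listener.log 2>&1 &
echo "listener started with PID $!"
exit 0  # a zero exit code lets the CSE mark the command as executed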

Related

jobs command result is empty when process is run through script

I need to run rsync in the background through a shell script, and once it has started, I need to monitor its status from the shell.
The jobs command returns nothing when run in the shell after the script exits, even though ps -ef | grep rsync shows that rsync is still running.
I could check the status from within the script, but I need to run the script multiple times, each using a different ip.txt file to push, so I can't keep the script alive just to check the job status.
Here is the script:
for i in `cat $ip.txt`; do
    rsync -avzh $directory/ user@"$i":/cygdrive/c/test/$directory > /dev/null 2>&1 &
done
jobs  # shows the job status while still inside the script
exit 1
Output of jobs command is empty after the shell script exits:
root@host001:~# jobs
root@host001:~#
What could be the reason, and how can I get the status of the jobs while rsync is running in the background? I can't find anything online about this.
Since your shell (the one from which you execute jobs) did not start rsync, it doesn't know anything about it. There are different approaches to fixing that, but it boils down to starting the background process from your shell. For example, you can start the script you have using the source BASH command instead of executing it in a separate process. Of course, you'd have to remove the exit 1 at the end, because that exits your shell otherwise.
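For example, assuming your script is saved as push.sh (a hypothetical name):
. ./push.sh   # or: source ./push.sh -- runs the script in the current shell
jobs          # the rsync jobs now appear in this shell's job table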

Running shell script commands sequentially in Jenkins

In Jenkins, I have created a job which runs many shell script commands:
command1
command2
...etc
command1 is an ssh command that calls a shell script on another server. I have to wait until it finishes, and only AFTER that should command2 run.
So, how can I make sure that the script on the other machine, started by command1, has finished its work before the next command (command2) in the Jenkins job starts?
Or, alternatively, how can I make sure that command2 won't start until the shell script on the other machine (started by command1) has finished?
You can check out "How to send many commands to shell and wait for the command behind ends" in order to chain commands and wait for their completion.
When you execute a command through an ssh session, you might have to wrap that command in a script able to loop/wait for the command completion.
See an example in "How can I make ssh wait until the command exits?".
Or (a simpler wrapper): How do I know when a command run over ssh has finished?
#!/bin/bash
# Run whatever command line was passed to this wrapper...
"$@"
# ...then print a sentinel so callers can see where its output ends.
echo "==== Command Output Finished ===="
Look for the string ==== Command Output Finished ==== in your I/O routines to determine where the boundaries between command outputs are.
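As a usage sketch, assuming the wrapper above is saved as wrapper.sh on the remote machine:
ssh user@server './wrapper.sh /path/to/command1 arg1'  # ssh blocks until command1 exits
command2  # only starts after ssh returns; the sentinel marks the boundary in streamed output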
Or you can try isolating those commands in their own Jenkins shell build step.
(Not a different job, just a different build step within the same job)

shell script adjustment - not working correctly with cron

I use this bash script to check whether a service is running. If it is running, the script exits; otherwise it runs another script that executes some commands and exits when done.
My issue is that the script works fine when I run it manually, but when I run it from cron it does not execute correctly. Here is my script:
#!/bin/sh
SERVICE='loop2.sh'
if ps ax | grep -v grep | grep "$SERVICE" > /dev/null
then
    echo "$SERVICE service running, everything is fine"
else
    /home//www/loop2.sh
fi
What adjustments does my script need to work correctly under cron?
You're not being very specific. What error are you seeing?
Note that processes run under cron with a cut-down environment. In particular, environment variables such as PATH will be much reduced from your interactive shell.
Log your script's stdout/stderr, e.g. myscript >/tmp/script.log 2>&1
Check that your environment is as expected via the env command in your script.
Does this script really do what you want, and interact with cron how you wish? If your service isn't running you spawn a new one, but I'd expect you to put it in the background, thus making it a daemon and not a (grand)child of the cron process.
Is your script executable by whichever user it's executed by under cron?
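Putting those points together, a sketch of a cron-friendly version of the script (the PATH value and log path are assumptions):
#!/bin/sh
# cron provides a minimal environment, so set PATH explicitly
PATH=/usr/sbin:/usr/bin:/sbin:/bin
export PATH

# capture all output so failed cron runs can be diagnosed
exec > /tmp/service-check.log 2>&1

SERVICE='loop2.sh'
if ps ax | grep -v grep | grep "$SERVICE" > /dev/null
then
    echo "$SERVICE service running, everything is fine"
else
    # start it in the background so it isn't tied to the cron child
    /home//www/loop2.sh &
fi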

Avoid interactive mode in shell script

There is an interactive shell console; I can get into it, run a specific set of commands inside the console, and exit.
Now I want to write a bash script that connects to the interactive shell console, runs my commands silently, and exits at the end without any interaction. In other words, I want everything automated in a non-interactive way. Any ideas how I can achieve this?
I am trying something like the following (say blabla shell is the interactive console here), but it always brings me back to interactive mode :(
/usr/bin/blabla shell << EOF
do A,
do B,
do C
quit
EOF
A longer, more specific version of this question can be found here:
Configure flume in shell/bash script - avoid interactive flume shell console
Closing stdin should do the trick:
exec <&-
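A sketch of what that might look like, assuming the console quits cleanly when it cannot read from stdin:
exec <&-              # close stdin for the current shell
/usr/bin/blabla shell # with stdin closed, the console cannot prompt for input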
The expect command is your friend. It can emulate interactive communication with other commands, even in very sophisticated ways.
From man expect:
Expect is a program that "talks" to other interactive programs according to a script.
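A minimal expect sketch, assuming the console prints a "> " prompt (adjust the patterns to whatever blabla shell actually prints):
#!/bin/bash
expect <<'EOF'
spawn /usr/bin/blabla shell
expect "> "
send "do A\r"
expect "> "
send "do B\r"
expect "> "
send "quit\r"
expect eof
EOF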
You can try putting the commands you would input in the interactive prompt into a file, then run the command like:
command < file
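For example (commands.txt is a hypothetical file holding the lines you would have typed):
printf 'do A\ndo B\ndo C\nquit\n' > commands.txt
/usr/bin/blabla shell < commands.txt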
Maybe the Secure SHell (ssh) does what you need. It requires that the "remote" machine is configured as an SSH server. I use it regularly to run commands on other hosts, such as:
ssh user@host command

How do I launch a program inside a shell script and have the shell script continue, even though the program remains open

I am using bash on Ubuntu. I would like to have a shell script open a program and continue on to the next line of the shell script, even though the program has not terminated.
Adding an & to a command places it in the background.
Example:
/path/to/foo
/path/to/bar   # not executed until foo is done
/path/to/foo & # in background
/path/to/bar & # executes as soon as foo is started
Read more about job control in the bash documentation.
Use something like this: (my-long-running-process &). The parentheses run it in a subshell, launching it as a separate process in the background.
You must run the process in the background, but you must enable job control first. Otherwise, you cannot kill the process or bring it to the foreground if desired.
To enable job-control, execute:
set -m
To run some task in the background, execute:
task &
To manipulate the background task, use the jobspec syntax (%[n]). For example, to kill the last launched process, execute:
kill %
Note that enabling job-control is required only if you're actually running a script (as stated in the question). If running interactively, job-control is already enabled by default.
The manpage for bash has much more information in the JOB CONTROL section.
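Putting it together, a minimal sketch (sleep 60 stands in for your real program):
#!/bin/bash
set -m           # enable job control (off by default in scripts)

sleep 60 &       # stand-in for your long-running program
jobs             # e.g. [1]+  Running    sleep 60 &

echo "script continues while the job runs..."

kill %1          # stop the background job via its jobspec
wait             # reap it so the script exits cleanly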
http://ubuntuforums.org/showthread.php?t=1657602
It looks like all you have to do is add an & at the end of the line.
