Avoid interactive mode in shell script - shell

There is an interactive shell console; I can get into it, run a specific set of commands inside the console, and exit from it.
Now I want to write a bash script that connects to the interactive shell console, runs my commands silently, and exits at the end without any interaction. In other words, I want everything automated in a non-interactive way. Any ideas how I can achieve this?
I am trying something like the following (say blabla shell is the interactive console here), but it always brings me into interactive mode :(
/usr/bin/blabla shell << EOF
do A,
do B,
do C
quit
EOF
A longer, more specific version of this question can be found here:
Configure flume in shell/bash script - avoid interactive flume shell console

Closing stdin should do the trick:
exec <&-
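For example, combined with the here-document from the question, this might look like the following sketch (blabla shell and the do A/B/C commands are still the question's placeholders):
#!/bin/bash
# close the script's own stdin so nothing can fall back to reading the terminal
exec <&-
# the here-document still supplies stdin for this one command
/usr/bin/blabla shell << EOF
do A
do B
do C
quit
EOF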

The expect command is your friend. It can emulate interactive communication with other commands, even in very sophisticated ways.
From man expect:
Expect is a program that "talks" to other interactive programs according to a script.
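A minimal sketch of what that could look like for the console from the question, wrapped in a shell here-document; "blabla>" is an assumed prompt and must be adjusted to whatever the real console prints:
/usr/bin/expect << 'EOF'
# "blabla>" is an assumed prompt string; adjust it to the real one
spawn /usr/bin/blabla shell
expect "blabla>"
send "do A\r"
expect "blabla>"
send "do B\r"
expect "blabla>"
send "quit\r"
expect eof
EOF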

You can try putting the commands you would type at the interactive prompt into a file, then run the command like:
command < file
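A sketch of that approach with the placeholder console from the question (commands.txt is just an example filename):
# commands.txt is an example filename; the commands are the question's placeholders
cat > commands.txt << 'EOF'
do A
do B
do C
quit
EOF
/usr/bin/blabla shell < commands.txt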

Maybe the Secure SHell, ssh, does what you need. It requires that the "remote" machine is configured as an SSH server. I use it regularly to run commands on other hosts, such as
ssh user@host command
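If you need to run more than one command, ssh also accepts a command string or a block of commands on its standard input; a sketch (command1 and command2 are placeholders):
# pass several commands as one string
ssh user@host 'command1; command2'
# or feed a whole block of commands on stdin
ssh user@host << 'EOF'
command1
command2
EOF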

Related

Running a script in PowerBroker

I'm trying to script my commands that are run inside the pbrun shell. I've tried executing through a normal script, but that doesn't work because, to my understanding, pbrun is executed in its own subshell, making it hard, if not impossible, to pass commands to.
The only solution I'm thinking might work is to have an input/output text processor that listens to the terminal and responds accordingly.
I was able to send commands to the standard input of pbrun:
echo 'echo $HOSTNAME' | pbrun bash
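If several commands are needed, the same idea extends to a here-document; a sketch (the second command is a placeholder):
pbrun bash << 'EOF'
echo $HOSTNAME
echo "more commands here"    # placeholder for further commands
EOF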

How to run shell script on VM indefinitely?

I have a VM that I want running indefinitely. The server is always running but I want the script to keep running after I log out. How would I go about doing so? Creating a cron job?
In general the following steps are sufficient to convince most Unix shells that the process you're launching should not depend on the continued existence of the shell:
run the command under nohup
run the command in the background
redirect all file descriptors that normally point to the terminal to other locations
So, if you want to run command-name, you should do it like so:
nohup command-name >/dev/null 2>/dev/null </dev/null &
This tells the process that will execute command-name to send all stdout and stderr to nowhere (instead of to your terminal) and also to read stdin from nowhere (instead of from your terminal). Of course if you actually have locations to write to/read from, you can certainly use those instead -- anything except the terminal is fine:
nohup command-name >outputFile 2>errorFile <inputFile &
See also the answer in Petur's comment, which discusses this issue a fair bit.

How to prevent PuTTY shell from auto-exit after executing command from batch file in Windows?

I have written a batch file like this:
Start putty.exe -ssh 172.17.0.52 -l root -m dummy.txt
Then in dummy.txt I have written these commands:
avahi-daemon --no-drop-root -D
export XVHMI_USERCONFIG_PATH=/home/UserProfileConfig
export XDG_RUNTIME_DIR=/tmp
cd /opt/bosch/airis/bin
When I run the .bat file, PuTTY starts, the commands execute (hopefully; I'm not sure), and it exits.
How to keep that window open?
I have googled for this, but found no solid help. I read on Stack Overflow itself that we need to define something in the txt file, but what, and most importantly, how?
The SSH session closes (and PuTTY with it) as soon as the command finishes. Normally the "command" is a shell. As you have overridden this default "command" and yet want to run the shell nevertheless, you have to explicitly execute the shell yourself:
avahi-daemon ... ; /bin/bash
Also, as use of the -m switch implies a non-interactive terminal, you probably want to force an interactive terminal back using the -t switch.
Though, I'm not really sure if you want to execute a shell or if you just want to see your command output. If the latter, did you consider using plink? It's a console terminal client from the PuTTY package. Being a console application, it inherits the console of the parent batch file, and you can keep the batch console from closing using the pause command, if needed.
Another option (both for PuTTY and plink) is to pause on the remote side, e.g. using the read command.
avahi-daemon ... ; read
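A sketch of the plink variant in the batch file, reusing the host, user, and command file from the question; pause keeps the console window open afterward:
rem sketch only: plink runs the commands from dummy.txt and prints their output here
plink.exe -ssh 172.17.0.52 -l root -m dummy.txt
pause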
As suggested by Martin, I tried these steps:
putty.exe -ssh 172.17.0.52 -l root -m dummy.txt -t
added /bin/bash at the end of the commands in dummy.txt
It worked for me. Please note that you have to follow both of the steps mentioned above.
This way you can keep the session alive and can manually execute further commands.
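For clarity, dummy.txt then contains the original commands from the question with the shell appended at the end:
avahi-daemon --no-drop-root -D
export XVHMI_USERCONFIG_PATH=/home/UserProfileConfig
export XDG_RUNTIME_DIR=/tmp
cd /opt/bosch/airis/bin
/bin/bash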

How to ssh into a shell and run a script and leave myself at the prompt

I am using Elastic MapReduce from Amazon. I am sshing into the Hadoop master node and executing a script like:
$EMR_BIN/elastic-mapreduce --jobflow $JOBFLOW --ssh < hivescript.sh. It sshes me into the master node and runs the hive script. The hive script contains the following lines:
hive
add jar joda-time-1.6.jar;
add jar EmrHiveUtils-1.2.jar;
and some commands to create hive tables. The script runs fine and creates the hive tables and everything else, but then comes back to the prompt from where I ran the script. How do I leave it sshed into the Hadoop master node at the hive prompt?
Consider using Expect; then you could do something along these lines and interact at the end:
/usr/bin/expect <<EOF
spawn ssh ... YourHost
expect "password"
send "password\n"
send javastuff
interact
EOF
These are the most common answers I've seen (with the drawbacks I ran into with them):
Use expect
This is probably the most well rounded solution for most people
I cannot control whether expect is installed in my target environments
Just to try this out anyway, I put together a simple expect script to ssh to a remote machine, send a simple command, and turn control over to the user. There was a long delay before the prompt showed up, and after fiddling with it with little success I decided to move on for the time being.
Eventually I came back to this as the final solution after realizing I had violated one of the 3 virtues of a good programmer -- false impatience.
Use screen / tmux to start the shell, then inject commands from an external process.
This works ok, but if the terminal window dies it leaves a screen/tmux instance hanging around. I could certainly try to come up with a way to just re-attach to prior instances or kill them; screen (and probably tmux) can make it die instead of auto-detaching, but I didn't fiddle with it.
If using gnome-terminal, use its -x or --command flag (I'm guessing xterm and others have similar options)
I'll go into more detail on problems I had with this on #4
Make a bash script with #!/bin/bash --init-file as the shebang; this will cause your script to execute, then leave an interactive shell running afterward (a sketch appears at the end of this answer)
This and #3 had issues with some programs that required user interaction before the shell is presented to them. Some programs (like ssh) it worked fine with, others (telnet, vxsim) presented a prompt but no text was passed along to the program; only ctrl characters like ^C.
Do something like this: xterm -e 'commands; here; exec bash'. This will cause it to create an interactive shell after your commands execute.
This is fine as long as the user doesn't attempt to interrupt with ^C before the last command executes.
Currently, the only thing I've found that gives me the behavior I need is to use cmdtool from the OpenWin project.
/usr/openwin/bin/cmdtool -I 'commands; here'
# or
/usr/openwin/bin/cmdtool -I 'commands; here' /bin/bash --norc
The resulting terminal injects the list of commands passed with -I into the program executed (no arguments means the default shell), so those commands show up in that shell's history.
What I don't like is that the terminal cmdtool provides feels so clunky ... but alas.
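For reference, a minimal sketch of approach #4 from the list above (the setup commands are placeholders):
#!/bin/bash --init-file
# bash treats this whole file as its init file: the commands below run first,
# then an interactive shell remains at the prompt
export SOME_VAR=value                                 # placeholder setup command
echo "setup done, dropping to an interactive shell"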

Script: SSH command execute and leave shell open, pipe output to file

I would like to execute an ssh command and pipe the output to a file.
In general I would do:
ssh user@ip "command" >> /myfile
The problem is that ssh closes the connection once the command is executed; however, my command sends its output to the ssh channel via another program in the background, therefore I am not receiving that output.
How can I tell ssh to leave my shell open?
My understanding is that command starts some background process that will perhaps write some output to the terminal later. If command terminates before that, the ssh session will be terminated and there will be no terminal for the background program to write to.
One simple and naive solution is to just sleep long enough:
ssh user@ip "command; sleep 30m" >> /myfile
A better solution than sleep would be to wait for the background process(es) to finish in some more intelligent way, but that is impossible to say without further details.
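For example, if the name of the background program is known, the remote side could poll until it exits instead of sleeping blindly; a sketch, where someprog is a placeholder for that program's name:
# someprog is a placeholder for the background program started by command
ssh user@ip 'command; while pgrep -x someprog > /dev/null; do sleep 10; done' >> /myfile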
Something more powerful than bash would be Python with Paramiko and PyExpect.
