Ruby net/ssh is not running bash commands in background - ruby

I am having a hard time running bash commands in the background from a Ruby script.
For this question, I am using a simplified example.
This is how the commands work as expected when I run them from PuTTY:
running bash commands in background
(click to see the picture, because Stack Overflow is not allowing me to show pictures yet)
Now, I try to replicate this from Ruby, using this little script shown below:
ruby script
(click to see the picture, because Stack Overflow is not allowing me to show pictures yet)
This is the output I get when I run the Ruby script:
script output
(click to see the picture, because Stack Overflow is not allowing me to show pictures yet)
For your analysis, here is the transcription of the Ruby script:
require 'net/ssh'
net_remote_ip = '74.****122'
ssh_username = 'bots'
ssh_password = 'San*****'
get_ssh_port = '22'
ssh = Net::SSH.start(net_remote_ip, ssh_username, :password => ssh_password, :port => get_ssh_port)
s = "bash --login -c 'sleep 600' &"
print "run (#{s})... "
stdout = ssh.exec!(s)
puts "done (#{stdout.strip})"
ssh.close
exit(0)

You need to redirect both stdout and stderr somewhere other than the terminal; otherwise exec! will hang waiting for the process output.
This is the simplest solution I found:
ssh.exec!("sleep 600 &> /dev/null &")
&> redirects both stdout and stderr at the same time.
If you want to redirect the output for logging, you can do so separately if you like:
ssh.exec!("command 2> error.log > output.log &")

Related

Mix input/output with Ruby IO?

I am hoping to write a small method that can interact with a subprocess (bash in this case); it should be able both to write commands and to have those commands print their output back to my shell when running the Ruby file.
So far, I can do something similar with this code:
require 'io/console'

@shell = IO.popen('/bin/bash', 'w')

def run(command)
  puts command
  @shell.puts command
  puts 'Done'
end

run 'var=3'
run 'echo $var'
run 'sleep 2'
run 'ls docs'

@shell.close
And then when I run this code, all of the Ruby output is printed first, and only later does any of the shell output get printed:
var=3
Done
echo $var
Done
sleep 2
Done
ls docs
Done
3
<ls output>
I was trying to read some of the tests for io/console as I'm almost certain there exists a really straightforward way to interact with a subprocess like this and get the output inline with the commands being run:
https://github.com/ruby/io-console/blob/master/test/io/console/test_io_console.rb
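For what it's worth, one way to get the output interleaved with the commands is to open the pipe read/write and read until a sentinel after each command. This is only a sketch, not from the original post, and the __DONE__ sentinel is an arbitrary marker chosen for illustration:
@shell = IO.popen('/bin/bash', 'r+')

def run(command)
  puts command
  @shell.puts command
  @shell.puts 'echo __DONE__'        # sentinel marks the end of this command's output
  while (line = @shell.gets)
    break if line.strip == '__DONE__'
    puts line
  end
  puts 'Done'
end

run 'var=3'
run 'echo $var'
run 'ls docs'

@shell.close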

bash hangs when exec > > is called and an additional bash script is executed with output to stdin [duplicate]

I have a shell script which writes all output to a logfile
and the terminal. This part works fine, but when I execute the script,
a new shell prompt only appears if I press Enter. Why is that, and how do I fix it?
#!/bin/bash
exec > >(tee logfile)
echo "output"
First, when I'm testing this, there always is a new shell prompt; it's just that sometimes the string "output" comes after it, so the prompt isn't last. Did you happen to overlook it? If so, there seems to be a race where the shell prints the prompt before the backgrounded tee finishes writing.
Unfortunately, that cannot be fixed by waiting in the shell for tee; see this question on unix.stackexchange. Fragile workarounds aside, the easiest way I see to solve this is to put your whole script inside a list:
{
your-code-here
} | tee logfile
If I run the following script (suppressing the newline from the echo), I see the prompt, but not "output". The string is still written to the file.
#!/bin/bash
exec > >(tee logfile)
echo -n "output"
What I suspect is this: you have three different file descriptors trying to write to the same file (that is, the terminal): standard output of the shell, standard error of the shell, and the standard output of tee. The shell writes synchronously: first the echo to standard output, then the prompt to standard error, so the terminal is able to sequence them correctly. However, the third file descriptor is written to asynchronously by tee, so there is a race condition. I don't quite understand how my modification affects the race, but it appears to upset some balance, allowing the prompt to be written at a different time and appear on the screen. (I expect output buffering to play a part in this).
You might also try running your script after running the script command, which will log everything written to the terminal; if you wade through all the control characters in the file, you may notice the prompt in the file just prior to the output written by tee. In support of my race condition theory, I'll note that after running the script a few times, it was no longer displaying "abnormal" behavior; my shell prompt was displayed as expected after the string "output", so there is definitely some non-deterministic element to this situation.
@chepner's answer provides great background information.
Here's a workaround - works on Ubuntu 12.04 (Linux 3.2.0) and on OS X 10.9.1:
#!/bin/bash
exec > >(tee logfile)
echo "output"
# WORKAROUND - place LAST in your script.
# Execute an executable (as opposed to a builtin) that outputs *something*
# to make the prompt reappear normally.
# In this case we use the printf *executable* to output an *empty string*.
# Use of `$ec` is to ensure that the script's actual exit code is passed through.
ec=$?; $(which printf) ''; exit $ec
Alternatives:
@user2719058's answer shows a simple alternative: wrapping the entire script body in a group command ({ ... }) and piping it to tee logfile.
An external solution, as #chepner has already hinted at, is to use the script utility to create a "transcript" of your script's output in addition to displaying it:
script -qc yourScript /dev/null > logfile # Linux syntax
This, however, will also capture stderr output; if you wanted to avoid that, use:
script -qc 'yourScript 2>/dev/null' /dev/null > logfile
Note, however, that this will suppress stderr output altogether.
As others have noted, it's not that there's no prompt printed -- it's that the last of the output written by tee can come after the prompt, making the prompt no longer visible.
If you have bash 4.4 or newer, you can wait for your tee process to exit, like so:
#!/usr/bin/env bash
case $BASH_VERSION in ''|[0-3].*|4.[0-3]) echo "ERROR: Bash 4.4+ needed" >&2; exit 1;; esac
exec {orig_stdout}>&1 {orig_stderr}>&2        # make a backup of the original stdout and stderr
exec > >(tee -a "_install_log"); tee_pid=$!   # track the PID of tee after starting it
cleanup() {                     # define a function we'll call during shutdown
  retval=$?
  exec >&$orig_stdout           # copy the original stdout back to FD 1, overwriting the pipe to tee
  exec 2>&$orig_stderr          # if something redirected stderr to also go through tee, fix that too
  wait "$tee_pid"               # now, wait until tee exits
  exit "$retval"                # and complete the exit with our original exit status
}
trap cleanup EXIT # configure the function above to be called during cleanup
echo "Writing something to stdout here"

Unix output redirection

I have a script which prompts the user to select options like 'y' or 'n'.
If 'y' is selected, the script proceeds with further execution, and if 'n' is selected it stops.
I want the output of this script to be redirected to a log file, so I used the command below:
./script stop >> script_RUN.log 2>&1
The problem is that the script starts running but does not prompt for the 'y' or 'n' options; the prompt is written to script_RUN.log instead.
How can I make the script prompt the user for options and redirect the rest of its output to script_RUN.log?
You can try using the tee command instead:
./script stop | tee script_RUN.log
NOTE:
Only the output of the program will be saved.
EDIT:
If you don't want to see the output on the console at all, just redirect it to /dev/null.
for example:
./script stop | tee script_RUN.log > /dev/null
The line above will write the output to the log file but will NOT print it on the console.
This works as it has to, really: you are redirecting stdout and stderr from the very start. Instead, you should redirect them inside the script, after the prompt. I think this would be helpful for you:
redirect COPY of stdout to log file from within bash script itself

How to Quit TFTP script

I have a tftp script here that, when run, just hangs and leaves me at a blank line (which tells me it's hanging). I can quit the script with Ctrl+C...
#!/bin/bash
hostname=$1;
filename=$2;
tftp <</dev/null
mode binary
get $hostname:$filename
quit
I have also tried to add EOF at the end of the script, but that doesn't work either.
Here is my command line...
$ ./tftpShell.sh host1 myFileName >/home/aayerd200/tftpoutput.txt 2>/home/aayerd200/tftperror.log
So when I run the script, it just leaves me on a blank line. However, it does actually do the work it should with get; I do get the file I want.
Of course host1 and myFileName are actual fields that I replaced here for security.
How can I stop this script? I believe it is just tftp hanging; it shows up under $ ps -u aayerd200, or under $ ps -u daemon when run by PHP.
You have /dev/null as the here-document "delimiter". Try some set of characters, like EOF, that has no special meaning to the shell, and terminate the here-doc:
tftp <<-EOF
mode binary
get $hostname:$filename
quit
EOF
Okay, so I just made this a background process by appending & to the end of the command. Then I ran $ echo $! to get the PID, and then $ kill PID.
That was my solution to this, for now at least.

Ruby system command is giving me an output

This Ruby system call gives me output in irb:
system("sudo airodump-ng -w sidney wlan0")
Airodump-ng is from the Aircrack-ng package.
However, the Ruby system call should not give me any stdout.
The thing is that an "sh" process is created, which doesn't produce output itself. But the "sh" process has a child process, which produces output that I do not want displayed on my terminal at all.
Second part of the question: how can I get the PID of that sub-process, using threads and maybe a different way to call a shell command from Ruby (without displaying the output of that child process)?
If you don't care about the output, trash it:
system("sudo airodump-ng -w sidney wlan0 >/dev/null 2>&1")
I think the child process will inherit the parent's file descriptors.
Use
out = `sudo airodump-ng -w sidney wlan0`
instead; the output will not show on screen, but will be stored in out.
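If you also need the PID of the child process (the second part of the question), the standard library's Open3 is one option. This is only a sketch, not from the answers above; passing the command as separate arguments avoids the intermediate "sh" process, so the reported PID belongs to sudo itself rather than to a wrapping shell:
require 'open3'

# Sketch: capture (and here discard) both stdout and stderr while keeping the PID.
Open3.popen2e('sudo', 'airodump-ng', '-w', 'sidney', 'wlan0') do |stdin, out_err, wait_thr|
  puts "child pid: #{wait_thr.pid}"   # PID of sudo, not of an "sh" wrapper
  out_err.each_line { |_line| }       # consume output so the child never blocks; log it here if needed
  wait_thr.value                      # wait for the process and get its exit status
end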
