Ruby system command is giving me output

This Ruby system call gives me output when run in irb:
system("sudo airodump-ng -w sidney wlan0")
Airodump-ng is from the Aircrack-ng package.
However, the Ruby system call shouldn't be giving me anything on stdout.
The thing is, a sh process is created, and that process itself has no output; but the sh process has a child process whose output I don't want displayed on my terminal at all.
Second part of the question: how can I get the PID of that child process, using threads or perhaps a different way of calling a shell command from Ruby (while still not displaying the child process's output)?

If you don't care about the output, trash it:
system("sudo airodump-ng -w sidney wlan0 >/dev/null 2>&1")
I think the child process will inherit the parent's file descriptors.
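If you also need the PID of the process you start (the second part of the question), here is a minimal sketch using Kernel#spawn, assuming Ruby 1.9+; it discards the output and returns the PID without blocking:
pid = spawn("sudo airodump-ng -w sidney wlan0",
            :out => "/dev/null", :err => "/dev/null")
# pid is the PID of the spawned command (sudo here); airodump-ng runs as its child
puts "started airodump-ng as PID #{pid}"
Process.wait(pid)   # reap the process once the capture is stopped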

Use
out = `sudo airodump-ng -w sidney wlan0`
instead; the output won't be shown on screen but will be stored in out. (Note that backticks capture only stdout; anything the command writes to stderr will still reach the terminal.)
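If you want both the captured output and the PID of the subprocess, a sketch using the standard Open3 library (only the command itself comes from the question; the rest is illustrative):
require 'open3'

# popen2e merges the command's stdout and stderr into one stream,
# so nothing is printed to the terminal.
Open3.popen2e("sudo airodump-ng -w sidney wlan0") do |stdin, out_and_err, wait_thr|
  pid = wait_thr.pid          # PID of the spawned process
  output = out_and_err.read   # blocks until the command exits
  status = wait_thr.value     # Process::Status once it finishes
end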

Related

Send echo command to an external xTerm

I have a bash script, and I want to be able to keep a log in an xterm, and be able to send echo to it anytime.
How would I do this?
Check the GPG_TTY variable in your xterm session. It should have a value similar to
GPG_TTY=/dev/pts/2
This method should be available for terminals that support GNU Pinentry.
Another option to determine the current terminal name is to use
readlink /proc/self/fd/0
The last method applies only to Linux.
Now, if your bash script runs a command like
echo "Hello, world!" > /dev/pts/2
This line should appear on the xterm screen.
I managed to make a console by running an xterm with a while loop that clears the screen, reads the contents of the log file, pauses for a second, then loops again. Here was the command:
xterm -T Console -e "while true; do clear && cat ${0}-LOG.txt && sleep 1; done"
Then to send something to the console:
echo -e "\e[91;1mTest" >> ${0}-LOG.txt
And the console will update each second.

How to Quit TFTP script

I have a tftp script here that, when run, just hangs at a blank line (which tells me it's hanging). I can quit the script with Ctrl+C...
#!/bin/bash
hostname=$1;
filename=$2;
tftp <</dev/null
mode binary
get $hostname:$filename
quit
I have also tried to add EOF at the end of the script, but that doesn't work either.
Here is my command line...
$ ./tftpShell.sh host1 myFileName >/home/aayerd200/tftpoutput.txt 2>/home/aayerd200/tftperror.log
So when I run the script, it just leaves me on a blank line. However, it does actually do the work it should with get, I do get the file I want.
Of course host1 and myFileName are actual fields that I replaced here for security.
How can I stop this script? Based on $ ps -u aayerd200 (or $ ps -u daemon when the script is run by PHP), I believe it is just tftp that is hanging.
You have /dev/null as the here-document "delimiter". Use a token such as EOF that has no special meaning to the shell, and terminate the here-doc with it:
tftp <<-EOF
mode binary
get $hostname:$filename
quit
EOF
Okay so I just made this a background process by appending & to the end of the command. Then I ran $ echo $! for the PID. Then I ran $ kill PID.
That was my solution to this, for now at least.

tee to a log within a bash script, while preserving stdout as a TTY

Similar to "redirect COPY of stdout to log file from within bash script itself", but I'd also like to preserve stdout as a TTY device.
For example, I have the following scripts:
/tmp/teed-off$ cat some-script
#!/usr/bin/env ruby
if $stdout.tty?
  puts "stdout is a TTY"
else
  puts "stdout is NOT a TTY"
end
/tmp/teed-off$ cat wrapper
#!/usr/bin/env bash
exec > >(tee some-script.log)
./some-script
When I run them, the redirection in the wrapper means stdout is no longer a TTY device:
/tmp/teed-off$ ./some-script
stdout is a TTY
/tmp/teed-off$ ./wrapper
stdout is NOT a TTY
How can I flip that behavior around so that the script believes it's writing to a TTY even when executed via the wrapper?
It won't be trivial, but I think you can do it via pseudo-ttys. I'm not sure that there's any standard tool, other than perhaps expect, that would do it for you.
It takes a bit of thinking about. You'd have a control program that would open the pseudo-tty master, then the slave. The slave would be connected to the output of ./some-script. The master would be read by the control program, which would copy the data it reads from the master to the file and to standard output.
I've not tried coding that up. I'm not sure whether you could do it with standard shell commands; I can't think of any way. So, I think there will be some C coding to be done.
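For what it's worth, here is a rough sketch of that control-program idea using Ruby's standard PTY library instead of C (the command and log file name are just the ones from the example above):
require 'pty'

# Run the script on the slave side of a pseudo-tty so it sees a TTY on stdout,
# then copy everything it writes to both the real stdout and a log file.
PTY.spawn("./some-script") do |reader, writer, pid|
  File.open("some-script.log", "w") do |log|
    begin
      reader.each_line do |line|
        $stdout.write(line)
        log.write(line)
      end
    rescue Errno::EIO
      # On Linux this is raised when the child closes its side of the pty.
    end
    Process.wait(pid)
  end
end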
Look for dup2; it duplicates a file descriptor:
int dup2(int oldfd, int newfd);

Echoing 'at' command in terminal fails

The following should print "hello" (or some reminder) on my Linux command line at 9:00 AM today:
$ at 9:00AM
warning: commands will be executed using /bin/sh
at> echo "hello"
at> <EOT>
However, at the specified time, nothing happens.
I have an empty /etc/at.deny and no /etc/at.allow file, so there shouldn't be any problems with permissions to use the command. Also, writing to a file at 9:00 AM works:
$ at 9:00AM
at> echo "hello" > /home/mart/hello.txt
at> <EOT>
$ cat /home/mart/hello.txt
hello
All jobs are shown as scheduled; I just can't get any output to the terminal window (I'm on Crunchbang Linux with Terminator). Why? Do I need to somehow specify the window for that output?
Thanks for any help!
at runs commands from a daemon (atd), which doesn't have access to your terminal. Thus, output from the script isn't going to appear in your terminal (unless you pipe to the right tty in your command).
Instead, it does as man at says:
The user will be mailed standard error and standard output from his commands, if any.
You may be able to access these reports using mail if your machine is suitably configured.
If you want to have at write to your terminal, you can try piping the output to write, which writes a message to a user's TTY, or to wall if you want to write to every terminal connected to the system.
Okay, nneonneo's explanations led me to using wall, which sends a message to all users. So setting oneself reminders in a terminal window can be done like this:
$ at 9:00AM
warning: commands will be executed using /bin/sh
at> echo "hello" | wall
at> <EOT>

Run a process and get the result in clipboard (or kill ring) with elisp/emacs

I use the following code to run "ls -l ./" and get the result in the scratch buffer.
(start-process "my-process" "*scratch*" "ls" "-l" "./")
How can I get the result in clipboard or something (kill ring or whatever) so that I can easily copy the result whenever necessary?
You can adjust this to your liking:
(kill-new (shell-command-to-string "ls -l ."))
The call to kill-new will put the string from shell-command-to-string on the kill ring.
shell-command (bound to M-!) runs a shell command and puts its output in *Shell Command Output*. Given a prefix argument (e.g. M-1 M-!), it will put the results in the current buffer.
A little more information is available on the ExecuteExternalCommand page of the Emacs wiki.
