Redirect ssh output to file while performing other commands - bash

I'm looking for some help with a script of mine. I'm new to bash scripting, and I'm trying to start a service on a remote host over ssh and capture all of that service's output to a file on my local host. The problem is that I also want to execute other commands after this one:
ssh $remotehost "./server $port" > logFile &
ssh $remotehost "nc -q 2 localhost $port < $payload"
Now, the first command starts an HTTP server that simply prints out any request it receives, while the second command sends a request to that server.
Normally, if I executed the two commands in two separate shells, I would get the first one's output on the terminal, but now I need it in the file.
I would like the server to write all the requests to the log file, keeping a sort of open ssh connection so that any new output from the server process is captured.
I hope I made myself clear.
Thank you for your help!
EDIT: Here's the output of the first command:
(Output is empty in the terminal... it waits for requests).
As you can see, the command doesn't return anything yet; it just waits.
When I execute the second command on a new terminal (the request), the output of the first terminal is the following:
The request is displayed.
Now I would like to execute both commands in sequence in a bash script, sending the output of the first terminal (which is empty until the second command is run) to a file, so that ANY output triggered by later requests ends up in that file.
EDIT2: As of now, with the commands above, the server answers any requests but the output is not registered in the log file.
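A likely explanation for EDIT2 (an assumption, since the server's code isn't shown): many programs line-buffer stdout only when it is a terminal and switch to block buffering when it is redirected to a file, so the requests sit in a buffer instead of reaching logFile. A minimal sketch of a workaround, assuming stdbuf is available on the remote host:
# Force line-buffered stdout on the remote side so each request is flushed
# to logFile as it arrives; 2>&1 also captures anything sent to stderr.
ssh $remotehost "stdbuf -oL ./server $port" > logFile 2>&1 &
# Give the server a moment to bind the port before sending the request.
sleep 1
ssh $remotehost "nc -q 2 localhost $port < $payload"
An alternative is ssh -t, which allocates a pseudo-terminal so the server keeps line buffering, at the cost of terminal control characters possibly ending up in the log.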

Related

linux expect in background

I use the following expect script to connect to a PBX using telnet:
expect.sh:
#!/usr/bin/expect
spawn telnet [ip] 2300
expect -exact "-"
send "SMDR\r";
expect "Enter Password:"
send "PASSWORD\r";
interact
and created another script to redirect the result to a file:
#!/bin/bash
./expect.sh | tee pbx.log
I'm trying to run expect.sh at boot time, so I added it to systemd. When I add it as a service in /etc/systemd/system it runs, but I don't get the results in the log file the way I do when I run both scripts manually.
Any idea how I can run it at boot time?
TIA
If you just want to permanently output everything received after providing your password, simply replace your interact with expect eof, i.e. wait for the end-of-file that occurs when the connection is closed by the other end. You will probably also want to disable the default timeout, which would otherwise stop the command after 10 seconds without data:
set timeout -1
expect eof
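Putting it together, the modified expect.sh would look something like this ([ip] and PASSWORD remain the question's placeholders):
#!/usr/bin/expect
set timeout -1          ;# don't stop just because no data arrives for 10 seconds
spawn telnet [ip] 2300
expect -exact "-"
send "SMDR\r"
expect "Enter Password:"
send "PASSWORD\r"
expect eof              ;# print everything until the PBX closes the connection
Run through the second script as before, ./expect.sh | tee pbx.log keeps writing to the log until the remote end hangs up.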

Why does ack over ssh not work?

I've got a simple bash script to remove some folders on a remote server over ssh. It basically does this:
THE_HOST=12.34.56.78
ssh me@$THE_HOST "rm /the/file/path/thefile.zip"
This works perfectly well. Before I do this I often search the contents of the files in a folder for a string using ack:
ack thestring /the/folder/path/
This works perfectly when I ssh into the server and run it, but when I use it in one command it doesn't work:
ssh me@$THE_HOST "ack thestring /the/folder/path/"
This seems to freeze or run forever: I get no output and the command never ends. Does anybody know why this doesn't work for ack?
It could be that ack behaves differently depending on whether it is run in a terminal. Try using ssh's -t argument:
ssh -t me@$THE_HOST "ack thestring /the/folder/path/"
When ack detects that stdin is not a terminal (a tty device), it attempts to read the text to search from stdin instead of from the given file/folder. That's what happens when you run it through ssh: stdin is connected to the ssh connection, which does not look like a terminal (tty) to ack.
The -t argument makes ssh allocate a tty and connect it to stdin/stdout of the program you run; ack then thinks it is running in a terminal and uses the file/folder argument for searching.
See http://github.com/beyondgrep/ack2/issues/659
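You can watch the same tty detection happen from plain bash, without involving ack (the shell's -t test checks whether a file descriptor is a terminal):
# Without -t, ssh leaves the remote stdin as a pipe: the test fails.
ssh me@$THE_HOST '[ -t 0 ] && echo tty || echo "not a tty"'
# With -t, ssh allocates a pseudo-terminal: the test succeeds.
ssh -t me@$THE_HOST '[ -t 0 ] && echo tty || echo "not a tty"'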

prevent output from script in xinetd service

I have a bash script that starts Xvnc after doing some other processing, and it's launched from an xinetd service. However, the script indirectly writes some text to stdout and stderr, which gets sent back to the connecting client.
Is there some way to tell xinetd to ignore any output from the script and just let Xvnc take over the connection?
(I assume Xvnc somehow takes over the socket from xinetd, rather than just using stdout to communicate with xinetd?)
Put the following lines at the beginning of the script:
exec >/dev/null
exec 2>/dev/null
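Equivalently in one line, and a sketch of where it sits in the script (the Xvnc invocation is a placeholder; only the exec redirections come from this answer):
#!/bin/bash
# Discard stdout and stderr before anything can write to the client's socket.
exec >/dev/null 2>&1
# ... other processing that may print to stdout/stderr ...
# fd 0 (stdin) still holds the socket xinetd passed in, so Xvnc can take the
# bidirectional connection over from there; the redirects above don't touch it.
Xvnc ...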

SSH command within a script terminates prematurely

From myhost.mydomain.com, I start an nc listener, then log in to another host to start a netcat push back to my host:
nc -l 9999 > data.gz &
ssh repo.mydomain.com "cat /path/to/file.gz | nc myhost.mydomain.com 9999"
These two commands are part of a script. Only 32K bytes are sent before the ssh command terminates; the nc listener then gets an EOF and terminates as well.
When I run the ssh command on the command line (i.e. not as part of the script) on myhost.mydomain.com the complete file is downloaded. What's going on?
I think there is something else happening in your script that causes this effect. For example, if you run the second command in the background as well and the script then exits, your OS might kill the background commands during script cleanup.
Also look for set -o pipefail (bash has no pipebreak option): it makes a pipeline's exit status reflect a failing command instead of only the last one, and combined with set -e it aborts the script when any stage of a pipeline fails.
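For illustration, pipefail by itself only changes the pipeline's exit status; it is set -e that then stops the script:
set -e -o pipefail
false | cat            # with pipefail the pipeline exits 1 instead of cat's 0
echo "never reached"   # set -e has already aborted the script here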
On a second note, the approach looks overly complex to me. Try to reduce it to
ssh repo.mydomain.com "cat /path/to/file.gz" > data.gz
(ssh connects the remote command's stdout to its own local stdout). It's clearer when you write it like this:
ssh > data.gz repo.mydomain.com "cat /path/to/file.gz"
That way, you can get rid of nc. As far as I know, nc is synchronous, so the second invocation (which sends the data) should only return after all the data has been sent and flushed.

Opening an ssh connection and keeping it open on startup

I need to open an ssh connection on login and keep it open, but not actually do anything with it. It would be best if all of it ran in the background.
I created an Automator application and made it run a shell script in bash. The script looks as follows:
sshpass -p 123456 ssh 123456@123.123.123.123
If I try to run the application I keep getting an error message; however, if I execute the exact same script in a terminal it works just fine.
Is there any way I can open that connection with an Automator application and keep it in the background?
You can send keep-alive packets to stop the connection from closing.
In your ~/.ssh/config, add the following:
Host *
ServerAliveInterval 300
ServerAliveCountMax 2
What this says is that every 300 seconds a null (keep-alive) packet is sent, and ssh gives up after 2 unanswered probes.
Source: http://patrickmylund.com/blog/how-to-keep-alive-ssh-sessions/
Do you really need to involve Automator at all?
Just save the script (say, foo.sh) in a folder with the same name as the script (i.e. foo.sh as well).
Put this folder in /System/Library/StartupItems/ and it will run when you start up your machine.
