Cannot successfully disconnect from remote machine using 'nohup' or 'screen' - bash

I am trying to do some work on a remote machine and then disconnect without terminating that work. I have tried both nohup and screen, but without success: as soon as I type exit to log out, my work terminates immediately.
I am trying to run 108 simulations on a remote machine. For that purpose I have written a script named batch.sh which runs one simulation after the other until all 108 are done. The program that actually runs a simulation launches 5 programs in 5 different terminals (using xterm -e). I run batch.sh using:
nohup bash batch.sh &
As long as I am connected everything works just fine. If I disconnect and then reconnect to check whether everything is working as it should...no joy :(
Are there any caveats I am overlooking? Possibly because my program launches other programs in external terminals?
UPDATE
If I follow the suggestions of adding -oForwardX11=no to ssh and unsetting DISPLAY before launching my script, I get this output:
nohup: ignoring input and appending output to nohup.out
In nohup.out I have these messages:
xterm Xt error: Can't open display:
xterm: DISPLAY is not set

Apparently your script/program is trying to launch xterm on its own. These days many systems enable X11 forwarding for their SSH client by default - as a result the DISPLAY variable is set in your shell session but becomes invalid once you disconnect. Therefore, as long as you are connected to the remote system, the xterm processes can access the X server on your local machine through the SSH connection, but die once that connection is severed.
I have occasionally encountered the same issue with Java programs that use e.g. the Java AWT subsystem to generate image files, even when there is no actual graphical window. You should first see if your program will somehow adapt if there is no X server available. One option is to disable X11 forwarding with the -oForwardX11=no option to ssh:
$ ssh -oForwardX11=no user@server.host.name
You could also try unsetting the DISPLAY environment variable before starting your script and see what happens.
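For example, reusing the invocation from the question:
$ unset DISPLAY
$ nohup bash batch.sh &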
However, if your program is launching xterm windows indiscriminately then you'd have to make it use, for example, an output file on the server instead - by modifying it, if necessary. As an added advantage, you would get rid of the network load and timing overhead involved with forwarded X connections.
If you cannot change the way your program works and you do not actually care about the output in those xterm windows, then you could try launching a virtual framebuffer X server on the remote system and have your script use that for xterm.
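A rough sketch of that last approach (the display number :99 is arbitrary, and this assumes the Xvfb package is installed on the remote system):
$ nohup Xvfb :99 > /dev/null 2>&1 &
$ export DISPLAY=:99
$ nohup bash batch.sh &
The xterm windows then open on the in-memory display :99, which survives your disconnection, although you can no longer watch their output directly.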

Related

Bash script to send commands to remote ssh session

Is it possible to write a bash script that opens a remote node (i.e. through ssh and/or slurm) and starts an interactive session there after running some commands? I'm trying to automate the process of starting a jupyter session on a remote computing cluster, which currently looks like this:
ssh into a login node of the remote cluster, using a specific port
use slurm to request an interactive session on one of the compute nodes, including x11 forwarding through that port
change directory to the working directory
activate conda environment for my project
open jupyter from the command line, specifying the port I used previously
It's a lengthy process, and if I get something wrong at any step I usually have to go back and start from the beginning because the port I'm using is still tied up. So I'm looking for a way I can run a single script (possibly with arguments) from my local machine that jumps through all the hoops to get me a working jupyter session with a link I can paste to my browser.
Like @Diego Torres Milano said, you would need to write a script locally that can handle the interactive part, then invoke it via a remote script.
But since your process is interactive, this gets tricky. Luckily, Linux has a tool called expect, easily installed via a package manager, which lets you script multi-step interactive sessions.
So you would write an expect script which "expects" certain prompts; it can then read those prompts and use conditional logic to respond to them appropriately.
Once you have this written and it works locally, it's just a matter of executing it on the remote server via ssh:
ssh user@12.34.56.78 /path/to/script.ex
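For illustration, here is a very rough sketch of such a script for the workflow in the question, written as a bash wrapper around an expect heredoc; the hostname, prompts, slurm options, environment name, and port are placeholders, not details from the question:
#!/bin/bash
# Hypothetical sketch: drive the interactive steps with expect.
# Assumes expect is installed and SSH keys are set up for the login node.
expect <<'EOF'
set timeout 300
spawn ssh -L 8888:localhost:8888 user@login.cluster.example.com
expect "$ "
send "srun --pty --x11 bash\r"
expect "$ "
send "cd ~/myproject && conda activate myenv\r"
expect "$ "
send "jupyter notebook --no-browser --port=8888\r"
# Hand control back so the jupyter URL can be copied from the terminal.
interact
EOF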

Redirect xterm to a background for a headless machine

I have an application that launches xterm and dumps UART logs. I am able to see it launch and dump the logs in the GUI. However, in a remote session I want the xterm output to run as a background process somewhere, so that I can switch back and forth within a single terminal.
Using the GUI: (screenshot of the xterm window showing the logs)
Using a remote terminal (SSH):
$ xterm
xterm: Xt error: Can't open display: :0
I tried something like the following, but it didn't work:
alias xterm="/bin/bash -c"
I don't want to have X forwarding and launch a window on my local machine as well.
If you just need the logs, you most likely don't need an X server or xterm.
You can simply run the target command itself. From your screenshot it looks like the command might be telnet 127.0.0.1 <port_number>. You can find it in the script that your application launches, or with ps -ef while it's running. If it's a UART, you can also use minicom or socat to connect directly to the serial port without any extra programs. This way, you don't even need telnet.
You can combine this command with either screen or tmux so that it runs in the background and you can switch to it from any terminal or console. Just run screen with no arguments, then run the command inside the virtual screen. Detach with Ctrl-a d, and your command will continue to run in the background, ready for you to reconnect to it at any time with screen -r.
Moreover, screen can also connect to the serial port directly, so you get two for the price of one.
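A quick sketch of that workflow (the telnet port is whatever your application's script actually uses):
$ screen                          # start a new session
$ telnet 127.0.0.1 <port_number>  # inside screen: the command your app would run
  ... detach with Ctrl-a d ...
$ screen -r                       # later, from any terminal: reattach
Or let screen talk to the serial port itself and skip telnet entirely:
$ screen /dev/ttyUSB0 115200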
The thing with xterm is that it will not write the logs anywhere except the graphics buffer, and even there only as pixels, which is not suitable for any processing. If you insist on going that way, you have several options:
Change the script that the application runs (might not be possible depending on your situation)
Replace /usr/bin/xterm with a dummy script that just runs bash instead of xterm and redirects the output to a file (ugly, but you could probably avoid breaking other applications by putting it somewhere else and changing PATH). In your script, you can use bash's redirection features such as >, or pipe output to tee - see the sketch after this list.
Start a VNC server in the background and set the DISPLAY environment variable to the number of the virtual display when you run your application. Any windows from the application will then open on the VNC virtual screen and you can connect to it as you please.
Use Xvfb as a dummy X server and combine it with xterm logging, etc.
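Here is a minimal sketch of the dummy-script option; the log file location and the way the -e argument is handled are assumptions about how your application invokes xterm:
#!/bin/bash
# Hypothetical stand-in for xterm: place it earlier in PATH than /usr/bin/xterm.
# It runs the command given after -e in the current session and appends the
# output to a log file instead of opening a graphical window.
logfile="$HOME/xterm-dummy.log"

cmd=()
while (($#)); do
    if [[ "$1" == "-e" ]]; then
        shift
        cmd=("$@")        # everything after -e is the command to run
        break
    fi
    shift                 # skip xterm's own options (-geometry, -title, ...)
done

if ((${#cmd[@]} == 0)); then
    cmd=(bash)            # a bare "xterm" would normally just start a shell
fi

"${cmd[@]}" 2>&1 | tee -a "$logfile"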
Solution 1: Fake xterm on X11-less systems
You can also create a wrapper that replaces the xterm command with a shell function. Test this out on a laptop with X11:
$ function xterm {
echo "hello $#"
}
$ xterm world 1
hello world 1
$ export -f xterm
$ /bin/xterm # opens a new xterm session
$ xterm world 2 # commands executed in second terminal
hello world 2
This means that you've replaced the xterm command with a function in all of the child processes.
Now, if you already know that your script will work in a terminal without xterm, you can create a function that accepts all of the parameters and executes them. No need for complicated screen stuff or for replacing /usr/bin/xterm.
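A minimal sketch of such a function, assuming the application calls xterm -e <command> (an assumption about how it launches xterm):
function xterm {
    # Hypothetical replacement: drop the -e flag if present and run the
    # command directly in the current terminal instead of a new window.
    if [[ "$1" == "-e" ]]; then
        shift
    fi
    "$@"
}
export -f xterm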
Solution 2: Dump UART data for the winz
If you want to save all of the UART data to a file, this is easily done by creating a screen session with a log file. The command below creates a session named myscreensessionname that listens on the serial connection /dev/ttyUSB0 and writes its data to /home/$USER/myscreensessionname.log.
$ screen -dmS myscreensessionname -L -Logfile /home/$USER/myscreensessionname.log /dev/ttyUSB0 115200
Note that if you're going to use multiple screen sessions, you might want to use serial IDs instead of /dev/ttyUSB0. You can identify the connections with udevadm as follows.
$ udevadm info --name=/dev/ttyUSB0 | grep 'by-id'
S: serial/by-id/usb-FTDI_TTL232R-3V3_FTBDBIQ7-if00-port0
E: DEVLINKS=/dev/serial/by-id/usb-FTDI_TTL232R-3V3_FTBDBIQ7-if00-port0 /dev/serial/by-path/pci-0000:00:14.0-usb-0:4.4.4.1:1.0-port0
Here, instead of /dev/ttyUSB0, I would make use of /dev/serial/by-id/usb-FTDI_TTL232R-3V3_FTBDBIQ7-if00-port0.
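So the earlier command would become, for example:
$ screen -dmS myscreensessionname -L -Logfile /home/$USER/myscreensessionname.log \
    /dev/serial/by-id/usb-FTDI_TTL232R-3V3_FTBDBIQ7-if00-port0 115200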
EDIT:
You can reattach to the screen session with the following command. Once in the screen session, press Ctrl-a, then d to detach.
$ screen -Dr myscreensessionname
To view all of your screen sessions:
$ screen -list
There is a screen on:
2382.myscreensessionname (04/02/2021 10:32:07 PM) (Attached)
1 Socket in /run/screen/S-user.

Is there a way to get my laptop to beep from within a bash script running on a remote server via SSH?

I have a bash script that I have to regularly run on a remote server. Part of the script includes running a backup which takes a while, and after it has run, I have to hit "Y" to confirm that the backup worked before the script will continue.
I would like to know if there is a way to get my laptop to make a beep (or some sort of sound) when that happens. I know that echo -e '\a' makes a beep, but if I run it from within a script on the remote server, the beep happens on the remote server.
I have control of the script that is being run, so I could easily change it to do something special.
You could send the command through ssh back to your own computer, like:
ssh user@host "echo -e '\a'"
Just make sure you have SSH key authentication from your server to your computer so the command can run smoothly.
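For example, since you control the script, you could add something like this right before the confirmation prompt; the laptop address is a placeholder, and this assumes your laptop runs an SSH server reachable from the remote machine:
# Hypothetical addition to the remote script; run_backup_step stands in
# for whatever your backup command actually is.
run_backup_step
ssh you@your-laptop.example.com "printf '\a'"   # bell rings on the laptop
read -p "Backup finished - continue? [Y/n] " answer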
In my case the echo-based solutions didn't work. I'm using a MacBook and connecting to an Ubuntu system. I keep the terminal open and I'd like to be informed when a long-running bash script is done.
What I did notice is that if I shut down the remote system, the MacBook beeps and shows an alarm icon on the relevant tab. So I have now implemented a somewhat dirty workaround:
sudo shutdown 1440 && shutdown -c
This schedules the system to shut down and immediately cancels the request, and I still get the alarm beep + icon. You will need to set up sudo to allow the user to run shutdown. As it was my own remote server this was no problem, but it could limit the usability for others.

Bash: Script output to terminal session stops but script finishes normally

I'm opening an ssh session to a remote server and executing a larger (around 1000 lines) bash script on the remote machine. It involves several very CPU-intensive calls which run for up to three minutes each. To track the script's progress, it echoes messages placed at several points in the script.
In general the script runs smoothly. From time to time the script runs through (the resulting file on the remote machine is correct) but the output to the terminal stops. Ctrl-C doesn't help, there is no prompt, just a frozen session. top in a separate session shows normal execution of the script.
My question: how do I keep the session alive?
local machine:
$ sw_vers
ProductName: Mac OS X
ProductVersion: 10.9
BuildVersion: 13A603
remote machine:
$ lsb_release -d
Description: Ubuntu 12.04.3 LTS
Personally, I would recommend using screen or tmux on the remote terminal for exactly this reason.
Those apps will allow the remote process to continue even if your local SSH session times out.
http://www.bangmoney.org/presentations/screen.html
http://tmux.sourceforge.net/
Start a screen on the remote machine and run your command from it:
screen -S largeScript
And then
./yourLargeScript.sh
Whenever your ssh session freezes, you can kill it with the SSH escape sequence ~. (tilde followed by a dot).
If you ssh again, you can grab back your screen by:
screen -dr largeScript
Make it log to a file instead (perhaps via syslog), and tail that file from wherever is convenient for you. This also helps detach the script so you can run it headless, from a cron job, etc. Also, if the log file has read access for others, they too can monitor it.
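A minimal sketch of that approach (the log path is arbitrary):
# On the remote machine: run detached, sending all output to a log file.
$ nohup ./yourLargeScript.sh > /tmp/yourLargeScript.log 2>&1 &
# From any session, local or remote, follow the progress:
$ tail -f /tmp/yourLargeScript.log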

Running a Linux script remotely from Windows and getting the exit code

I have the following scenario to deal with:
I have to schedule the backup of my company's Linux-based server (under Suse Linux) with ARCServe R15 (installed on Windows 2003R2SP2).
I know I have the ability in my backup software (ARCServe) to add pre/post execution scripts to my backup-jobs.
If the script fails, ARCServe is configured NOT to run the backup job; if it succeeds, the job runs. I have no problem with this.
The problem is that I want to make a Windows script (to be launched by ARCServe) which executes a Linux script on the cluster:
- If the Linux script fails, I want my Windows script to fail, so my backup job in ARCServe won't run.
- If the Linux script succeeds, I want my Windows script to end normally with exit code 0, so my ARCServe job runs normally.
I've tried creating this batch file (let's call it HPC.bat):
echo ON
start /wait "C:\Program Files\PUTTY\plink.exe" -v -l root -i "C:\IST\admin\scripts\HPC\pri.ppk" [cluster_name] /appli/admin/backup_admin
exit %errorlevel%
If I manually launch this .bat by double-clicking it, or by running it in a command prompt under Windows, it executes normally and then ends.
If it is launched by ARCServe, the script never seems to end.
My job stays in "waiting" status; it seems the exit code of the Linux script isn't returned to my batch file, and the batch file doesn't close.
In my mind, what's happening is that plink just opens the connection to the Linux machine, sends the script execution command, and then closes the connection, so the exit code can't be returned to the batch file. Am I right?
Is what I want to do possible, or am I attempting the impossible?
So, do I have to proceed differently?
Do I have to use PuTTY or Cygwin instead of plink?
Please help, it's giving me headaches...
If you install Cygwin, you can do it exactly the way you would from Linux to Linux, i.e. remotely run a command with ssh someuser@remoteserver.com somecommand.
This command returns on the calling client with the same exit code that the command exited with on the remote end. If you use SSH keys for authentication instead of passwords, it can also be scripted without user interaction.
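A rough sketch of what the wrapper could look like under Cygwin; the key path is an assumption, since OpenSSH cannot read a PuTTY .ppk file directly and you would first export the key in OpenSSH format with PuTTYgen:
#!/bin/bash
# Hypothetical Cygwin wrapper launched by ARCServe: run the remote backup
# script and exit with the same code it returned.
ssh -i /cygdrive/c/IST/admin/scripts/HPC/pri_openssh_key \
    root@cluster_name /appli/admin/backup_admin
exit $?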
