How to escape pipes in .screenrc for commands to run at startup? - bash

I'm using byobu/screen, and I would like to have a new screen session default to containing a few windows set up specially for tailing a specific log file.
My .screenrc looks something like this (technically this is my .byobu/windows file):
chdir /home/matt/code/project/logs
screen -t 'logs' tail -F current.log
chdir /home/matt/code/project
screen -t 'errors' tail -F current.log | grep -A 3 "ERROR"
chdir /home/matt/code/project
screen -t 'project'
chdir
screen -t 'bash'
My intention is to set up four windows in the new screen session:
A window titled "logs" which tails the current.log file
A window titled "errors" which tails the current.log file and greps for ERROR
A window titled "project" which starts in my project's main directory
A window titled "bash" which starts in my home directory.
However, the pipe in the screen -t 'errors' tail -F current.log | grep -A 3 "ERROR" command ends up being interpreted by screen literally, and thus my second window never appears.
How can I escape the pipe in this command to have it interpreted as I wish?
Furthermore, is there an easier way to set up screen/byobu to launch windows that run (complicated) commands at startup?

I ended up solving this by using the stuff command to simulate entering a command in the window and pressing enter to execute it. This has the nice effect of making it possible to break out of the tail command in the screen window without also killing the window itself.
Here is an example of what my .screenrc looks like to accomplish this; I've written up a longer explanation on my blog:
screen -t 'errors'
stuff 'tail -F /var/ec/current.log | grep -A 3 "ERROR"^M'
(the ^M is entered by pressing Ctrl+V, Enter with your keyboard, not by actually typing caret and uppercase M)

The following works for me:
screen -t errors bash -c "tail -F current.log | grep -A 3 ERROR"
The use of bash (or another shell) is required to prevent screen from giving a "file not found" error, which is what happens if bash -c is removed from the line above.

You should be fine creating a custom script and using it in your .screenrc - so you would have screen -t 'errors' ./bin/current.log.sh
with the tail -F ... pipeline inside current.log.sh.
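A minimal sketch of what such a helper script could look like (the script name, log path, and sample contents are illustrative assumptions). A temporary sample file stands in for the live log so the pipeline can be tried end to end; for real use you would follow the log with tail -F instead:

```shell
#!/usr/bin/env bash
# current.log.sh - sketch of a helper script launched from .screenrc as:
#   screen -t 'errors' ./bin/current.log.sh
# A temporary sample log stands in for the real one here;
# for live following, replace "tail -n 10" with "tail -F".
LOG=$(mktemp)
printf 'starting up\nERROR: disk full\ndetail 1\ndetail 2\n' > "$LOG"
tail -n 10 "$LOG" | grep -A 3 "ERROR"
rm -f "$LOG"
```

Because the pipe lives inside the script, screen only has to exec one program and never sees the `|` at all.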

Related

How to capture the output of a bash command which prompts for a user's confirmation without blocking the output nor the command

I need to capture the output of a bash command which prompts for a user's confirmation without altering its flow.
I know only 2 ways to capture a command output:
- output=$(command)
- command > file
In both cases, the whole process is blocked without any output.
For instance, without --assume-yes:
output=$(apt purge 2>&1 some_package)
I cannot print the output back because the command is not done yet.
Any suggestion?
Edit 1: The user must be able to answer the prompt.
EDIT 2: I used dash-o's answer to complete a bash script allowing a user to remove/purge all obsolete packages (which have no installation candidate) from any Debian/Ubuntu distribution.
To capture partial output from a command that is waiting for a prompt, one can tail a temporary file, potentially with tee to keep the output flowing to the terminal if needed. The downside of this approach is that stderr needs to be tied to stdout, making it hard to distinguish between the two (if that is an issue).
#! /bin/bash
log=/path/to/log-file
echo > "$log"
(
while ! grep -q -F 'continue?' "$log" ; do sleep 2 ; done
output=$(<"$log")
echo do-something "$output"
) &
# Run the command with output to the terminal
apt purge some_package 2>&1 | tee -a "$log"
# If output to the terminal is not needed, replace the command above with:
# apt purge some_package > "$log" 2>&1
There is no generic way to tell (from a script) when exactly a program prompts for input. The above code looks for the prompt string ('continue?'), so this will have to be customized per command.
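As a hedged sketch of the detection idea, with a stand-in function in place of apt purge (the prompt text, function name, and log path are all assumptions for illustration):

```shell
#!/usr/bin/env bash
# fake_cmd stands in for "apt purge some_package": it prints some
# output and then a confirmation prompt, as apt would.
fake_cmd() {
  echo "The following packages will be REMOVED: some_package"
  echo "Do you want to continue? [Y/n]"
}
log=$(mktemp)
# Stream the command's output to the terminal while copying it to the log.
fake_cmd 2>&1 | tee -a "$log"
# A watcher can now detect the prompt in the captured output.
if grep -q -F 'continue?' "$log"; then
  echo "prompt detected"
fi
rm -f "$log"
```

The same grep is what the background loop in the script above polls for; only the prompt string has to be customized per command.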

for-Loop in screen does not work

I would like to use screen to stay attached to a loop command on a ssh session, which is most likely going to run for a couple of hours. I am using screen because I fear that my terminal will get disconnected while the command is still running. This is the loop-command:
for i in *; do echo $i/share/sessions/*; done
(echo will be replaced by rm -rf).
I have tried multiple variants of screen 'command ; command ; command', but never got it working. How can I fix this? Alternatively, could you suggest a workaround for my problem?
Screen can be used for long-running commands like this:
$ screen -S session_name
# inside the screen session
$ <run long running command>
# press Ctrl-a d to detach from the screen session
# outside the screen session, reattach to the previously created session:
$ screen -r session_name
For more details, look at the man page of screen.
Another application that works in a similar way and is very popular is tmux.
I assume that you're trying to run:
screen 'for i in *; do echo $i/share/sessions/* ; done'
This results in a Cannot exec [your-command-here]: No such file or directory because screen doesn't implicitly start a shell; rather, it calls an execv-family syscall to directly invoke the program named in its argument. There is no program named for i in *; do echo $i/share/sessions/*; done, and no shell running which might interpret that as a script, so this fails.
You can, however, explicitly start a shell:
screen bash -c 'for i in *; do echo $i/share/sessions/* ; done'
By the way -- running one copy of rm per file you want to delete is going to be quite inefficient. Consider using xargs to spawn the smallest possible number of instances:
# avoid needing to quote and escape the code to run by encapsulating it in a function
screenfunc() { printf '%s\0' */share/sessions/* | xargs -0 rm -rf; }
export -f screenfunc # ...and exporting that function so subprocesses can access it.
screen bash -c screenfunc
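The export -f mechanism that makes this work can be smoke-tested with a harmless function (the function name and message here are illustrative):

```shell
#!/usr/bin/env bash
# Define a function, export it, and call it from a child bash -
# the same mechanism the screenfunc example above relies on.
myfunc() { echo "ran in child: $1"; }
export -f myfunc
bash -c 'myfunc hello'   # prints: ran in child: hello
```

Without the export -f line, the child bash would fail with "myfunc: command not found", which is exactly why the function is exported before screen launches its own bash.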
There is no need really for screen here.
nohup rm -vrf */share/sessions/* >rm.out 2>&1 &
will run the command in the background, with output to rm.out. I added the -v option so you can see in more detail what it's doing by examining the tail of the output file. Note that the file won't be updated completely in real time due to buffering.
Another complication is that the invoking shell will do a significant amount of work with the wildcard when it sets up this job. You can delegate that to a subshell, too:
nohup sh -c 'rm -rvf */share/sessions/*' >rm.out 2>&1 &

copying last bash command into clipboard

I realized that I've spent more time on this issue than necessary, hence the question.
Sometimes I need to save the last typed shell command into the clipboard.
I can do something like this:
echo !! | xsel --clipboard
Which works successfully.
But when I try to alias the above command:
alias echoxs='echo !! | xsel --clipboard'
Things do not work as expected. In particular, the clipboard contents become literally !!. Obviously, I am missing something about how bash preprocesses commands and aliases. My hope was that an alias, as is intuitive, would be something like a C macro, and that typing the alias would be equivalent to typing its target.
I've tried other approaches and none seem to work. Using HISTFILE inside a script does not work because either commands are cached by the shell session and not immediately written to the file, or multiple terminals mess with the file such that the last command in the file is not always reliably the last command in the current session.
alias echoxs='history 1 | xsel --clipboard'
Almost works, except everything fails when attempting to modify (e.g. with cut or sed) the output of history, because it is a built-in command.
Is there a way to get the shell's last command through sane stdout?
I'm not sure I understand what you said about "failing when attempting to modify the output of history", so I hope my solution suits you. I'm using fc to get the last command:
fc -ln -1 | xsel --clipboard
Here is the meaning of the options:
-l lists the commands on standard output
-n hides the history line numbers
-1 selects the last command in the history
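The fc -ln -1 invocation can be tried outside an interactive session by enabling history in a script (a hedged sketch; interactively you would just pipe fc -ln -1 to xsel directly):

```shell
#!/usr/bin/env bash
# Enable command history in this non-interactive shell so fc has
# something to look at, run a command, then print it back with fc.
set -o history
echo "hello from history" > /dev/null
fc -ln -1    # prints the previous command line
```

In a real terminal the history is already enabled, so only the final line is needed.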
Client: pass the option -XY to the ssh command to enable (trusted) X11 forwarding for this session:
ssh -XY USER@IP
Server: check /etc/ssh/sshd_config to make sure X11 forwarding is enabled on server
X11Forwarding yes
yum install xclip -y
echo `hostname -I` `hostname` >> /etc/hosts
echo "alias cplastcmd='history 2 | cut -c 8- | head -n 1 | xclip -selection clipboard'" >> ~/.bashrc
Restart bash and type cplastcmd to copy last bash command to clipboard via X11.

How do I get the command history in a screen session using Bash?

If I start a screen session with screen -dmS name, how would I access the command history of that screen session with a script?
Using the ↑, the last executed command appears, even in screen.
I use the default bash shell on my system, so this might not work with other shells. This is what I have in my ~/.screenrc file so that each new screen window gets its own command history:
Default Screen Windows With Own Command History
To open a set of default screen windows, each with their own command history file, you could add the following to the ~/.screenrc file:
screen -t "window 0" 0 bash -ic 'HISTFILE=~/.bash_history.${WINDOW} bash'
screen -t "window 1" 1 bash -ic 'HISTFILE=~/.bash_history.${WINDOW} bash'
screen -t "window 2" 2 bash -ic 'HISTFILE=~/.bash_history.${WINDOW} bash'
Ensure New Windows Get Their Own Command History
The default screen settings mean that you create a new window using Ctrl+a c or Ctrl+a Ctrl+c. However, with just the above in your ~/.screenrc file, these will use the default ~/.bash_history file. To fix this, we will overwrite the key bindings for creating new windows. Add this to your ~/.screenrc file:
bind c screen bash -ic 'HISTFILE=~/.bash_history.${WINDOW} bash'
bind ^C screen bash -ic 'HISTFILE=~/.bash_history.${WINDOW} bash'
Now whenever you create a new screen window, it's actually launching a bash shell, setting the HISTFILE environmental variable to something that includes the current screen window's number ($WINDOW).
Command history files will be shared between screen sessions with the same window numbers.
Write Commands to $HISTFILE on Execution
As is normal bash behavior, the history is only written to the $HISTFILE file upon exiting the shell/screen window. However, if you want commands to be written to the history files as soon as they are executed, and thus be available immediately to other screen sessions with the same window number, you could add something like this to your ~/.bashrc file:
export PROMPT_COMMAND="history -a; history -c; history -r; ${PROMPT_COMMAND}"
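The effect of history -a (the first step of that PROMPT_COMMAND) can be sketched in a throwaway shell; the temp-file HISTFILE is an assumption for demonstration:

```shell
#!/usr/bin/env bash
# Point HISTFILE at a temp file, enable history, run a command,
# then append the session's new history lines with history -a.
export HISTFILE=$(mktemp)
set -o history
echo "demo command" > /dev/null
history -a
grep -c "demo command" "$HISTFILE"   # the command was written immediately
rm -f "$HISTFILE"
```

history -c then clears the in-memory list and history -r re-reads the file, which is how other sessions' freshly appended commands become visible.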
screen doesn't maintain a history of the commands you type. Your shell may or may not keep a history. Since you appear to use bash, you can use the history command.
screen does appear to have a crude approximation of a history search (it merely searches the scrollback buffer for a command line). See the screen man page under the "history" command (bound to C-a { by default).
@technosaurus is right. $HISTFILE is written when bash exits, so you could exit one bash session and start a new one, and the history should have been preserved through the file.
But I think there is a better way to solve your problem. The bash manual includes a description of the history built-in command. It allows you to save this history with history -w [filename] and read the history with history -r [filename]. If you don't provide a filename, it will use $HISTFILE.
So I would propose that you save the history inside your screen session to a specific file (or to your default $HISTFILE if you want). Then read the history file in the other bash session you want to access the history from. This way you don't have to exit the original bash session.
When you exit a terminal (or shell) the shell writes its history to $HISTFILE, so to get its history in another terminal you can type exit in the terminal you want the history of and it will get written.
cat $HISTFILE
#or tac, less, $EDITOR, ... depending on how you want to "access" it
Use this:
screen -L
with a capital L.
It will store a copy of the terminal input and output in a file named screenlog.0,
or, if you use -S to name the session, the log file gets the screen name.
I put the next lines into my .bashrc:
case "$TERM" in
screen)
    declare SCREEN_NAME=$(echo $STY | sed -nr 's/[^.]*\.(.*)/\1/p')
    if [[ $SCREEN_NAME ]]; then
        HISTFILE="${HISTFILE}.${SCREEN_NAME}.${WINDOW}"
        declare -p HISTFILE
    fi
    unset SCREEN_NAME
    ;;
*)
    ;;
esac
My default .bashrc has this 'case' with 'xterm*|rxvt*)' values, so I only added my 'screen' part to it. If you don't have this 'case', you can use the following instead:
if [[ $TERM == screen ]]; then
    declare SCREEN_NAME=$(echo $STY | sed -nr 's/[^.]*\.(.*)/\1/p')
    if [[ $SCREEN_NAME ]]; then
        HISTFILE="${HISTFILE}.${SCREEN_NAME}.${WINDOW}"
        declare -p HISTFILE
    fi
    unset SCREEN_NAME
fi
After this, each window of every screen session has its own bash history.
Note: this does not work in a chroot!
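The sed expression in the snippet strips the leading pid from $STY, which inside screen looks like "pid.session_name". A quick check with a hard-coded value (the value itself is illustrative):

```shell
#!/usr/bin/env bash
# $STY inside screen looks like "12345.mysession"; keep only the name.
STY="12345.mysession"
SCREEN_NAME=$(echo "$STY" | sed -nr 's/[^.]*\.(.*)/\1/p')
echo "$SCREEN_NAME"   # -> mysession
```

The -n/p pair means nothing is printed unless the substitution matched, so SCREEN_NAME stays empty outside a screen session and the HISTFILE rewrite is skipped.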
history will show all the commands in the history.

shell: how to make tail of a file running in background

I want to run a few tasks in shell.
tail a file into a new file: for example: tail -f debug|tee -a test.log
at the same time, run other tasks.
My question is: how to make the command tail -f debug|tee -a test.log run in background, so that I can run other tasks then in shell script?
You don't need tee at all for this, just use the shell's built-in append operator:
tail -f debug >> test.log &
The trailing & backgrounds the command, as normal in the shell. You only need tee to send the output to a file as well as to standard output, which probably isn't what you want for a background job.
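The ampersand mechanics can be sketched with a harmless stand-in for the tail pipeline (file names are assumptions):

```shell
#!/usr/bin/env bash
# Append to a log in the background; the shell returns immediately
# so other tasks can run, and wait joins the background job at the end.
log=$(mktemp)
( echo "line from background job" >> "$log" ) &
echo "shell is free to run other tasks"
wait          # block until the background job finishes
cat "$log"
rm -f "$log"
```

A real tail -f job never exits on its own, so instead of wait you would leave it running and manage it later with jobs, fg, or kill.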
Normally you just use an ampersand after the command if you want to background something.
tail -f debug|tee -a test.log &
Then you can bring it back to the foreground later by typing fg. Did this answer your question or have I missed what you were asking?
The simple way to do this is:
screen -R -D
tail -f debug|tee -a test.log
Ctrl-A c
ps ax |grep tail
Ctrl-A [Backspace]
...
Ctrl-A [Spacebar]
screen lets you run multiple terminal sessions on one terminal connection. You switch back and forth with Ctrl-A [Backspace] or Ctrl-A [Space]. To create another separate shell, use Ctrl-A c.
A major benefit of screen is that if the terminal session disconnects, it keeps everything running. Just close the terminal window or disconnect ssh, go to another computer, log in, and run screen -R -D to reconnect to everything that is still running.
If you only occasionally need this, just run tail, type Ctrl-Z, run a command, then fg %1 to bring the tail process back into the foreground, or bg %1 to make it run in the background permanently. If you do use Ctrl-Z, then the jobs command shows all of your detached jobs.
