fish shell login commands keep running in screen or tmux sessions after login

I've just switched to fish shell
and followed the instructions in How do I run a command every login? What's fish's equivalent to .bashrc?
That is, I moved the commands I want to run at login from .bashrc to ~/.config/fish/config.fish.
But now those commands run again whenever I open a screen or tmux session! This never happened with my previous default shell: the commands ran only at login and were never re-run inside a screen session.
How can I avoid this?
Thanks in advance.

You can test the TERM environment variable to see if your shell is running in such a session. Both screen and tmux set it to 'screen' by default (some tmux setups use 'tmux-256color' instead).
if not string match --quiet 'screen*' -- $TERM
    <your startup scripts>
end
Note that other useful indicators are whether a shell is interactive or a login shell. You can use status --is-interactive and status --is-login to check for these two states.
In your specific case, a check for login shell might be what you are looking for:
if status --is-login
    <your startup scripts>
end
See https://unix.stackexchange.com/questions/38175/difference-between-login-shell-and-non-login-shell for an explanation.
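Combining the two checks, a config.fish guard might look like the following sketch (tmux on some setups sets TERM to 'tmux-256color' rather than 'screen', so adjust the pattern to your environment):

```fish
# In ~/.config/fish/config.fish -- run startup commands only in a login
# shell that is not already inside a screen/tmux session.
if status --is-login; and not string match -q 'screen*' -- $TERM
    # login-only startup commands go here
end
```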

Related

source a bash script with anacron

I am learning to automate tasks using anacron by following this anacron guide. My task is to remove the saved ssh keys every day. I know this is possible using the --timeout argument, but I wanted to use a bash script and do it manually.
remove-keys.sh :
SERVICE="ssh-agent"
if pgrep -x "$SERVICE" >/dev/null
then
    /usr/bin/ssh-add -D
else
    :
fi
anacrontab config:
SHELL=/bin/sh
PATH=/sbin:/bin:/usr/sbin:/usr/bin
START_HOURS_RANGE=18-20
1 5 remove-keys source $HOME/.local/etc/cron.daily/remove-keys.sh
When I execute source remove-keys.sh, all identities are removed. I have given the file the necessary execute permissions, and the anacron syntax test was also successful. I used source so that the commands in the script are executed as part of the current bash shell (or bash session).
I tested anacron with the following command:
anacron -fn -t $HOME/.local/etc/anacrontab -S $HOME/.var/spool/anacron
But when I look up ssh-add -L, all identities are still present.
What am I doing wrong?
EDIT 1:
Context:
I am using Ubuntu-20.04 on WSL2. Also, I am persisting the identities by using keychain to reuse ssh-agent (this is necessary when using more than one shell at a time). So, technically, the identities have an infinite timeout until I shut down WSL.
The identities in your SSH agent are specific to your login session. I don't think there is a sane way to use ssh-agent from a cron job.
Trying to manipulate your interactive environment from cron seems doomed anyway. It will fail if you are not logged in when the job runs, and have weird failure modes if you are logged in more than once.
Perhaps instead run, from your desktop environment's login hooks, a simple wrapper script with an endless loop and (say) a five-minute sleep between iterations.
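As a sketch of that suggestion (the file name and interval are assumptions, and the loop is guarded behind a "run" argument so the file can be sourced without blocking):

```shell
#!/bin/sh
# Hypothetical wrapper, e.g. ~/.local/bin/purge-ssh-keys.sh: purge agent
# identities periodically from the login session instead of from anacron.

purge_identities() {
    # Only try the purge when an ssh-agent process is actually running.
    if pgrep -x ssh-agent >/dev/null 2>&1; then
        ssh-add -D
    fi
}

main_loop() {
    # Endless loop with a five-minute sleep between iterations.
    while :; do
        purge_identities
        sleep 300
    done
}

# Start looping only when invoked as "purge-ssh-keys.sh run", so that
# sourcing this file for inspection does not block.
if [ "${1:-}" = "run" ]; then
    main_loop
fi
```

Starting it from a login hook (rather than cron) means it inherits the SSH_AUTH_SOCK of your actual session, which is exactly what the anacron job was missing.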

Safely exec tmux upon login

I would like to execute tmux upon logging into a shell for my user. I am using fish, but I think this question is relevant to any shell. So far, I've accomplished this by following the advice in this question: https://askubuntu.com/questions/560253/automatically-running-tmux-in-fish, specifically, adding the following line to my config.fish:
test $TERM != "screen"; and exec tmux
However, I have one major issue with this approach: if tmux fails to start (perhaps because I've introduced a syntax error in my .tmux.conf file), the shell process immediately exits, booting me out of the session.
Is there a way to automatically run tmux in new shell executions whereby I can:
Catch errors and fallback on a "plain" shell execution (i.e. just fish without tmux)
Not have to exit a login twice - once to quit tmux then again to quit fish
?
I imagine tmux exits with a non-zero (i.e. failing) status if there's configuration errors, so you could presumably ditch the exec and exit manually, like
if test $TERM != "screen"
    tmux
    and exit
end
However, do keep in mind that fish always sources all of its config files, so you'll want to wrap this inside if status --is-login or similar.
This works for me:
if status --is-login
    source $HOME/.config/fish/login.fish
    tmux; and exec true
end
Obviously you may or may not have a login.fish file. I like to keep my config.fish lean by putting code that might not be needed for the current session in separate files, so I've also got an interactive.fish script.

Why does my display=:0.0 setting keep disappearing?

I use the command display=:0.0 in a bash script. When I then call export, I can see the variable is there.
However, after a while it is no longer in the export list and I have to set it again.
Note
There is only 1 session running and I'm doing nothing on the command line. There may be a program running in background.
This is a ssh session (putty).
What could be causing this?
I was logged in with ssh. I realised I sometimes got logged off and back on automatically. The exported variable was then gone because each reconnect is a "new" session. I cannot explain why this happens with PuTTY, though.
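Given that diagnosis, the usual fix is to set the variable in a startup file so every new session gets it again. Note also that the variable X clients actually read is DISPLAY, in uppercase, and that it must be exported (a plain display=:0.0 assignment creates a shell-local variable that child processes never see):

```shell
# Add to ~/.bashrc so every new (re)connected session sets it again.
# X clients read DISPLAY (uppercase), and it must be exported to be
# visible to programs started from this shell.
export DISPLAY=:0.0
echo "$DISPLAY"
```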

How to ssh into a shell and run a script and leave myself at the prompt

I am using Elastic MapReduce from Amazon. I ssh into the hadoop master node and execute a script like:
$EMR_BIN/elastic-mapreduce --jobflow $JOBFLOW --ssh < hivescript.sh. This sshes me into the master node and runs the hive script, which contains the following lines
hive
add jar joda-time-1.6.jar;
add jar EmrHiveUtils-1.2.jar;
and some commands to create hive tables. The script runs fine and creates the hive tables and everything else, but then returns me to the prompt from which I ran the script. How do I stay sshed into the hadoop master node, at the hive prompt?
Consider using Expect, then you could do something along these lines and interact at the end:
/usr/bin/expect <<EOF
spawn ssh ... YourHost
expect "password"
send "password\n"
send "javastuff\n"
interact
EOF
These are the most common answers I've seen (with the drawbacks I ran into with them):
Use expect
This is probably the most well rounded solution for most people
I cannot control whether expect is installed in my target environments
Just to try this out anyway, I put together a simple expect script to ssh to a remote machine, send a simple command, and turn control over to the user. There was a long delay before the prompt showed up, and after fiddling with it with little success I decided to move on for the time being.
Eventually I came back to this as the final solution after realizing I had violated one of the 3 virtues of a good programmer -- false impatience.
Use screen / tmux to start the shell, then inject commands from an external process.
This works ok, but if the terminal window dies it leaves a screen/tmux instance hanging around. I could certainly try to come up with a way to just re-attach to prior instances or kill them; screen (and probably tmux) can make it die instead of auto-detaching, but I didn't fiddle with it.
If using gnome-terminal, use its -x or --command flag (I'm guessing xterm and others have similar options)
I'll go into more detail on problems I had with this on #4
Make a bash script with #!/bin/bash --init-file as the shebang; this will cause your script to execute, then leave an interactive shell running afterward
This and #3 had issues with some programs that require user interaction before the shell is presented. It worked fine with some programs (like ssh); with others (telnet, vxsim) a prompt was presented but no text was passed along to the program, only ctrl characters like ^C.
Do something like this: xterm -e 'commands; here; exec bash'. This will cause it to create an interactive shell after your commands execute.
This is fine as long as the user doesn't attempt to interrupt with ^C before the last command executes.
Currently, the only thing I've found that gives me the behavior I need is to use cmdtool from the OpenWin project.
/usr/openwin/bin/cmdtool -I 'commands; here'
# or
/usr/openwin/bin/cmdtool -I 'commands; here' /bin/bash --norc
The resulting terminal injects the list of commands passed with -I into the program executed (no arguments means the default shell), so those commands show up in that shell's history.
What I don't like is that the terminal cmdtool provides feels so clunky ... but alas.
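For what it's worth, approach #4 can also be exercised non-interactively. Here is a small sketch (the file path is an assumption) showing that bash sources the init file first and only then starts reading commands interactively:

```shell
# Create a throwaway init file standing in for the "script" in approach #4.
cat > /tmp/init-demo.sh <<'EOF'
echo "setup commands run here"
EOF

# bash sources the init file, then reads from stdin as an interactive
# shell; with stdin at EOF it exits right after the setup runs. In real
# use you would attach this to a terminal and be left at the prompt.
bash --init-file /tmp/init-demo.sh -i </dev/null 2>/dev/null
```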

What is common between environments within a shell terminal session?

I have a custom shell script that runs each time a user logs in or an identity is assumed; it has been placed in /etc/profile.d and performs some basic env variable operations. Recently I added some code so that if screen is running it will reattach without my needing to type anything. There are some problems, however. If I log in as root and su - to another user, the code runs a second time. Is there a variable I can set when the code runs the first time that will prevent a second run of the code?
I thought of writing something to disk, but I don't want to prevent the code from running when I begin a new terminal session. Here is the code in question. It first attempts to reattach; if unsuccessful because the session is already attached (as it might be after an interrupted session), it will 'take' the session back.
screen -r
if [ -z "$STY" ]; then
    exec screen -dR
fi
Ultimately this bug prevents me from substituting user to another user, because as soon as I do so, it grabs the screen session and puts me right back where I started. Pretty frustrating.
The parent process (${PPID}) of the shell you get when you su will be the su command itself. So the output of
ps -o command= $PPID
will begin with the letters su, so test for this.
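Putting that together with the original snippet, a guarded version might look like this sketch (the echo stands in for the real exec screen -dR, and the file name is an assumption):

```shell
# e.g. /etc/profile.d/screen-attach.sh (sketch)
# Skip the auto-attach when this shell was started via su, or when we
# are already inside a screen session.
parent=$(ps -o command= $PPID)

case $parent in
    su*)
        # Reached via su: do nothing, let the user work normally.
        ;;
    *)
        if [ -z "$STY" ]; then
            # Not already inside screen; the real script would run:
            #   exec screen -dR
            echo "would exec screen -dR here"
        fi
        ;;
esac
```

Because $STY is set inside screen and the su check covers the substituted-user case, the attach runs once per genuine new terminal session and not again after su.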
I think you can get this right if you read the following post (and the man page for your favourite shell):
Question about login vs profile
