How do I set custom bash prompt / env variables / aliases in a way that the user can restore their previous settings at any point? - bash

So I have written a tool that generates an offline replica of another system. It allows you to run commands that are typically run online, offline, by specifying a series of .mgmt and .sock files in the command.
However, I want users to be able to enter these commands as if they were on the live system. Therefore, I had it generate a script that can be sourced to set the environment variables and aliases necessary to allow the user to enter commands easily.
There are a few issues this created that I want to work around, and I am curious if there is a standard best practice when doing this.
I want the bash prompt to change (or at least be appended to) when the user sources the new variables, so it is clear they are running commands on the offline replica. I can do this by setting $PS1. However, I also want a 'deactivate' script to restore the previous user environment. How do I undo this change and restore the previous prompt?
When they source my script, env variables change that may have been previously set. I have the script store any previous variables as OLD_<env_variable_name> and create a second deactivate script that restores them as well as removes any aliases (and eventually resets the bash prompt). Is this the best way to do this, or is there a much simpler method I may be missing?
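The save-and-restore pattern described in the question can be sketched roughly like this (mirroring how virtualenv's activate script works; the `MYTOOL_*` variable, paths, and the `mgmt` alias are hypothetical stand-ins for whatever your generator emits):

```shell
#!/usr/bin/env bash
# activate.sh -- source this file; do not execute it.

# Save the old prompt and any variable we are about to overwrite,
# then apply the replica settings.
_OLD_PS1="$PS1"
_OLD_MYTOOL_TARGET="${MYTOOL_TARGET-}"

export MYTOOL_TARGET="/replica/files.mgmt"
PS1="(offline-replica) $PS1"
alias mgmt='mytool --socket /replica/files.sock'

# deactivate: undo everything this script did, then remove itself.
deactivate() {
    PS1="$_OLD_PS1"
    if [ -n "$_OLD_MYTOOL_TARGET" ]; then
        export MYTOOL_TARGET="$_OLD_MYTOOL_TARGET"
    else
        unset MYTOOL_TARGET
    fi
    unalias mgmt 2>/dev/null
    unset _OLD_PS1 _OLD_MYTOOL_TARGET
    unset -f deactivate
}
```

Defining `deactivate` as a shell function inside the same sourced file (rather than as a second script) keeps the saved values and the cleanup logic in one place, which is essentially how virtualenv avoids the `OLD_<name>` bookkeeping leaking into a separate file.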

Related

How can I automate a (usually) interactive build script with Github Actions?

I'm trying to add CI to a project that uses a set of build scripts written in bash. The scripts prompt for input a few times for configuration information (setting flags, setting parameters, etc.). Does Github Actions have its own commands for dealing with this, or is there a way to set up an expect script (or something similar)?
There is currently no feature that allows prompting for manual input during workflow runs. See this response on the community forums where a similar question was asked.
Here are some options you can explore:
Redesign the scripts to read from configuration files and check them into git to trigger the build.
Use the workflow_dispatch event to create a workflow that you can manually trigger from the Actions UI and supply input parameters. See this documentation for more detail.
Use slash-command-dispatch to trigger the build using a slash command with arguments for the build's input parameters.
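For the first option, the usual pattern is to let the build script read its answers from a checked-in config file when one exists, and only prompt interactively as a fallback. A minimal sketch, where the file name `build.conf` and the variables are hypothetical (the heredoc just stands in for a file you would commit to git):

```shell
#!/usr/bin/env bash
set -eu

# A checked-in config file stands in for the interactive answers.
# Example contents: one KEY=value per line.
cat > build.conf <<'EOF'
BUILD_FLAVOR=release
ENABLE_TESTS=yes
EOF

# The build script sources the file if present, and prompts only
# when run by a human without one.
if [ -f build.conf ]; then
    . ./build.conf
else
    read -r -p "Build flavor (debug/release): " BUILD_FLAVOR
    read -r -p "Enable tests (yes/no): " ENABLE_TESTS
fi

echo "flavor=$BUILD_FLAVOR tests=$ENABLE_TESTS"
```

Because the non-interactive path never touches stdin, the same script works unchanged both locally and inside a GitHub Actions runner.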

Can an environment variable be shared between 2 different shell types?

I came into an environment where, when users log into our system, they log in with the csh by default. We also have an automation login (let's call it "autologin") that also invokes the csh by default.
This login is used to execute (via its crontab) all of our scripts (50+) used to send and receive files with our vendors. The results of these individual file transmissions are used to feed a dashboard for each transmission.
The dashboard simply has a light for each file transmission (green light if the last file transmission was successful and red if it failed). This success/fail status is set (in a SQL Server database) from the scripts, using a tsql -H connection.
We are currently using SQL Server 2008, but are upgrading to 2016. So I need to change our 50+ scripts' tsql connection from sql_2008 to sql_2016. I had the idea to use an environment variable (let's say AUTOSQL) that the scripts could use.
I could then change all of the 50+ scripts to reference AUTOSQL, instead of sql_2008, and then set the environment variable to sql_2008/sql_2016/and whatever we upgrade to in the future. As I previously mentioned, all users log in with csh as the default shell. The problem I've encountered is all of the shell scripts are written in bash.
How can I set up an environment variable for the bash (our automation) scripts to use, so when we upgrade in the future, I simply have to change the value of one environment variable, instead of changes to 50+ scripts? Thank you
Environment variables are an operating-system feature that is "application agnostic". In Unix-like environments, any kind of program can pass environment variables to any other program that is its child.
The real issue here is that the fifty scripts are run by cron from a crontab file. This means that they will not inherit the AUTOSQL variable, even if it is exported by the csh login script.
See:
Where can I set environment variables that crontab will use?
Also, on the ServerFault StackExchange:
https://serverfault.com/questions/337631/crontab-execution-doesnt-have-the-same-environment-variables-as-executing-user
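As both links explain, the simplest fix is often to define the variable at the top of the crontab itself, since cron does not read login-shell startup files. A sketch (the path and schedule are illustrative; most cron implementations, e.g. Vixie cron, accept plain NAME=value lines, with no $-expansion in them):

```shell
# crontab for the "autologin" user (crontab -e)
AUTOSQL=sql_2016

# Every job below inherits AUTOSQL from cron's environment.
0 2 * * * /home/autologin/scripts/send_vendor_files.sh
```

When upgrade time comes, the one `AUTOSQL=` line is the only thing that changes.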
It's great to see someone simplifying and consolidating their scripts.
If all these scripts are executed in cron (by root), /root is the first place I'd look.
Step 1:
Choice A.) Set and export AUTOSQL in root's .profile
Choice B.) Set and export AUTOSQL in root's .bashrc
Choice C.) Set and export AUTOSQL in root's (whatever you wanna call the file)
export AUTOSQL='sql_{year}'
Step 2:
Make sure you source this file at the top of your scripts. From now on, you can add environment switches at will to this file, since all of your scripts will source it.
. /root/.{bashrc || profile || whatever you decide to name the file}
Hope this helped! Again, the decision is yours.
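Concretely, the pattern described above might look like the following; the file name `autosql.env` is just one choice, and the heredoc stands in for the file you would create under /root (the tsql invocation is echoed rather than run, since this is only a sketch):

```shell
#!/usr/bin/env bash
# autosql.env -- the single place to switch SQL Server versions.
cat > autosql.env <<'EOF'
export AUTOSQL='sql_2016'
EOF

# Top of each of the 50+ transfer scripts:
. ./autosql.env

# Each script then references the variable instead of a
# hard-coded server name:
echo "would run: tsql -H $AUTOSQL ..."
```

After the change, upgrading from sql_2016 to any future server means editing one line in one file, and all fifty-plus scripts pick it up on their next run.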

How to chain alias commands in cmder

I love using aliases on my Ubuntu server for repeated commands, as they're a huge timesaver and they're absolutely irreplaceable for me now.
I've been using cmder a lot recently on Windows, as it is the best console replacement for Windows that I know of. It is a wonderful piece of software and I have almost all the basic bash commands, including aliases.
However, I cannot find a way to chain multiple alias commands. I've tried delving into doskey and its macros (see the Microsoft DOSKEY documentation) without any luck.
So, basically I want to create multiple aliases. For e.g.
alias loginuser1='ssh -i ~/user1keyfile user1#$s'
alias mynewcloudserver='901.801.701.601'
and want to be able to login by typing:
loginuser1 mynewcloudserver
loginuser5 mytestingcloudserver
I have currently tried this:
loginuser1 mynewcloudserver
which produces this error:
ssh: Could not resolve hostname mynewcloudserver: no address associated with name
I get that this is because it is probably looking in my hosts file for mynewcloudserver and is unable to find an entry. I am able to login by doing this instead:
loginuser1 901.801.701.601
which brings us to my problem: I am unable to call one alias from another alias.
I know the above might not be the best way to create those aliases, but I just want to understand the logic and how to chain aliases together in cmder, which will open up a host of possibilities (pun intended).
If anyone can help me out, that would be great.
The only option I've found is to create a myscript.sh file with the commands, and create an alias to call the file.
It may be helpful to include wait between commands if they need to finish before the next one runs.
The first time you run it, it may ask you which program to use. Choose Git for Windows.
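If you take the script route under Git for Windows' bash, a shell function plus plain variables sidesteps the alias-chaining problem entirely, because aliases are never expanded in argument position but functions do receive arguments. A sketch, reusing the names and key path from the question (the second host is hypothetical, and the `ssh` command is echoed rather than executed so the sketch is safe to run):

```shell
#!/usr/bin/env bash
# Model the "server aliases" as variables holding addresses.
mynewcloudserver='901.801.701.601'
mytestingcloudserver='901.801.701.602'   # hypothetical second host

# A function can look up the variable whose *name* it was given,
# using bash indirect expansion: ${!1}.
loginuser1() {
    local host="${!1}"
    echo ssh -i ~/user1keyfile "user1@$host"   # drop the echo to actually connect
}

loginuser1 mynewcloudserver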

Changing environment in Hudson, that stays for the whole build

How can I execute a batch file, or just a couple of commands, in a Hudson job (running on Windows XP, as a non-service, though that may change) so that the resulting environment stays for the whole build?
I need to do this because I have to change the current path with 'cd' (we are using relative paths in our project) and 'set' some environment variables for msbuild.
Thank you in advance.
Not sure why you need to get out of the service realm. My understanding so far is that Hudson starts a new environment for every job, so that the jobs don't interfere with each other. So as long as you don't use commands that affect other environments (e.g. subst), you will be fine adding an "Execute Windows Batch Command" build step.
If your service runs with the wrong permissions, you have two options: change the permissions of the service (run it under a different user than the local system user), or call the runas command. If for whatever reason you still need to contain changes to certain parts of your job, you can always call cmd to create a new environment.

Is there a difference between setting JAVA_HOME through cmd line or GUI

This is a real noob question.
When I set up JAVA_HOME using the command line interface I used set JAVA_HOME = C:\Program Files\Java\jdk1.6.0_13
However when I open the JAVA_HOME variable from System>Advanced>Environment Variables the change is not visible. Are these two different settings?
I have this question every time I set up a new Jdk and have never fully understood why the two seem to be different.
The variable you set on the command line is for that command shell and any other processes it starts. When you set it from System > Advanced > Environment Variables, it affects any process you start after setting it, including new command shells. Depending on where you set it, it will be available to the same user or to any other user who logs in as well.
The JAVA_HOME you set by command line is set only for that session of the shell.
Changes made to a parent process only propagate to newly-created children; try opening a new command prompt and inspecting the value there.
What the others said... :-)
I will add that even after setting the variable in the dialog, processes already running are (in general) not aware of the change: e.g. a command prompt window will still display the old value. You have to start another window to see the change.
In some (rare) cases, you might even need to log out and log back in for the change to take effect (I saw that again recently).
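The parent/child behavior described in these answers is easy to demonstrate in any shell; the same logic applies to cmd.exe's `set` versus the Windows GUI dialog. A bash sketch (the variable name and paths are made up for the demo):

```shell
#!/usr/bin/env bash
# A variable exported in this shell is visible to its children...
export JAVA_HOME_DEMO='/opt/jdk1.6.0_13'
child_sees="$(bash -c 'echo "$JAVA_HOME_DEMO"')"
echo "child sees: $child_sees"

# ...but a change made inside a child never flows back to the parent.
bash -c 'export JAVA_HOME_DEMO=/opt/other-jdk'
echo "parent still has: $JAVA_HOME_DEMO"
```

Each process gets a private copy of the environment at startup, which is exactly why a `set` in one command prompt never shows up in the System dialog, and vice versa for windows that were already open.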
