Chaining commands together in .bashrc - bash

I'm trying to chain two commands together in a function or alias.
What I want to do is ssh into a proxy box, and then into another box from there. So something like:
ssh -J mylogin@host mylogin@host2
So far I've tried:
function doot {ssh -J mylogin@host && mylogin@"$1"}
and:
function doot {ssh -J mylogin@host; mylogin@"$1"}
and:
alias doot="ssh -J mylogin@host; mylogin@"$1""
It either doesn't recognize the function, or the alias just gives me an error. I feel that it's having an issue with the "$1", but I'm not sure how to chain these two commands together.
I want to just type doot [nameofhost] and have it execute the command
ssh -J mylogin@host mylogin@host2

Neither your attempted functions nor your alias runs ssh -J mylogin@host mylogin@host2. Why?
Both && and ';' separate commands. In your case that splits ssh -J mylogin@host and mylogin@"$1" into two separate commands. You need a single command that specifies mylogin@host as the jump host and mylogin@"$1" as the final destination. Simply do:
doot() {
    ssh -J mylogin@host mylogin@"$1"
}
(note: the quotes around "$1" are not wrong, just not strictly needed, since hostnames don't contain whitespace. Also note that "$1" within the function refers to the function's argument, not the command-line argument of a script; from within a script you would call it as doot "$1". There is no ';' or && involved.)
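For example, calling it from the command line (host2 here is just a placeholder destination):
$ doot host2        ## runs: ssh -J mylogin@host mylogin@host2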
There is a problem with alias as an alias does not contain arguments. From man bash:
There is no mechanism for using arguments in the replacement text.
If arguments are needed, a shell function should be used (see FUNCTIONS below).
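To see why, here is a sketch of what the alias route would expand to (names are placeholders): the argument you type is appended after the replacement text as a separate word, never inserted inside it.
alias doot='ssh -J mylogin@host mylogin@'
doot host2          ## expands to: ssh -J mylogin@host mylogin@ host2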
Also, you want to validate that $1 has been given. You can do that with:
[ -z "$1" ] && {    ## validate at least one argument given
    printf "error: destination hostname required as argument.\n" >&2
    return 1
}
(note: if you are calling the function that includes this test from the command line in the parent shell, your function should return instead of exit. In that case exit would exit the parent shell. If you use the function within a separate script you call from the parent, then you want exit to close the subshell)
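Putting the validation and the jump-host call together, a complete sketch for your .bashrc (mylogin and host are the placeholders from your question):
doot() {
    [ -z "$1" ] && {    ## validate at least one argument given
        printf "error: destination hostname required as argument.\n" >&2
        return 1
    }
    ssh -J mylogin@host mylogin@"$1"
}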

You did not specify whether the proxy requires any additional parameters. Assuming it is just a jump box that allows ssh:
function foo {
    ssh mylogin@host ssh mylogin@host2
}
If your current user is ‘mylogin’, you can abbreviate to
ssh host ssh host2
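A parameterized sketch along the same lines (doot is just an example name; -t requests a tty so the second hop stays interactive):
function doot {
    ssh -t mylogin@host ssh mylogin@"$1"
}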


Can I add a parameter in-between my command? [duplicate]

This question already has answers here:
Make a Bash alias that takes a parameter?
(24 answers)
Closed 3 years ago.
Coding is one of my weaker areas and this is my first question on Stack Overflow.
What I want to do is add a parameter in-between my command, and I'm thinking it can be done with an alias or function.
The command I am using is telnet and it is used to log into our switches.
The full command:
$ telnet switchname.compname.com
What I want to type:
$ enter 'switchname'
In turn, this makes the telnet command a simple enter, so I don't have to type .compname.com every time.
A simple function does nicely:
enter() { telnet $1.compname.com; }
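For example (sw01 is just a placeholder switch name):
$ enter sw01        ## runs: telnet sw01.compname.com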
Create a function to open the telnet session to the switchname given by the first argument (or to your default switchname if no argument is provided), e.g. in .bashrc you could do:
mytelnet() {
    local swname=${1:-defaultname}    ## use local vars within function
    telnet $swname.compname.com       ## connect to your switch
}
Then create the alias you want for enter, e.g.
alias enter='mytelnet'
Now at the command line you can type:
$ enter ## to go to defaultname.compname.com
or
$ enter switchname ## to go to switchname.compname.com
For testing you can just enter the function and alias on the command line, e.g.
$ mytelnet() { local swname=${1:-defaultname}; telnet $swname.compname.com; }
$ alias enter='mytelnet'
Then telnet away...
(note: you can simply name your function enter() and do away with the alias. I just find it convenient to define my functions at the top of my .bashrc and then create aliases, as needed, in the various sections below, but using an alias is by no means a requirement)
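In that case a minimal sketch, with defaultname still a placeholder, would be:
enter() {
    local swname=${1:-defaultname}
    telnet $swname.compname.com
}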
Try to use an alias in your rcfile, like ~/.bashrc, for example.
alias mytelnet='telnet the.desired.site'
Then source your rcfile like
source ~/.bashrc
or the equivalent
. ~/.bashrc
and type mytelnet to execute the command.
Or just use a bash variable, like
VAR="the.desired.site"
and execute telnet this way:
telnet $VAR
Also you can add a function to your rc file, like David C. Rankin mentioned.
function mytelnet () {
    telnet <<< "$@"
}
export -f mytelnet
<<< "$@" feeds the function's arguments to telnet on standard input, the same arguments you would pass to any other command on the command line (available in bash as $1, $2 and so on).
export -f marks the function mytelnet to be passed to child processes in your environment.
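For example, a quick sketch showing the exported function being visible in a child bash (switch01 is just a placeholder argument):
export -f mytelnet
bash -c 'mytelnet switch01'    ## the child bash inherits mytelnet through the environment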

Send rm-command with $variable filename via ssh [duplicate]

This question already has answers here:
is it possible to use variables in remote ssh command?
(2 answers)
Closed 4 years ago.
In a bash script I try to do:
ssh -n $username@server2 "rm ${delete_file}"
but always get the error:
rm: missing operand
When I run
$ echo $delete_file
/var/www/site/myfile.txt
I get the correct path.
What am I doing wrong?
Could it be that in your case, $delete_file is set on the remote host and not on your current machine?
If you want $delete_file to be expanded on the remote side (i.e., after ssh'ing into server2), you have to use single quotes:
ssh -n $username@server2 'rm ${delete_file}'
Other than that, do you set the value of delete_file in the same script (before ssh'ing), or before invoking your script? If the latter is the case, it can't work: variables are not propagated to scripts called by the current script/session unless they are exported.
You could do the following about it:
delete_file=<your-value> ./ssh-script
or:
delete_file=<your-value>
export delete_file
./ssh-script
As it turns out, this last option was the problem. Let me elaborate on best practices:
Better than setting environment variables would be the usage of positional parameters.
#!/bin/bash
# $1: file to delete
delete_file=${1:?Missing parameter: which file for deletion?}
ssh -n $username@server2 "rm ${delete_file}"
Usage of the script is now as simple as:
./ssh-script <your-file-for-deletion>
This way, you don't have to remember which variable is exactly expected by the script when calling it - simply call the script with a positional parameter.
As a bonus, the example uses parameter expansion to check for not-set or empty parameters:
delete_file=${1:?Missing parameter: which file for deletion?}
Whenever $1 happens to be unset or empty, the script exits immediately with exit code 1 and prints the given message to stderr.
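If the file name could contain spaces or other characters special to the remote shell, one hedged variation is to quote it with printf %q before handing it to ssh (a sketch; username and server2 are the names from the question):
#!/bin/bash
# $1: file to delete, quoted for the remote shell
delete_file=${1:?Missing parameter: which file for deletion?}
ssh -n "$username@server2" "rm $(printf '%q' "$delete_file")"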

Passing variable from one script to another in unix aix

I am passing a variable from one shell script to another, which is being executed on a remote server.
Script 1
echo "Identification No."
read id
export id
ssh atul@10.95.276.286 'bash -s' < data_file.sh
Script 2
echo "ID is ---- "$id
cd /abc/xyz/data/
cat data_abcxyz.txt|grep '|$id|'|wc -l
This way I am not able to get any output, and the id is also null in the second script.
I have also tried
ssh atul@10.95.276.286 'bash -s' < data_file.sh "$id"
But got no output.
Any help on this is greatly appreciated. I am using unix AIX.
export on one host is absolutely not going to affect an entirely different host... it doesn't even affect another shell running on the current host.
Your second attempt is better, and might even work if your script were checking for positional arguments, but it isn't. (It might not even work in that case, as I'm not at all sure that the command line argument would make it through to the script via ssh and bash -s.)
You might be able to do something more like:
ssh atul@10.95.276.286 "bash -s $id" < data_file.sh
to pass the argument to the remote bash directly, but your script would still need to use positional arguments instead of expecting named variables to already exist.
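If you go that route, data_file.sh would need to read the value as a positional argument instead of as a pre-set $id, for example (a sketch of the adjusted script):
# data_file.sh, run remotely via: ssh atul@10.95.276.286 "bash -s $id" < data_file.sh
id=$1
echo "ID is ---- $id"
cd /abc/xyz/data/
grep "|$id|" data_abcxyz.txt | wc -l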
Exporting won't have any effect on the environment of remote scripts.
You can set up a remote script's environment by specifying the env variables on the command line before the actual command, which, by the way, works for local commands too.
ssh atul@10.95.276.286 "id=$id bash -s" < data_file.sh
If you pass "$id" this way:
ssh atul@10.95.276.286 'bash -s' < data_file.sh "$id"
It'll be your script's first parameter, AKA "$1" and you'll be able to access it from your script that way.
Note that '|$id|' in your "Script 2" will be interpreted as a literal string, since you're using single quotes.
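With double quotes the variable expands as intended, e.g.:
cat data_abcxyz.txt | grep "|$id|" | wc -l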

What is the bash variable $COMP_LINE and what does it do?

What is the $COMP_LINE variable in bash scripting? The Bash Reference Manual has the following to say.
$COMP_LINE
The current command line. This variable is available only in shell functions and external commands invoked by the programmable completion facilities (see Programmable Completion).
I don't understand what 'the current command line' means.
I am trying to pick apart this script to see how it manages to intercept bash commands:
hook() {
    echo "$@"
}

invoke_hook() {
    [ -n "$COMP_LINE" ] && return
    [ "$BASH_COMMAND" = "$PROMPT_COMMAND" ] && return
    local command=`history 1 | sed -e "s/^[ ]*[0-9]*[ ]*//g"`;
    hook "$command"
}
trap 'invoke_hook' DEBUG
I am running into trouble figuring out what the following line is supposed to do.
[ -n "$COMP_LINE" ] && return
I assume it is some sort of check or test before you run the rest of the script, since [ ] is an alias for the bash test command, but since I can't read it I can't figure out what it's supposed to be testing.
In Bash, if you have a file myfile.txt, you can edit it with nano myfiTab. This completes the filename automatically to save you typing, turning the command into nano myfile.txt. This is known as filename completion.
However, not all commands accept filenames. You may want to be able to do ssh myhoTab and have it complete to ssh myhostname.example.com.
Since bash can't possibly be expected to maintain this logic for all known and unknown commands across all systems, it has programmable completion.
With programmable completion, you can define a shell function to call that will get all hostnames from .ssh/known_hosts and make them available as completion entries.
When this function is invoked, it can examine the variable $COMP_LINE to see the command line it should give suggestions for. If you have set up complete -F myfunction ssh and type ssh myhoTab, then myfunction will run and $COMP_LINE will be set to ssh myho.
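As a rough sketch (the function name and the known_hosts parsing are made up for illustration; hashed known_hosts entries would yield nothing):
_my_ssh_hosts() {
    ## $COMP_LINE holds the whole line typed so far, e.g. "ssh myho"
    local cur=${COMP_WORDS[COMP_CWORD]}          ## the word being completed
    local hosts
    hosts=$(awk '{sub(/,.*/, "", $1); print $1}' ~/.ssh/known_hosts 2>/dev/null)
    COMPREPLY=( $(compgen -W "$hosts" -- "$cur") )
}
complete -F _my_ssh_hosts ssh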
This functionality is used by your snippet to make the interceptor ignore commands run as a result of pressing Tab. Here it is with comments:
# This is a debug hook which will run before every single command executed
# by the shell. This includes the user's command, but also prompt commands,
# completion handlers, signal handlers and others.
invoke_hook() {
    # If this command is run because of tab completion, ignore it
    [ -n "$COMP_LINE" ] && return
    # If the command is run to set up the prompt or window title, ignore it
    [ "$BASH_COMMAND" = "$PROMPT_COMMAND" ] && return
    # Get the last command from the shell history
    local command=`history 1 | sed -e "s/^[ ]*[0-9]*[ ]*//g"`;
    # Run the hook with that command
    hook "$command"
}

Running bash function in command of su

In my bash script, I execute some commands as another user. I want to call a bash function using su.
my_function()
{
    do_something
}
su username -c "my_function"
The above script doesn't work. Of course, my_function is not defined inside su. One idea I have is to put the function into a separate file. Do you have a better idea that avoids making another file?
You can export the function to make it available to the subshell:
export -f my_function
su username -c "my_function"
You could enable 'sudo' in your system, and use that instead.
You must have the function in the same scope where you use it. So either place the function inside the quotes, or put the function into a separate script, which you then run with su -c.
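The first option can be sketched like this, assuming the shell started by su understands function definitions (do_something is the placeholder from the question):
su username -c '
my_function() {
    do_something
}
my_function
'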
Another way could be to use a case statement and pass a parameter to the executed script.
Example could be:
First make a file called "script.sh".
Then insert this code in it:
#!/bin/sh
my_function() {
    echo "this is my function."
}

my_second_function() {
    echo "this is my second function."
}

case "$1" in
    'do_my_function')
        my_function
        ;;
    'do_my_second_function')
        my_second_function
        ;;
    *) #default execute
        my_function
        ;;
esac
After adding the above code run these commands to see it in action:
root@shell:/# chmod +x script.sh    #This will make the file executable
root@shell:/# ./script.sh           #This will run the script without any parameters, triggering the default action
this is my function.
root@shell:/# ./script.sh do_my_second_function    #Executing the script with a parameter
this is my second function.
root@shell:/#
To make this work as you required you'll just need to run
su username -c '/path/to/script.sh do_my_second_function'
and everything should be working fine.
Hope this helps :)
