ksh remote function calling another remote function - shell

I am having a problem running the simple code below:
#!/bin/ksh
set -x
function test_me
{
set -x
date
}
function check_me
{
set -x
ssh ${HST2} "$(typeset -f test_me); test_me"
}
ssh ${HST1} "$(typeset -f); check_me"
Fails with syntax error at line 5: `;;' unexpected

Though I can't explain why this gives you the particular error message you see, I can at least see that your code can't work:
First you run in one process (on HST1) the commands
function check_me
{
set -x
ssh ${HST2} "$(typeset -f test_me); test_me"
};check_me
This defines only the function check_me on HST1, nothing else. Then you run this check_me.
Inside this function, you refer to two things: a variable HST2 and a function test_me. There is nothing in your code which would define these entities on the remote side. They are defined only in the initial process, where you run the ssh, but not in the remote process on the host $HST1.
Moreover, you don't even run a ksh on $HST1. At least I don't see any ksh invocation in your code. Actually, you just pass a function .... to $HST1, and a function is not an executable.
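The fix is to ship both the variables and every needed function definition inside the command string. The mechanism can be demonstrated locally, with bash -c standing in for the remote shell (greet and WHO are made-up names for illustration):

```shell
#!/bin/bash
# A child shell only knows what we put into its command string -- the same
# situation as a remote shell started by ssh. So ship both the variable
# and the function definition explicitly:
greet() { echo "hello from $WHO"; }
bash -c "WHO=remote; $(typeset -f greet); greet"   # prints: hello from remote
```

Applied to the original script, that means the first ssh command string would need to contain `HST2=$HST2` and `$(typeset -f test_me check_me)` before calling check_me.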

#!/bin/ksh
set -x
function second_me
{
set -x
date
}
function first_me
{
set -x
ssh hostname2 "$(typeset -f second_me); second_me"
}
#from local_machine
ssh hostname1 "$(typeset -f); first_me"
The above fails with syntax error at line X: `;;' unexpected
But if I add an extra dummy function third_me, things work fine. Looks like a ksh bug?
#!/bin/ksh
set -x
function third_me
{
set -x
date
}
function second_me
{
set -x
date
}
function first_me
{
set -x
ssh hostname2 "$(typeset -f second_me); second_me"
}
#from local_machine
ssh hostname1 "$(typeset -f); first_me"
This code works fine.

Another workaround, using a nested function:
#!/bin/ksh
set -x
function first_me
{
set -x
function second_me
{
set -x
date
}
ssh hostname2 "$(typeset -f second_me); second_me"
}
#from local_machine
ssh hostname1 "$(typeset -f); first_me"

How to auto-complete a multi-level command aliased to a single command?

Say I have two bash functions:
dock() { sudo docker "$@" ;}
and
dock-ip() { sudo docker inspect --format '{{ .NetworkSettings.IPAddress }}' "$@" ;}
How to get bash auto-completion working with the second function?
With the first one, it is as easy as adding:
_completion_loader docker; complete -F _docker dock
This will not work for the second one. The autocomplete source for Docker is in /usr/share/bash-completion/completions/docker on Debian Stretch. I have more functions like dock-run, dock-exec, etc. so I don't want to write a custom completion function for each of them.
Also, complete -F _docker_container_inspect dock-ip only partially works; pressing Tab only lists containers and doesn't complete partial strings.
Research:
How do I autocomplete nested, multi-level subcommands? <-- needs custom functions
https://superuser.com/questions/436314/how-can-i-get-bash-to-perform-tab-completion-for-my-aliases <-- automated for top commands only
With a combined hour of bash completion experience, I took apart the docker completion script (/usr/share/bash-completion/completions/docker) and the bash_completion.sh script to come up with a wrapper function:
# Usage:
# docker_alias_completion_wrapper <completion function> <alias/function name>
#
# Example:
# dock-ip() { docker inspect --format '{{ .NetworkSettings.IPAddress }}' "$@" ;}
# docker_alias_completion_wrapper __docker_complete_containers_running dock-ip
function docker_alias_completion_wrapper {
local completion_function="$1";
local alias_name="$2";
local func=$(cat <<EOT
# Generate a new completion function name
function _$alias_name() {
# Start off like _docker()
local previous_extglob_setting=\$(shopt -p extglob);
shopt -s extglob;
# Populate \$cur, \$prev, \$words, \$cword
_get_comp_words_by_ref -n : cur prev words cword;
# Declare and execute
declare -F $completion_function >/dev/null && $completion_function;
eval "\$previous_extglob_setting";
return 0;
};
EOT
);
eval "$func";
# Register the alias completion function
complete -F _$alias_name $alias_name
}
export -f docker_alias_completion_wrapper
I then created my alias/functions like this:
# Get container IP
dock-ip() { docker inspect --format '{{ .NetworkSettings.IPAddress }}' "$@" ;}
docker_alias_completion_wrapper __docker_complete_containers_running dock-ip
# Execute interactive container
dock-exec() { docker exec -i -t --privileged "$@" ;}
docker_alias_completion_wrapper __docker_complete_containers_all dock-exec
...
Be sure to call _completion_loader docker; at the top of your profile aliases script to load the main Docker completion scripts. I invite more skilled bash programmers to improve this answer, please.
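The core trick in the wrapper, building a function definition as text and eval-ing it into the current shell, can be sketched in isolation (make_stub and the generated function are made-up names, standing in for the Docker-specific pieces):

```shell
#!/bin/bash
# Build a completion-style function whose name is derived from an argument,
# then eval the generated text to define it in the current shell.
make_stub() {
    local name="$1"
    local func=$(cat <<EOT
function _$name() { echo "completing for $name"; }
EOT
)
    eval "$func"
}
make_stub dock_ip
_dock_ip   # prints: completing for dock_ip
```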

Jenkins pipeline undefined variable

I'm trying to build a Jenkins Pipeline for which a parameter is
optional:
parameters {
string(
name:'foo',
defaultValue:'',
description:'foo is foo'
)
}
My purpose is calling a shell script and providing foo as argument:
stages {
stage('something') {
sh "some-script.sh '${params.foo}'"
}
}
The shell script will do the Right Thing™ if the provided value is the empty
string.
Unfortunately I can't just get an empty string. If the user does not provide
a value for foo, Jenkins will set it to null, and I will get null
(as a string) inside my command.
I found this related question but the only answer is not really helpful.
Any suggestion?
OP here: I realized a wrapper script can be helpful. I ironically called it junkins-cmd and I call it like this:
stages {
stage('something') {
sh "junkins-cmd some-script.sh '${params.foo}'"
}
}
Code:
#!/bin/bash
helpme() {
cat <<EOF
Usage: $0 <command> [parameters to command]
This command is a wrapper for jenkins pipeline. It tries to overcome jenkins
idiotic behaviour when calling programs without polluting the remaining part
of the toolkit.
The given command is executed with the fixed version of the given
parameters. Current fixes:
- 'null' is replaced with ''
EOF
} >&2
trap helpme EXIT
command="${1:?Missing command}"; shift
trap - EXIT
typeset -a params
for p in "$@"; do
# Jenkins pipeline uses 'null' when the parameter is undefined.
[[ "$p" = 'null' ]] && p=''
params+=("$p")
done
exec "$command" "${params[@]}"
Beware: params+=("$p") seems not to be portable among shells; hence this ugly script runs under #!/bin/bash.
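The null-to-empty fix at the heart of the script can be checked in isolation, with a few hard-coded stand-in arguments:

```shell
#!/bin/bash
# Jenkins pipeline passes the literal string 'null' for an unset parameter;
# the loop rewrites exactly that value to an empty string and keeps the rest.
set -- some-arg null another
typeset -a params
for p in "$@"; do
    if [[ "$p" == 'null' ]]; then p=''; fi
    params+=("$p")
done
printf '<%s>\n' "${params[@]}"   # <some-arg>, <>, <another>
```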

How to pass dynamic parameters in curl request

I am passing dynamic values to the testing method and executing the curl request. There is an issue with $PARAMETERS.
When I execute the following method, I get the error below.
Error:-
curl: option -F: requires parameter
curl: try 'curl --help' or 'curl --manual' for more information
Function:-
testing() {
local URL=$1
local HTTP_TYPE=$2
local PARAMETERS=$3
# store the whole response with the status at the end
HTTP_RESPONSE=$(curl -w "HTTPSTATUS:%{http_code}" -X $HTTP_TYPE $URL $PARAMETERS)
echo $HTTP_RESPONSE
}
URL='https://google.com'
HTTP_TYPE='POST'
PARAMETERS="-F 'name=test' -F 'query=testing'"
result=$(testing $URL $HTTP_TYPE $PARAMETERS)
FYI, the above is a sample method and I am using a shell script. How can I solve this?
Since you are passing multiple command arguments in your 3rd argument, the shell will do word splitting because you are using an unquoted variable inside your function.
You should use a shell array to store the PARAMETERS argument.
Have your function as:
testing() {
local url="$1"
local httpType="$2"
shift 2 # shift twice to discard first 2 arguments
local params=("$@") # store the rest of them in an array
response=$(curl -s -w "HTTPSTATUS:%{http_code}" -X "$httpType" "$url" "${params[@]}")
echo "$response"
}
and call it as:
url='https://google.com'
httpType='POST'
params=(-F 'name=test' -F 'query=testing')
result=$(testing "$url" "$httpType" "${params[@]}")
Additionally you should avoid using all-caps variable names in your shell script to avoid clashes with env variables.
In case you're not using bash, you can also use:
params="-F 'name=test' -F 'query=testing'"
result=$(testing "$url" "$httpType" "$params")
Code Demo
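The difference between the string form and the array form can be seen with a tiny argument printer (show_args is a made-up helper for illustration):

```shell
#!/bin/bash
# Each array element stays one clean argument; the unquoted string is
# word-split and its embedded single quotes survive as literal characters.
show_args() { printf '[%s]' "$@"; echo; }
params_str="-F 'name=test' -F 'query=testing'"
params_arr=(-F 'name=test' -F 'query=testing')
show_args $params_str        # [-F]['name=test'][-F]['query=testing']
show_args "${params_arr[@]}" # [-F][name=test][-F][query=testing]
```

The literal quotes in the string form are exactly what confuses curl into reporting "option -F: requires parameter".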

How to set flag in bash function for setting aliases?

This is the command I want to execute:
watchify files/[filename].js -t hbsfy -o out/[filename].js
I tried this:
javascriptCompile() {
foo="src/js/files/$1.js -t hbsfy -o src/js/out/$1.js"
watchify "$foo"
}
alias jc=javascriptCompile
jc main
I get this error:
You MUST specify an outfile with -o.
It needs to be an array:
javascriptCompile () {
foo=(src/js/files/"$1".js -t hbsfy -o src/js/out/"$1".js)
watchify "${foo[@]}"
}
alias jc=javascriptCompile
jc main
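Why the quoted string fails: watchify receives the whole thing as a single argument, so it never sees -o. A quick check, with no need for watchify itself (count_args is a made-up helper):

```shell
#!/bin/bash
# Count how many arguments each form actually delivers to the command.
count_args() { echo "$#"; }
foo_str="src/js/files/main.js -t hbsfy -o src/js/out/main.js"
foo_arr=(src/js/files/main.js -t hbsfy -o src/js/out/main.js)
count_args "$foo_str"       # 1 -- everything is one argument
count_args "${foo_arr[@]}"  # 5 -- each element is its own argument
```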

Expect fails but I don't see why

I have a bash script that gets info from Heroku so that I can pull a copy of my database. The script works fine in cygwin, but when run from cron it halts at Heroku's authentication prompt from the Heroku Toolbelt.
Here is my crontab:
SHELL=/usr/bin/bash
5 8-18 * * 1-5 /cygdrive/c/Users/sam/work/push_db.sh >>/cygdrive/c/Users/sam/work/output.txt
I have read the Googles and the man page within cygwin to come up with this addition:
#!/usr/bin/bash
. /home/sam.walton/.profile
echo $SHELL
curl -H "Accept: application/vnd.heroku+json; version=3" -n https://api.heroku.com/
#. $HOME/.bash_profile
echo `heroku.bat pgbackups:capture --expire`
#spawn heroku.bat pgbackups:capture --expire
expect {
"Email:" { send -- "$($HEROKU_LOGIN)\r"}
"Password (typing will be hidden):" { send -- "$HEROKU_PW\r" }
timeout { echo "timed out during login"; exit 1 }
}
sleep 2
echo "first"
curl -o latest.dump -L "$(heroku.bat pgbackups:url | dos2unix)"
Here's the output from the output.txt
/usr/bin/bash
{
"links":[
{
"rel":"schema",
"href":"https://api.heroku.com/schema"
}
]
}
Enter your Heroku credentials. Email: Password (typing will be hidden): Authentication failed. Enter your Heroku credentials. Email: Password (typing will be hidden): Authentication failed. Enter your Heroku credentials. Email: Password (typing will be hidden): Authentication failed.
As you can see, it appears the prompt never receives the result of the send command; it's still waiting. I've done so many experiments with the credentials and the expect statements, and all stop here. I've seen a few examples and attempted to try them, but I'm getting fuzzy-eyed, which is why I'm posting here. What am I not understanding?
Thanks to comments, I'm reminded to explicitly place my env variables in .bashrc:
[[ -s $USERPROFILE/.pik/.pikrc ]] && source "$USERPROFILE/.pik/.pikrc"
export HEROKU_LOGIN=myEmailHere
export HEROKU_PW=myPWhere
My revised script per #Dinesh's excellent example is below:
. /home/sam.walton/.bashrc
echo $SHELL
echo $HEROKU_LOGIN
curl -H "Accept: application/vnd.heroku+json; version=3" -n https://api.heroku.com/
expect -d -c "
spawn heroku.bat pgbackups:capture --expire --app gw-inspector
expect {
"Email:" { send -- "myEmailHere\r"; exp_continue}
"Password (typing will be hidden):" { send -- "myPWhere\r" }
timeout { puts "timed out during login"; exit 1 }
}
"
sleep 2
echo "first"
This should work, but the echo of the variable yields nothing, a clue that the variable is not being set, so I am testing with the variables hardcoded directly to rule that out. As you can see from my output, not only does the echo yield nothing, there is also no sign of any diagnostics being produced, which makes me wonder whether expect is even running the script, or the spawn command at all. To restate: the heroku.bat command works outside the expect block, but the results are as above. The result of the command directly above is:
/usr/bin/bash
{
"links":[
{
"rel":"schema",
"href":"https://api.heroku.com/schema"
}
]
}
What am I doing wrong that will show me diagnostic notes?
If you are going to use the expect code inside your bash script, instead of calling it as a separate script, then you should use the -c flag.
From your code, I assume that you have the environment variables HEROKU_LOGIN and HEROKU_PW declared in your .bashrc file.
#!/usr/bin/bash
#Your code here
expect -c "
spawn <your-executable-process-here>
expect {
# HEROKU_LOGIN & HEROKU_PW will be replaced with variable values.
"Email:" { send -- "$HEROKU_LOGIN\r";exp_continue}
"Password (typing will be hidden):" { send "$HEROKU_PW\r" }
timeout { puts "timed out during login"; exit 1 }
}
"
#Your further bash code here
You should not use the echo command inside expect code; use puts instead. Spawning the process inside the expect code is more robust than spawning it outside.
Notice the use of double quotes with the expect -c flag. If you use single quotes, the bash script won't do any form of substitution. So, if you need bash variable substitution, you should use double quotes for the expect code passed with the -c flag.
To learn more about the usage of the -c flag, have a look here.
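The quoting difference is easy to demonstrate without expect itself, by printing what the embedded program would actually receive (HEROKU_LOGIN is set to a made-up value here):

```shell
#!/bin/bash
# In double quotes the shell substitutes the variable before the embedded
# expect program ever sees the text; in single quotes it stays literal.
HEROKU_LOGIN="me@example.com"
printf '%s\n' "send -- \"$HEROKU_LOGIN\r\""   # send -- "me@example.com\r"
printf '%s\n' 'send -- "$HEROKU_LOGIN\r"'     # send -- "$HEROKU_LOGIN\r"
```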
If you still have any issues, you can debug by appending -d in the following way:
expect -d -c "
your code here
"