How to get a bash variable from inside postgres? - bash

I'm kind of new to bash scripting and PostgreSQL.
I saw in another question a way to run a bash script as the postgres user here.
I tried making a bash function as follows:
postgres_create_db(){
    sudo su postgres <<- EOF
        if psql -lqt | cut -d \| -f 1 | grep -qw nokia_aaa_poc_db; then
            psql -c '\dt'
        else
            psql -c 'CREATE DATABASE nokia_AAA_poc_db;'
        fi
EOF
    exit
}
where this function will be called further on in the code. But I wonder if I can add a return to the function that actually returns a variable that was first declared inside the postgres shell (in between the EOFs), like below:
postgres_create_db(){
    sudo su postgres <<- EOF
        if psql -lqt | cut -d \| -f 1 | grep -qw nokia_aaa_poc_db; then
            psql -c '\dt'
            exists=1 # where that's a variable that I want to access outside the postgres shell
        else
            psql -c 'CREATE DATABASE nokia_AAA_poc_db;'
        fi
EOF
    exit
    return exists
}
but ShellCheck reports an error:
return exists
^-- SC2152: Can only return 0-255. Other data should be written to stdout.

Functions in bash can only return values from 0 to 255 where 0 is success. Reference: Return value in a Bash function
So you can echo the variable like this instead:
#!/usr/bin/env bash
postgres_test() {
    psql -c '\dt' &> /dev/null
    declare exists=1
    echo "$exists"
}
printf "%s\n" "$(postgres_test)"
This prints "1".
You'll also notice that I redirected the output of the Postgres command to /dev/null. This is because it would otherwise be mixed into the function's output.
You might wish to redirect that output to a file instead.
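Applying the same idea back to the original function, here is a minimal sketch (the database name and psql calls are taken from the question; echoing 1/0 is my own convention): whatever the child shell writes to stdout is captured with command substitution.
postgres_create_db() {
    # Run the check as the postgres user; the value echoed inside the
    # heredoc is what the command substitution captures.
    local exists
    exists=$(sudo su postgres <<'EOF'
if psql -lqt | cut -d \| -f 1 | grep -qw nokia_aaa_poc_db; then
    echo 1
else
    psql -c 'CREATE DATABASE nokia_aaa_poc_db;' > /dev/null
    echo 0
fi
EOF
)
    echo "$exists"
}
# Usage: read the function's stdout, not its return code.
db_exists=$(postgres_create_db)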

Related

Bash read command with cat and pipe

I have two scripts:
install.sh
#!/usr/bin/env bash
./internal_install.sh
internal_install.sh
#!/usr/bin/env bash
set -x
while true; do
    read -p "Hello, what's your name? " name
    echo $name
done
When I run ./install.sh, all works as expected:
> ./install.sh
+ true
+ read -p 'Hello, what'\''s your name? ' name
Hello, what's your name? Martin
+ echo Martin
Martin
...
However, when I run with cat ./install.sh | bash, the read function does not block:
cat ./install.sh | bash
+ true
+ read -p 'Hello, what'\''s your name? ' name
+ echo
+ true
+ read -p 'Hello, what'\''s your name? ' name
+ echo
...
This is just a simplified version of my real use case with curl, which results in the same issue:
curl -sl https://www.conteso.com/install.sh | bash
How can I use curl/cat to have blocking read in the internal script?
read reads from standard input by default. When you use the pipe, standard input is the pipe, not the terminal.
If you want to always read from the terminal, redirect the read input to /dev/tty.
#!/usr/bin/env bash
set -x
while true; do
    read -p "Hello, what's your name? " name </dev/tty
    echo $name
done
But you could also solve the problem by giving the script as an argument to bash rather than piping it in.
bash ./install.sh
When using curl to get the script, you can use process substitution:
bash <(curl -sl https://www.conteso.com/install.sh)
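A quick way to see the difference (a hypothetical one-liner, not from the answers above): when the script arrives on stdin, read finds that stdin already exhausted, while with process substitution stdin is still the terminal.
# script on stdin: read hits end-of-file immediately and returns nothing
printf 'read -p "name? " n; echo "got: $n"\n' | bash
# script passed as a file via process substitution: read waits for input
bash <(printf 'read -p "name? " n; echo "got: $n"\n')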

Why can't I pass the variable's value into a file in the /etc directory?

I want to pass the value of the $ip variable into the file /etc/test.json with bash.
ip="xxxx"
sudo bash -c 'cat > /etc/test.json <<EOF
{
"server":"$ip",
}
EOF'
I expect the content of /etc/test.json to be
{
"server":"xxxx",
}
However the real content in /etc/test.json is:
{
"server":"",
}
But if I replace the target directory /etc/ with /tmp
ip="xxxx"
cat > /tmp/test.json <<EOF
{
"server":"$ip",
}
EOF
the value of the $ip variable gets passed into /tmp/test.json:
$ cat /tmp/test.json
{
"server":"xxxx",
}
In Kamil Cuk's example, the subprocess is cat > /etc/test.json which contains no variable.
sudo sh -c 'cat > /etc/test.json' << EOF
{
"server":"$ip",
}
EOF
It does not export the $ip variable at all.
Now let's analyze the following:
ip="xxxx"
sudo bash -c "cat > /etc/test.json <<EOF
{
"server":\""$ip"\",
}
EOF"
The different parts in
"cat > /etc/test.json <<EOF
{
"server":\""$ip"\",
}
EOF"
will concatenate into a long string used as a command. Why can the $ip variable inherit the value from its parent process here?
There are two reasons for this behavior:
By default, variables are not passed to the environment of subsequently executed commands.
The variable is not expanded in the current context, because your command is wrapped in single quotes.
Exporting the variable
Place an export statement before the variable; see man 1 bash:
The supplied names are marked for automatic export to the environment of subsequently executed commands.
And as noted by Léa Gris you also need to tell sudo to preserve the environment with the -E or --preserve-environment flag.
export ip="xxxx"
sudo -E bash -c 'cat > /etc/test.json <<EOF
{
"server":"$ip",
}
EOF'
Expand the variable in the current context:
This is the reason your second command works: you do not have any quotes around the here-document in that example.
But if I replace the target directory /etc/ with /tmp [...] the value of the $ip variable gets passed into /tmp/test.json
You can change your original snippet by replacing the single quotes with double quotes and escaping the quotes around your ip:
ip="xxxx"
sudo bash -c "cat > /etc/test.json <<EOF
{
"server":\""$ip"\",
}
EOF"
Edit: For your additional questions:
In Kamil Cuk's example, the subprocess is cat > /etc/test.json which contains no variable.
sudo sh -c 'cat > /etc/test.json' << EOF
{
"server":"$ip",
}
EOF
It does not export the $ip variable at all.
Correct, and you did not wrap the here-document in single quotes. Therefore $ip is substituted in the current context and the string passed to the subprocess's standard input is
{
"server":"xxxx",
}
So in this example the subprocess does not need to know the $ip variable.
Simple example
$ x=1
$ sudo -E sh -c 'echo $x'
[sudo] Password for kalehmann:
This echoes nothing because
'echo $x' is wrapped in single quotes. $x is therefore not substituted in the current context
$x is not exported. Therefore the subprocess does not know its value.
$ export y=2
$ sudo -E sh -c 'echo $y'
[sudo] Password for kalehmann:
2
This echoes 2 because
'echo $y' is wrapped in single quotes. $y is therefore not substituted in the current context
$y is exported. Therefore the subprocess does know its value.
$ z=3
$ sudo -E sh -c "echo $z"
[sudo] Password for kalehmann:
3
This echoes 3 because
"echo $z" is wrapped in double quotes. $z is therefore substituted in the current context
There is little need to do the here-document inside the subshell. Just do it outside.
sudo tee /etc/test.json <<EOF
{
"server":"$ip",
}
EOF
or
sudo sh -c 'cat > /etc/test.json' << EOF
{
"server":"$ip",
}
EOF
Generally, it is not safe to build a fragment of JSON using string interpolation, because it requires you to ensure the variables are properly encoded. Let a tool like jq do that for you.
Pass the output of jq to tee, and use sudo to run tee to ensure that the only thing you do as root is open the file with the correct permissions.
ip="xxxx"
jq -n --arg x "$ip" '{server: $x}' | sudo tee /etc/test.json > /dev/null
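To see why the interpolation is unsafe, here is a small illustration (the value of ip is hypothetical): a value containing a double quote corrupts the hand-built JSON, while jq escapes it correctly.
ip='10.0.0.1" , "admin":"true'
# naive interpolation silently produces JSON with an extra, unintended key
printf '{ "server":"%s" }\n' "$ip"
# jq encodes the value safely as a single string
jq -n --arg x "$ip" '{server: $x}'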

Set a command to a variable in bash script problem

I'm trying to run a command stored in a variable, but I am getting strange results.
Expected result "1":
grep -i nosuid /etc/fstab | grep -iq nfs
echo $?
1
Unexpected result as a variable command:
cmd="grep -i nosuid /etc/fstab | grep -iq nfs"
$cmd
echo $?
0
It seems it returns 0 as the command was correct not actual outcome. How to do this better ?
You can only execute exactly one command stored in a variable. The pipe is passed as an argument to the first grep.
Example
$ printArgs() { printf %s\\n "$@"; }
# Two commands. The 1st command has parameters "a" and "b".
# The 2nd command prints stdin from the first command.
$ printArgs a b | cat
a
b
$ cmd='printArgs a b | cat'
# Only one command with parameters "a", "b", "|", and "cat".
$ $cmd
a
b
|
cat
How to do this better?
Don't execute the command using variables.
Use a function.
$ cmd() { grep -i nosuid /etc/fstab | grep -iq nfs; }
$ cmd
$ echo $?
1
Solution to the actual problem
I see three options for your actual problem:
Use a DEBUG trap and the BASH_COMMAND variable inside the trap (see the sketch after this list).
Enable bash's history feature for your script and use the history command.
Use a function which takes a command string and executes it using eval.
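A minimal sketch of the first option (the log file name and message format are assumptions, not from the answer): the DEBUG trap runs just before each simple command, and BASH_COMMAND holds the text of the command about to be executed.
# log each command to command.log right before bash executes it
trap 'printf "running: %s\n" "$BASH_COMMAND" >> command.log' DEBUG
grep -i nosuid /etc/fstab | grep -iq nfs
# command.log now records the commands as they were run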
Regarding your comment on the last approach: You only need one function. Something like
execAndLog() {
    description="$1"
    shift
    if eval "$*"; then
        info="PASSED: $description: $*"
        passed+=("${FUNCNAME[1]}")
    else
        info="FAILED: $description: $*"
        failed+=("${FUNCNAME[1]}")
    fi
}
You can use this function as follows
execAndLog 'Scanned system' 'grep -i nfs /etc/fstab | grep -iq noexec'
The first argument is the description for the log, the remaining arguments are the command to be executed.
Using bash -x or set -x will allow you to see what bash executes:
> cmd="grep -i nosuid /etc/fstab | grep -iq nfs"
> set -x
> $cmd
+ grep -i nosuid /etc/fstab '|' grep -iq nfs
As you can see, your pipe | is passed as an argument to the first grep command.

How do I pass subshell results (array) to an SSH command?

Trying it this way:
#!/bin/bash
myvals=`psql -d mydb -c "select id from table1 where 't'"`
ssh user1@host1.domain.tld "for i in $myvals; do echo \$i >> values; done"
As long as psql returns just one value, it works fine. But if it returns several values, I receive this response:
bash: -c: line 1: syntax error near unexpected token `2'
bash: -c: line 1: `2'
Also, I tried to:
myvals='1 2 3'
And then it works fine: the values 1 2 3 are appended to the "values" file on the remote host; no error messages.
If I try another subshell command, such as myvals=$(ls /bin), the errors reappear.
It's clear that $myvals is already evaluated on the local host, but what makes the subshell results so different?
If It's Not Really An Array...
Iterating over a string as if it were an array is innately buggy. Don't do it. That said, to generate a safely-escaped (eval-safe) version of your value, use printf %q.
#!/bin/bash
myvals=`psql -d mydb -c "select id from table1 where 't'"`
printf -v myvals_q %q "$myvals"
ssh user1@host1.domain.tld \
"myvals=$myvals_q;"' for i in $myvals; do echo "$i"; done >>values'
If You Actually Had An Array
#!/bin/bash
readarray -t myvals < <(psql -d mydb -c "select id from table1 where 't'")
printf -v myvals_q '%q ' "${myvals[@]}"
ssh user1@host1.domain.tld \
"myvals=( $myvals_q );"' for i in "${myvals[@]}"; do echo "$i"; done >>values'
If You Don't Need To Store The Value Locally In The First Place
#!/bin/bash
ssh user1@host1.domain.tld \
'while read -r i; do echo "$i"; done >>values' \
< <(psql -d mydb -c "select id from table1 where 't'")
General Notes
Running echo "$i" >>values over and over in a loop is inefficient: Every time the line is run, it re-opens the values file. Instead, run the redirection >values over the whole loop; this truncates the file exactly once, at the loop's start, and appends all values generated therein.
Unquoted expansions are generally dangerous. For example, if foo='*', then $foo will be replaced with a list of files in the current directory, but "$foo" will emit the exact contents -- *. Similarly, tabs, whitespace runs, and various other contents can be unintentionally damaged by unquoted expansion, even when passing directly to echo.
You can switch quoting types in the same string -- thus, "$foo"'$foo' is one string, the first part of which is replaced with the value of the variable named foo, and the second component of which is the exact string $foo.
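A short illustration of the last two notes (the value of foo is hypothetical):
foo='*'
echo $foo          # unquoted: * is replaced by the file names in the current directory
echo "$foo"        # quoted: prints the literal *
echo "$foo"'$foo'  # mixed quoting: prints *$foo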
You can send the output as a file:
#!/bin/bash
psql -d mydb -c "select id from table1 where 't'" > /tmp/values
scp /tmp/values user1@host1.domain.tld:/tmp/
or pipe it to the remote host:
psql -d mydb -c "select id from table1 where 't'" | \
ssh user1@host1.domain.tld 'while read line; do echo $line; done'

BASH - how echo works inside EOF tags

I would like to execute the following:
PASSWORD="mypassword"
RUNCOMMAND=$(cat <<EOF
echo $PASSWORD | sudo -S sudo echo "this is it babe"
EOF
)
But instead of "this is it babe", I get the following result:
mypassword | sudo -S sudo echo "this is it babe"
I tried with cat <<\EOF, cat <<'EOF' still no luck.
Any ideas?
You are confusing a heredoc with a pipeline.
heredoc with variable expansion:
cat <<EOF
some text, possibly with variables: ${HOME} / $(whoami)
EOF
some text, possibly with variables: /home/attie / attie
heredoc without variable expansion:
cat <<"EOF"
some text, possibly with variables: ${HOME} / $(whoami)
EOF
some text, possibly with variables: ${HOME} / $(whoami)
pipeline with variable expansion (note the quotes, "):
echo "some text, possibly with variables: ${HOME} / $(whoami)" | cat
some text, possibly with variables: /home/attie / attie
pipeline without variable expansion (note the quotes, '):
echo 'some text, possibly with variables: ${HOME} / $(whoami)' | cat
some text, possibly with variables: ${HOME} / $(whoami)
${...} expands an environment variable
$(...) runs a command, and substitutes its stdout
It also looks like you're trying to have your password entered into sudo - this won't work, as sudo will reopen the terminal to acquire your password before passing its stdin to the final application.
You are starting from a false premise, that eval $RUNCOMMAND is something you should do. It is not; variables are for data, functions are for code.
run_command () {
    docker_run_options=(
        --restart=always
        --name "${USER_NAME}_$(date +%Y%m%d-%H%M%S)"
        -d
        -e "VIRTUAL_HOST=$USER_VIRTUAL_HOST"
        -e "VIRTUAL_PORT=$USER_VIRTUAL_PORT"
        -e "PORT=$USER_VIRTUAL_PORT"
        -p "$USER_VIRTUAL_PORT:$USER_VIRTUAL_PORT"
    )
    echo "$1" | sudo -S sudo docker run "${docker_run_options[@]}" "$USER_IMAGE"
}
fun_run_command () {
    run_command "$PASSWORD"
}
The final solution is rather simple:
PASSWORD="mypassword"
RUNCOMMAND=$(cat <<EOF
echo $PASSWORD | sudo -S sudo echo "this is it babe"
EOF
)
And execute it via eval:
eval $RUNCOMMAND
Sorry for stealing your time with this obvious problem, guys :)
The use case for the above is to echo a given command before actually executing it.
Like this:
fun_run_command(){
    # execute the final command
    echo `eval $RUNCOMMAND`
}
fun_echo_command(){
    # echo the command which will be launched (fun_run_command())
    echo ${RUNCOMMAND//$PASSWORD/PASSWORD}
}
RUNCOMMAND=$(cat <<EOF
echo $PASSWORD | sudo -S sudo docker run --restart=always \
--name ${USER_NAME}_`date +%Y%m%d-%H%M%S` \
-d \
-e "VIRTUAL_HOST=$USER_VIRTUAL_HOST" \
-e "VIRTUAL_PORT=$USER_VIRTUAL_PORT" \
-e "PORT=$USER_VIRTUAL_PORT" \
-p $USER_VIRTUAL_PORT:$USER_VIRTUAL_PORT \
$USER_IMAGE
EOF
)
As you can see, the command I launch is quite long, so it always makes sense to double-check what the script actually executes.
Copying and pasting the same command into multiple functions is error-prone.
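If the goal is simply to double-check the long command before it runs, here is a hedged sketch in the spirit of the function-based answer above (the variable names come from the snippets in this thread; the masking and the single sudo call are my own choices):
run_docker() {
    # Build the command once as an array, so the "echo" step and the
    # "run" step can never drift apart.
    local cmd=(docker run --restart=always
               --name "${USER_NAME}_$(date +%Y%m%d-%H%M%S)"
               -d
               -e "VIRTUAL_HOST=$USER_VIRTUAL_HOST"
               -e "VIRTUAL_PORT=$USER_VIRTUAL_PORT"
               -e "PORT=$USER_VIRTUAL_PORT"
               -p "$USER_VIRTUAL_PORT:$USER_VIRTUAL_PORT"
               "$USER_IMAGE")
    # Show what will be executed; the password itself is never printed.
    printf 'about to run: sudo %s\n' "${cmd[*]}"
    # Feed the password to sudo on stdin, as in the original snippet.
    echo "$PASSWORD" | sudo -S "${cmd[@]}"
}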
