Bash: execute a command with quotes and put its output into a variable - bash

In a bash script I set a command this way:
getdblist_cmd=(sudo -u $backup_user $psql -p $pgsql_port -U $pgsql_user -d postgres -q -t -c 'SELECT datname from pg_database')
Then I run it with
dblist=`${getdblist_cmd[@]}`
I don't get any error from the bash script, but the $dblist variable is empty.
How can I execute that command and put its output into a variable?

dblist=`"${getdblist_cmd[@]}"`
        ^                   ^
The [@] subscript expands each array element as a separate word only when the ${...} expansion is enclosed in double quotes; unquoted, the expansion is word-split and the quoted SQL statement falls apart.
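A minimal runnable sketch of the fix; the user, psql path and port below are placeholder assumptions, not the asker's real environment:
#!/usr/bin/env bash
backup_user=postgres          # placeholder
psql=/usr/bin/psql            # placeholder
pgsql_port=5432               # placeholder
pgsql_user=postgres           # placeholder

getdblist_cmd=(sudo -u "$backup_user" "$psql" -p "$pgsql_port" -U "$pgsql_user" \
               -d postgres -q -t -c 'SELECT datname FROM pg_database')

# "${getdblist_cmd[@]}" keeps each array element as one word, so the SQL
# statement reaches psql as a single -c argument; $(...) captures the output.
dblist=$("${getdblist_cmd[@]}")
printf '%s\n' "$dblist"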

Related

Unable to execute ssh command containing parentheses with perl: "syntax error near unexpected token `('"

If I run this command from the command line, it works as expected on the remote server:
ssh admin@example.com "docker exec -it -d tasks_live_work_ec2_test_server /bin/sh -c \"/usr/bin/nvim -c 'silent! call SetupInstantServer()'\""
However, if I try to execute it from a perl script, I get errors:
my $cmd = qq|docker exec -it -d ${image_name}_server /bin/sh -c \"/usr/bin/nvim -c 'silent! call SetupInstantServer()'\"|;
`ssh admin\@example.com "$cmd"`;
bash: -c: line 1: syntax error near unexpected token `('
Escaping the parens with backslashes suppresses the error, but the SetupInstantServer function in vim never gets called.
What I would do, using 2 here-doc:
#!/usr/bin/perl
system<<PerlEOF;
ssh admin\@example.com<<ShellEOF
docker exec -it -d ${image_name}_server /bin/sh -c "
/usr/bin/nvim -c 'silent! call SetupInstantServer()'
"
ShellEOF
PerlEOF
You can quote either here-doc delimiter to prevent interpolation/expansion of its body (and with it the need to escape @). Up to your needs.
Check perldoc perlop#Quote-and-Quote-like-Operators
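If Perl isn't strictly required, the same here-doc idea works directly in bash; a sketch, with the host and container name copied from the question and -it dropped because stdin is the here-doc rather than a terminal:
#!/usr/bin/env bash
# The quoted delimiter ('ShellEOF') stops the local shell from expanding
# anything in the body before it is sent to the remote host.
ssh admin@example.com <<'ShellEOF'
docker exec -d tasks_live_work_ec2_test_server /bin/sh -c "
/usr/bin/nvim -c 'silent! call SetupInstantServer()'
"
ShellEOF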

Escape apostrophe in docker command

I'm trying to set a variable using a docker CLI command in the following way:
docker exec -u root airflowdags_webserver_1 bash -c "airflow variables --set my_var '{\"test\": \"test\'2\"}'"
But getting following error:
bash: -c: line 0: unexpected EOF while looking for matching `"'
bash: -c: line 1: syntax error: unexpected end of file
I get no errors if I run either of these commands:
docker exec -u root airflowdags_webserver_1 bash -c "airflow variables --set my_var '{\"test\": \"test\`2\"}'"
or
docker exec -u root airflowdags_webserver_1 bash -c "airflow variables --set my_var '{\"test\": \"test2\"}'"
How can I escape apostrophe in the "test'2" value to avoid the error?
A bash single-quoted string cannot contain a single quote, and you cannot escape it inside the quotes (ref: https://www.gnu.org/software/bash/manual/bashref.html#Single-Quotes).
Try this:
bash -c "airflow variables --set my_var '{\"test\": \"test'\''2\"}'"
# .......................................1.................1..2....2
I numbered the matching single quoted strings. In between is a literal single quote.
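A quick way to see what the inner shell ends up with, without involving airflow or the container at all, is to run the same bash -c string locally with printf standing in for the airflow call:
bash -c "printf '%s\n' '{\"test\": \"test'\''2\"}'"
# prints: {"test": "test'2"}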
If you are using Docker Compose, with file format version 3 you can specify the command as a list instead, which avoids this shell quoting entirely: https://docs.docker.com/compose/compose-file/#command
You can put $ before the first single quote (ANSI-C quoting) so that escaped single quotes are allowed inside the string.
So:
docker exec -u root airflowdags_webserver_1 bash -c "airflow variables --set my_var $'{\"test\": \"test\'2\"}'"
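The same local printf check works for the $'...' variant; inside ANSI-C quoting, \' stands for a literal single quote:
bash -c "printf '%s\n' $'{\"test\": \"test\'2\"}'"
# prints: {"test": "test'2"}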

Echo variable using sudo bash -c 'echo $myVariable' - bash script

I want to echo a string into the /etc/hosts file. The string is stored in a variable called $finalString.
When I run the following code the echo is empty:
finalString="Hello\nWorld"
sudo bash -c 'echo -e "$finalString"'
What am I doing wrong?
You're not exporting the variable into the environment so that it can be picked up by subprocesses.
You haven't told sudo to preserve the environment.
finalString="Hello\nWorld"
export finalString
sudo -E bash -c 'echo -e "$finalString"'
Alternatively, you can have the current shell substitute instead:
finalString="Hello\nWorld"
sudo bash -c 'echo -e "'"$finalString"'"'
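A variation that avoids splicing the value into the code string at all is to hand it to the inner shell as a positional parameter (the same trick used in the parentheses answer further down):
finalString="Hello\nWorld"
sudo bash -c 'echo -e "$1"' _ "$finalString"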
You can do this:
bash -c "echo -e '$finalString'"
i.e using double quote to pass argument to the subshell, thus the variable ($finalString) is expanded (by the current shell) as expected.
Though I would recommend not using the -e flag with echo; instead you can just put a literal newline in the variable:
finalString="Hello
World"
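A sketch tying this back to the original goal of appending to /etc/hosts; the host entries are made up for illustration, $'...' gives the variable a real embedded newline, and sudo tee -a is swapped in for bash -c so the append itself runs with root privileges:
finalString=$'127.0.0.1 example.local\n127.0.0.1 www.example.local'   # hypothetical entries
printf '%s\n' "$finalString" | sudo tee -a /etc/hosts > /dev/null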

parenthesis in a directory name in bash script

I get the error executing bash syntax error near unexpected token `('
I know the error is caused by the ')' but I thought placing the command between ' ' was supposed to allow parentheses in a directory name. How can I fix this without renaming the directory?
bash -c 'cd /tmp/h1/clients/04212015142432811_Fs_1000_ahh/pls/03sox_a_Fs_1000_ahh_(000_bit)_(0.00000sig_in_deg)_to_(508_bit)_(30.00000sig_in_deg) && exec bash xfade.sh'
Please note: it's being called from inside Octave, a math program similar to MATLAB.
Why are you bothering with an outer shell? Quote the argument to cd:
(cd '/tmp/h1/clients/04212015142432811_Fs_1000_ahh/pls/03sox_a_Fs_1000_ahh_(000_bit)_(0.00000sig_in_deg)_to_(508_bit)_(30.00000sig_in_deg)' && exec bash xfade.sh)
If you really must use an extra bash -c...
dirname='/tmp/h1/clients/04212015142432811_Fs_1000_ahh/pls/03sox_a_Fs_1000_ahh_(000_bit)_(0.00000sig_in_deg)_to_(508_bit)_(30.00000sig_in_deg)'
bash -c 'cd "$1" && exec bash xfade.sh' _ "$dirname"
Can you use double quotes for bash -c?
bash -c "cd '/tmp/h1/clients/04212015142432811_Fs_1000_ahh/pls/03sox_a_Fs_1000_ahh_(000_bit)_(0.00000sig_in_deg)_to_(508_bit)_(30.00000sig_in_deg)' && exec bash xfade.sh"
You need to quote the path inside of the command string passed to the bash subshell. E.g.:
bash -c 'cd '"'"'/tmp/h1/clients/04212015142432811_Fs_1000_ahh/pls/03sox_a_Fs_1000_ahh_(000_bit)_(0.00000sig_in_deg)_to_(508_bit)_(30.00000sig_in_deg)'"'"' && exec bash xfade.sh'
You get the error because running bash -c passes the argument string to a new shell. The argument string will have the single quotes stripped by the outer (invoking) shell.
UPDATED: to correctly quote single quotes inside single quotes as pointed out by Charles Duffy
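To convince yourself which form survives the parentheses, a throwaway test along these lines (directory and script invented purely for the demo) can help:
tmp=$(mktemp -d)
mkdir "$tmp/03sox_(000_bit)_(30.00000sig_in_deg)"          # hypothetical dir with parens
echo 'echo "running in $PWD"' > "$tmp/03sox_(000_bit)_(30.00000sig_in_deg)/xfade.sh"
bash -c 'cd "$1" && exec bash xfade.sh' _ "$tmp/03sox_(000_bit)_(30.00000sig_in_deg)"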

Trouble escaping quotes in a variable held string during a Sub-shell execution call [duplicate]

This question already has answers here:
Why does shell ignore quoting characters in arguments passed to it through variables? [duplicate]
I'm trying to write a database call from within a bash script and I'm having problems with a sub-shell stripping my quotes away.
This is the bones of what I am doing.
#---------------------------------------------
#! /bin/bash
export COMMAND='psql ${DB_NAME} -F , -t --no-align -c "${SQL}" -o ${EXPORT_FILE} 2>&1'
PSQL_RETURN=`${COMMAND}`
#---------------------------------------------
If I use an 'echo' to print out the ${COMMAND} variable the output looks fine:
echo ${COMMAND}
screen output:-
#---------------
psql drupal7 -F , -t --no-align -c "SELECT DISTINCT hostname FROM accesslog;" -o /DRUPAL/INTERFACES/EXPORTS/ip_list.dat 2>&1
#---------------
Also if I cut and paste this screen output it executes just fine.
However, when I try to execute the command as a variable within a sub-shell call, it gives an error message.
The error is from the psql client to the effect that the quotes have been removed from around the ${SQL} string.
The error suggests psql is trying to interpret the terms in the sql string as parameters.
So it seems the string and quotes are composed correctly but the quotes around the ${SQL} variable/string are being interpreted by the sub-shell during the execution call from the main script.
I've tried to escape them using various methods: \", \\", \\\", "", \"" '"', \'"\', ... ...
As you can see from my 'try it all' approach I am no expert and it's driving me mad.
Any help would be greatly appreciated.
Charlie101
Instead of storing the command in a string variable, it is better to use a bash array here:
cmd=(psql "${DB_NAME}" -F , -t --no-align -c "${SQL}" -o "${EXPORT_FILE}")
PSQL_RETURN=$( "${cmd[@]}" 2>&1 )
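The same array approach with the values from the question's echo output filled in, plus a check of psql's exit status (the if works because $(...) passes through the command's status):
DB_NAME=drupal7
SQL='SELECT DISTINCT hostname FROM accesslog;'
EXPORT_FILE=/DRUPAL/INTERFACES/EXPORTS/ip_list.dat
cmd=(psql "$DB_NAME" -F , -t --no-align -c "$SQL" -o "$EXPORT_FILE")
if ! PSQL_RETURN=$("${cmd[@]}" 2>&1); then
    printf 'psql failed: %s\n' "$PSQL_RETURN" >&2
fi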
Rather than evaluating the contents of a string, why not use a function?
call_psql() {
# optional, if variables are already defined in global scope
DB_NAME="$1"
SQL="$2"
EXPORT_FILE="$3"
psql "$DB_NAME" -F , -t --no-align -c "$SQL" -o "$EXPORT_FILE" 2>&1
}
then you can just call your function like:
PSQL_RETURN=$(call_psql "$DB_NAME" "$SQL" "$EXPORT_FILE")
It's entirely up to you how elaborate you make the function. You might like to check for the correct number of arguments (using something like (( $# == 3 ))) before calling the psql command.
Alternatively, perhaps you'd prefer just to make it as short as possible:
call_psql() { psql "$1" -F , -t --no-align -c "$2" -o "$3" 2>&1; }
In order to capture the commands that are being executed for debugging purposes, you can use set -x in your script. This will print each command, with its variables expanded, when the function (or any other command) is called. You can switch this behaviour off using set +x, or if you want it on for the whole duration of the script you can change the shebang to #!/bin/bash -x. This saves you explicitly echoing throughout your script to find out what commands are being run; you can just turn on set -x for a section.
A very simple example script using the shebang method:
#!/bin/bash -x
ec() {
echo "$1"
}
var=$(ec 2)
Running this script, either directly after making it executable or calling it with bash -x, gives:
++ ec 2
++ echo 2
+ var=2
Removing the -x from the shebang or the invocation results in the script running silently.
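If you only want the trace around the database call rather than the whole script, wrap just that line, for example:
set -x
PSQL_RETURN=$(call_psql "$DB_NAME" "$SQL" "$EXPORT_FILE")
set +x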
