Nested heredoc order of operations - bash

I have the following code -
ssh host sh -s << EOF
cd /backup/
ls
psql -U user -d db << SQL
$(sed 's/${previous_quarter}/${current_quarter}/' table_quarters.sql);
$(sed 's/${previous_quarter}/${current_quarter}/' plans.sql);
SQL
EOF
This is the order of execution it is following when I execute it -
table_quarters script
plans script
ls command.
Why is it not following this order of execution -
ls command
table_quarters script
plans script

You are sending a string to the standard input of ssh. This string can only be constructed if the stuff in $(...) is run first, because those parts are replaced by the output of the enclosed commands.
Once the string is expanded, it is sent by ssh to the remote machine, which runs the resulting commands in order.
If you want to run the expansions on the remote end, you need to properly escape the dollar signs.
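A minimal local illustration of the ordering (using `sh -s` in place of ssh, since no remote host is needed for the demo): an unescaped `$` is expanded by the sending shell before anything is transmitted, while an escaped `\$` survives to be expanded on the receiving side.

```shell
x=outer
out=$(sh -s <<EOF
x=inner
echo $x      # expanded by the outer shell before sh -s even starts
echo \$x     # escaped: expanded by the inner sh, which has set x=inner
EOF
)
echo "$out"
```

The first line prints `outer`, the second `inner`: exactly the same reason the `$(sed ...)` substitutions in the question ran before `ls`.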

Related

Using sed command on remote system

I am using the below command on the local machine and it gives me the expected result:
sed -n 's/^fname\(.*\)".*/\1/p' file.txt
When I use the same command (only changing ' to ") on the same file on the remote system, I get no output:
ssh remote-system "sed -n "s/^fname\(.*\)".*/\1/p" file.txt"
Please help me to get this corrected. Thanks for your help.
" and ' are different things in bash, and they are not interchangeable (they're not interchangeable in many languages, however the differences are more subtle) The single quote means 'pretend everything inside here is a string'. The only thing that will be interpreted is the next single quote.
The double quote allows bash to interpret stuff inside
For example,
echo "$TERM"
and
echo '$TERM'
return different things.
(Untested) you should be able to use single quotes and escape the internal single quotes:
ssh remote-system 'sed -n \'s/^fname\(.*\)".*/\1/p\' file.txt'
Looks like you can send a single quote with the sequence '"'"' (from this question)
so :
ssh remote-machine 'sed -n '"'"'s/^fname\(.*\)".*/\1/p'"'"' file.txt'
This runs on my machine if I ssh into localhost, there's no output because file.txt is empty, but it's a proof-of-concept.
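To see the `'"'"'` splice in action, here is the same quoting run through a local `sh -c` instead of ssh (the file name and its contents are made up for the demo):

```shell
# Build a sample file; the regex captures everything between
# "fname" and the last double quote on the line.
printf 'fnameABC"rest\n' > /tmp/fname_demo.txt
# '"'"' ends the single-quoted section, splices in a literal single
# quote via "'", then reopens single quotes for the rest.
out=$(sh -c 'sed -n '"'"'s/^fname\(.*\)".*/\1/p'"'"' /tmp/fname_demo.txt')
echo "$out"
```

The command the inner shell actually sees is `sed -n 's/^fname\(.*\)".*/\1/p' /tmp/fname_demo.txt`, and it prints `ABC`.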
Or - can you do the ssh session interactively/with a heredoc?
ssh remote-system
[sed command]
exit
or (again untested, look up heredocs for more info)
ssh remote-system <<-EOF
[sed command]
EOF

Shell: save ssh command result to local variable

I have a ssh command that sends back a number:
ssh ${user}@${hostname} "wc -l < ${workspace}/logs"
where ${user}, ${hostname} and ${workspace} are variables.
Now I want to save the result to a local variable called lines, I tried:
lines=${ssh ${user}@${hostname} "wc -l < ${workspace}/logs" }
But it does not work, why?
Your command should be wrapped in $() not ${} when assigning the result to a variable. Also, the others are just variables, not commands, so they don't need a wrapper (assuming they are defined in a script or something).
lines=$(ssh $user@$hostname "wc -l < $workspace/logs")
You should use $() to get the result of a command, not ${} (in bash at least). The line below assumes that workspace is a variable defined on the host executing the lines= assignment, not on the remote (if it were on the remote you would need to escape the $: \${workspace}/logs).
lines=$(ssh ${user}@${hostname} "wc -l < ${workspace}/logs" )
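For reference, here is the `$()` capture demonstrated entirely locally, with `sh -c` standing in for the ssh call (the workspace path is hypothetical):

```shell
workspace=/tmp/ws_demo            # hypothetical workspace path
mkdir -p "$workspace"
printf 'a\nb\nc\n' > "$workspace/logs"
# sh -c stands in for `ssh ${user}@${hostname}`; the command string
# is identical, and $workspace expands on the local side.
lines=$(sh -c "wc -l < $workspace/logs")
echo "$lines"
```

`lines` now holds the count (3 here), just as it would with the real ssh invocation.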

Bash string parsing, skewed by internal strings

I have a string in my bash shell like:
out=$(su - user -c "someCommand -f 'string text "problemString"'")
The problem here is that it's getting parsed as so:
out=\$(su - user -c \"someCommand -f 'string text \"problemString\"'\")
I don't want "problemString" to be parsed out -- i.e., it needs to stay exactly as-is, including the quotes. How can I do that?
Update: I've attempted to escape the inner " with:
out=$(su - user -c "someCommand -f 'string text \"problemString\"'"),
but when the command is executed on the host machine, it returns an error from someCommand:
Unknown command '\p'
Update 2:
Real example:
OUTPUT=$(su - mysql -c "mysql --skip-column-names --raw --host=localhost --port=3306 --user=user --password=pass -e 'show variables where variable_name = \"max_connections\"'")
I'm passing this bash script via fabric in Python:
# probably not relevant, but just in case..
def ParseShellScripts(runPath, commands):
for i in range(len(commands)):
if commands[i].startswith('{shell}'):
# todo: add validation/logging for directory `sh` and that scripts actually exist
with open(os.path.join(runPath, 'sh', commands[i][7:]),"r") as shellFile:
commands[i] = shellFile.read()
print commands[i]
return commands
This prints:
OUTPUT=$(su - mysql -c "mysql --skip-column-names --raw --host=localhost --port=3306 --user=pluto_user --password=pluto_user -e 'show variables where variable_name = \"max_connections\"'")
which then gets executed on some remote box via fabric, which results in ERROR at line 1: Unknown command '\m'.
You can write:
out=$(su - user -c "someCommand -f 'string text \"problemString\"'")
Simply use single quotes. Strings in single quotes don't get parsed or interpreted. For instance:
echo 'a"b'
outputs:
a"b
Because no parsing occurs.
For reference: bash manual on quoting.
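A quick local check of the escaping fix, with `sh -c` standing in for `su - user -c` and `printf` standing in for someCommand: the `\"` passes through the outer double quotes and lands as a literal quote inside the inner single-quoted argument.

```shell
# The inner shell receives: printf '%s\n' 'string text "problemString"'
out=$(sh -c "printf '%s\n' 'string text \"problemString\"'")
echo "$out"
```

The output keeps the quotes intact: `string text "problemString"`.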

Remote ssh command, LF missing on STDOUT

This is my first question here :)
I'm running into an issue while running a remote SSH command: the output of the command comes back as a single string without LFs to delimit the lines. I checked the output of Hive by itself and it prints the LFs normally; they are missing only when I run the command from a remote host. Is there any way to preserve the remote host's output?
Here's a snippet of the code:
#!/bin/bash
partition="2013-04-02"
query=$(cat <<EOF
set mapred.job.name="Unique IP - $partition";
select
distinct
src
,dst
,service
,proto
,action
from
logs
where
src rlike "^10\."
and src not rlike "^10\.90\."
and src not rlike "^10[0-9]\."
and src not rlike "\.5[2-8]$"
and dst <> ""
and day="$partition";
EOF)
ssh=`ssh user@hadoop "hive -S -e '$query' 2>&1"`
echo $ssh > output.tsv
It's got the newline(s) in there, but you're not seeing them because you don't have double-quotes around the parameter expansion in your echo command.
echo "$ssh" >output.tsv
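The effect is easy to reproduce without ssh at all: word splitting on the unquoted expansion collapses each newline into a single space.

```shell
var=$(printf 'line1\nline2')
unquoted=$(echo $var)     # word splitting collapses the newline to a space
quoted=$(echo "$var")     # quoting preserves the newline
echo "$unquoted"
echo "$quoted"
```

The unquoted version prints `line1 line2` on one line; the quoted version prints the two lines intact.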
ETA: The $(...) syntax for command substitution is generally much better than the `...` syntax, as it nests and works sanely with nested quotation marks. And while it doesn't matter in this case, it's good to get in the habit of quoting command substitutions as well:
ssh="$(ssh user@hadoop "hive -S -e '$query' 2>&1")"
Note that the double-quotes inside the $(..) don't cause any problems with the outer quotation marks.
Finally, if all you're doing with this value is echoing it into a file, you can skip the variable and just redirect it there in the first place:
ssh user@hadoop "hive -S -e '$query' 2>&1" >output.tsv
or just
ssh user@hadoop "hive -S -e '$query'" >&output.tsv
The backticks around ssh … only strip trailing newlines; it's the unquoted $ssh in your echo command that converts the remaining newlines to spaces via word splitting.
Try
ssh user#hadoop "hive -S -e '$query' 2>&1" > output.tsv

Combining pipes and here documents in the shell for SSH

Okay, so I've recently discovered the magic of here documents for feeding stdin-style lines into interactive commands. However, I'm trying to use this with SSH to execute a bunch of commands on a remote server, but I also need to pipe in some actual input before executing the extra commands; to confound matters further, I also need to get some results back ;)
Here's what I'm trying to use:
#!/bin/sh
RESULT=$(find -type f "$PATH" | gzip | ssh "$HOST" <<- 'REMOTE_SYNC'
cat > "/tmp/.temp_file"
# Do something with /tmp/.temp_file
REMOTE_SYNC
)
Is this actually correct? Part of the problem I'm having is that I need to pipe the data to that file in /tmp, but I should really be generating a randomly named temp file. I'm not sure how I could do that, assign the name to a variable (so I can get back to it), and still send stdin into it.
I may also extract the find | gzip part into a separate command run locally first, as the gzipped file will likely be small enough that sending it when ready will result in a much shorter SSH connection than sending it as it's generated, but that still doesn't get around the fact that I need to provide both stdin and my extra commands to SSH.
No, you can't do it like this. Both heredoc and the piped input compete for stdin, and only one wins. Look at this example:
echo test | cat << EOF
TEST
EOF
What will this print? test, TEST or both? It prints TEST, so the heredoc wins (at least in bash).
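That example can be verified directly; the heredoc redirection wins, and the piped `test` never reaches cat:

```shell
out=$(echo test | cat <<EOF
TEST
EOF
)
echo "$out"
```

This prints only `TEST`.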
You don't really need this anyway. Luckily ssh takes a command argument, which will be passed on to the shell on the remote host, so you can just use your command as a string here. So something like this:
echo TEST | ssh user@host 'cat > tempfile; cat tempfile; rm tempfile'
would work (although it doesn't make much sense); the output of the left-side commands is piped through ssh to the remote host and supplied as stdin there.
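The same pattern can be simulated locally with `sh -c` in place of ssh, and `mktemp` generates the randomly named temp file the question asked about:

```shell
# sh -c stands in for `ssh user@host`; the single-quoted script runs
# untouched on the "remote" side while the pipe supplies its stdin.
out=$(echo TEST | sh -c 'tmp=$(mktemp)     # random temp name
cat > "$tmp"                               # piped stdin lands in the temp file
cat "$tmp"                                 # stand-in for doing something with it
rm "$tmp"')
echo "$out"
```

The captured output is `TEST`, proving the piped data made it into (and back out of) the temp file.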
If you want the data to be compressed when sending it through ssh, you can just enable compression using the -C option.
edit:
Using linebreaks inside a string is perfectly fine, so this works fine too:
echo TEST | ssh user@host '
cat > tempfile
cat tempfile
rm tempfile
'
The only difference to a heredoc would be that you have to escape quotes.
If you use something like echo TEST | ssh user@host "$(<script.sh)" you can write everything into a file...
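That last variant, reading the remote script from a local file, looks like this with `sh -c` standing in for ssh (note `$(<script.sh)` is a bash-ism; `$(cat script.sh)` is the portable spelling):

```shell
script=$(mktemp)
printf 'read line; echo "got: $line"\n' > "$script"
# sh -c stands in for `ssh user@host`; the script file's contents become
# the command argument, while the pipe still supplies stdin.
out=$(echo TEST | sh -c "$(cat "$script")")
rm -f "$script"
echo "$out"
```

The script reads the piped `TEST` from stdin and prints `got: TEST`.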
