Passing selections to external commands as arguments rather than piping in kakoune - bash

I am trying to send code to SBCL directly from Kakoune. I've settled on using tmux for this: the SBCL instance runs in a tmux session with a given name. The tmux command for passing in key input is as follows:
tmux send-keys -t <session-name> "<text to send to tmux>"
In Kakoune, however, it seems that the most convenient existing way to pass a selection's text to an external command is by piping it, not by passing it as an argument. For now this seems to work:
nop %sh{tmux send-keys -t sess -l "$kak_selection"}
This kind of does what I want, but it only sends the primary selection. I can't really use $kak_selections, because it adds single quotes around the selections, which would not be parsed as intended by SBCL. Even if it didn't, I'd prefer it to behave more like alt+|, which pipes each selection into its own instance of the command. Is there an existing way in Kakoune to do this? If not, would it be easy to write an sh script that converts stdin into a quoted argument for tmux?

I was unable to find any built-in way to do this, but this question/answer helped: Piping result of command as an argument
I ended up settling on this command, to be run externally (so that I can use Kakoune's default piping behavior):
xargs -0 tmux send-keys -t sbcl -l "${@}"
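If you want the alt+| behavior, a small wrapper script also works. A minimal sketch, assuming the session is named sbcl (as above) and the script lives at a hypothetical path like ~/bin/send-to-sbcl.sh:
#!/bin/sh
# Read the selection that Kakoune pipes in on stdin, then hand it to tmux
# as a single literal (-l) argument. "sbcl" is the assumed session name.
text=$(cat)
tmux send-keys -t sbcl -l "$text"
tmux send-keys -t sbcl Enter   # press Enter so the REPL evaluates the form
Piping selections to it with alt+| then sends each selection to the REPL in its own invocation of the script.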

Related

Problem handling environment variable when launching terminal from bash script

The following script gets called with an environment variable set.
I need to launch a terminal and, inside that terminal, read that variable from another script (script.sh).
xfce4-terminal -x sh -c \
"export VAR='${VAR}'
/home/usr/scripts/script.sh"
It works but not when VAR has single quotes in it.
I also feel like there is a better way to pass an environment variable to the terminal, but I don't know how.
I really appreciate any kind of help, and I'm sorry for my English.
One of the intended features of the environment is that you can add to it, but you never remove things from it. Add VAR to the current environment, and it will be inherited by xfce4-terminal and any process started by that terminal.
export VAR
xfce4-terminal -x sh -c /home/usr/scripts/script.sh
If you don't want it in the current environment, only in the new terminal's, then use a pre-command assignment.
VAR="$VAR" xfce4-terminal -x sh -c /home/usr/scripts/script.sh
This avoids any fragile dynamic script construction like you are contending with.
Since xfce4-terminal appears to not fork a new process itself, I would pass the desired value as an argument to sh.
xfce4-terminal -x sh -c 'VAR="$1" /home/usr/scripts/script.sh' _ "$VAR"
The argument to -c is still a fixed string rather than one generated by interpolating the value of $VAR.
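A quick way to see the mechanics (the _ becomes $0 inside the inline script, so the value lands in $1 untouched); a hedged sketch:
# $0 is set to "_"; $1 receives the value verbatim, single quotes and all
sh -c 'printf "got: %s\n" "$1"' _ "it's got 'single quotes'"
# prints: got: it's got 'single quotes'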

SSH - terminal misbehaving when SSH invoked from a `while read` loop

I'm coding a Bash script to automate tasks across multiple servers.
I am logging in to a CentOS 7 machine over SSH to run an editor (nano, vi, ...)
ssh -tt centos@... '/bb/Conf edit'
The /bb/Conf edit is basically just vi /bb/conf.yaml.
When I run the SSH command from my shell, it works fine. However, when the same SSH command is run from a Bash script inside a while read ...; do loop, the editor has the wrong size (80x40, I guess) and seems to ignore the keys I press - i.e. in nano, Ctrl+X doesn't do anything. The only key that works is Ctrl+C, which closes the connection.
I thought this was something related to the TERM variable, as per this, so I tried adding export TERM=xterm or TERM=rxvt to /bb/Conf or to the place calling the SSH. The variable is in fact set in the target environment (I tried echo $TERM right before vi), but the terminal still misbehaved.
Then I tried putting just that single ssh ... command into a new script. When running that, the editor worked fine.
After a while I found out that it works outside a while read loop, but not inside. I assume that the editors do some stdin/stdout magic and then read somehow breaks that.
Is there a way to run an editor like vi or nano from within a loop?
(The purpose in my case is to allow the users to edit files on multiple servers.)
That's because both read and ssh are reading from the same input stream. The solution is to use a different file descriptor for the while read loop:
while IFS= read -r -u3 line; do
    ssh ...
done 3< file
Here, we're using file descriptor 3 instead of stdin.
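Applied to the ssh case above, a minimal sketch (hosts.txt is a hypothetical file with one server per line):
# read hostnames on fd 3 so ssh keeps stdin/stdout attached to the
# terminal, which lets vi and nano behave normally
while IFS= read -r -u3 host; do
    ssh -tt "centos@$host" '/bb/Conf edit'
done 3< hosts.txt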
Lengthy pipelines can be hard to read and maintain, but you can use whitespace constructively: newlines are allowed following | and && and ||. Also, parentheses introduce a subshell which contains an arbitrary script, so indentation helps.
while IFS= read -r -u3 line; do
    : do stuff here that needs to read from stdin
done 3< <(
    command 1 of the pipeline |
    command 2 |
    command 3
)
That's clean and readable. The downside is that it puts the last part of the pipeline (the while loop) first, so the code kind of flows backwards.

Bash commands as variables failing when joining to form a single command

ssh="ssh user#host"
dumpstructure="mysqldump --compress --default-character-set=utf8 --no-data --quick -u user -p database"
mysql=$ssh "$dumpstructure"
$mysql | gzip -c9 | cat > db_structure.sql.gz
This is failing on the third line with:
mysqldump --compress --default-character-set=utf8 --no-data --quick -u user -p database: command not found
I've simplified my actual script for the purpose of debugging this specific error. $ssh and $dumpstructure aren't always joined together in the real script.
Variables are meant to hold data, not commands. Use a function.
mysql () {
    ssh user@host mysqldump --compress --default-character-set=utf8 --no-data --quick -u user -p database
}
mysql | gzip -c9 > db_structure.sql.gz
Arguments to a command can be stored in an array.
# Although mysqldump is the name of a command, it is used here as an
# argument to ssh, indicating the command to run on a remote host
args=(mysqldump --compress --default-character-set=utf8 --no-data --quick -u user -p database)
ssh user@host "${args[@]}" | gzip -c9 > db_structure.sql.gz
Chepner's answer is correct about the best way to do things like this, but the reason you're getting that error is actually even more basic. The line:
mysql=$ssh "$dumpstructure"
doesn't do anything like what you want. Because of the space between $ssh and "$dumpstructure", it'll parse this as var=value command, which means it should execute the "mysqldump..." part with the environment variable mysql set to ssh user@host. But it's worse than that, since the double quotes around "$dumpstructure" mean that it won't be split into words, and so the entire string gets treated as the command name (rather than mysqldump being the command name, and the rest being arguments to it).
If this had been the right way to go about building the command, the right way to stick the parts together would be:
mysql="$ssh $dumpstructure"
...so that the whole combined string gets treated as part of the value to assign to mysql. But as I said, you really should use Chepner's approach instead.
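The parsing rule is easy to reproduce; a hedged two-line demo:
$ dump="echo hello"
$ x=$dump "echo world"
bash: echo world: command not found
# x=$dump is treated as a temporary environment assignment, and the quoted
# string "echo world" is looked up, space and all, as a single command name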
Actually, commands in variables should also work, whether in the form `$var` or $($var). If it says command not found, it could be because the command is not in your PATH, or because you need to give the full path to the command.
So let's set the downvotes aside and talk about the question.
The real problem is mysql=$ssh "$dumpstructure". This means you'll execute $dumpstructure with the additional environment variable mysql=$ssh, so we get a command-not-found error. It's actually because mysqldump is located on the remote server, not on this host, so it's reasonable that the command is not found.
From this point, let's see how to fix it.
The OP wants to dump mysql data from a remote server, which means $dumpstructure should be executed remotely. Look at the third line, mysql=$ssh "$dumpstructure" - we've now figured out that this causes the problem. So what should the correct command be? The simplest is mysql="$ssh $dumpstructure", which joins both $ssh and $dumpstructure into a single command line held in the variable $mysql.
Finally, let's talk about the last command line. I do not agree that variables are meant to hold data, not commands: a command is also a kind of data. The real problem is how to use it correctly.
The OP's command is also supported; at least, it works on bash 4.2.46.
So the real problem is how to use a variable to hold commands, not to introduce a new method to do that, such as wrapping them in a bash function.
So can anyone tell me why this answer went unnoticed and got voted down instead?

how to script commands that will be executed on a device connected via ssh?

So, I've established a connection via ssh to a remote machine; now what I would like to do is execute a few commands, grab some files, and copy them back to my host machine.
I am aware that I can run
ssh user@host "command1; command2; ...; command_n"
and then close the connection, but how can I do the same without using the aforementioned syntax? I have a lot of complex commands that have a bunch of quotes and characters that would be a mess to escape.
Thanks!
My immediate thought is: why not create the script in a text file and push it over to the remote machine to have it run there? If you can't for whatever reason, I fiddled around with this and I think you could probably do well with a heredoc:
ssh -t jane@stackoverflow.com bash << 'EOF'
command 1 ...
command 2 ...
command 3 ...
EOF
and it seems to do the right thing. Play with your heredoc to keep your quotes safe, but it will get tricky. The only other thing I can offer (and I totally don't recommend this) is that you could use a toy like perl to read from and write to the ssh process, like so:
open S, "| ssh -i ~/.ssh/host_dsa -t jane@stackoverflow.com bash";
print S "date\n"; # and so on
but this is a really crummy way to go about things. Note that you can do this in other languages.
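One concrete note on "keep your quotes safe": quoting the heredoc delimiter decides whether your local shell expands anything before ssh sends the text. A hedged illustration:
# Quoted delimiter: the body is sent verbatim, so $HOME expands remotely
ssh -t jane@stackoverflow.com bash << 'EOF'
echo "$HOME"
EOF

# Unquoted delimiter: your local shell expands $HOME before ssh ever runs
ssh -t jane@stackoverflow.com bash << EOF
echo "$HOME"
EOF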
Instead of the shell, use some scripting language (Perl, Python, Ruby, etc.) and some module that takes care of the ugly work. For example:
#!/usr/bin/perl
use Net::OpenSSH;
my $ssh = Net::OpenSSH->new($host, user => $user);
$ssh->system('echo', 'Net::Open$$H', 'Quot%$', 'Th|s', '>For', 'You!');
$ssh->system({stdout_file => '/tmp/ls.out'}, 'ls');
$ssh->scp_put($local_path, $remote_path);
my $out = $ssh->capture("find /etc");
From here: Can I ssh somewhere, run some commands, and then leave myself a prompt?
The use of an expect script seems pretty straightforward... Copied from the above link for convenience, not mine, but I found it very useful.
#!/usr/bin/expect -f
spawn ssh $argv
send "export V=hello\n"
send "export W=world\n"
send "echo \$V \$W\n"
interact
I'm guessing a line like
send "scp -Cpvr someLocalFileOrDirectory you#10.10.10.10/home/you
would get you your files back...
and then:
send "exit"
would terminate the session - or you could end with interact and type in the exit yourself..

pager (less) -- get current scroll position?

I am scripting the display of the output of a script (well, it is just the program git diff) with tmux: once a filesystem change is detected, the shell script executes tmux send-keys q enter C-l "git diff" enter, which effectively refreshes the git diff view.
You might consider this similar to functionality provided by iTerm's coprocesses.
The problem is, I want it to scroll back on refresh to the same position it was in.
One of the reasons for using tmux is that the window is actually a totally normal and interactive terminal session that can be interacted with as normal to scroll around to look at the full output.
But I want to obtain the scroll position somehow.
Suppose I want to actually do computation on the text content of the terminal window itself, exactly like iTerm2's coprocess does, but so that I can use it on Linux (over ssh). Does tmux provide this ability?
I'm unsure about capturing this with a script, but less -N will show line numbers.
And -jn or --jump-target=n can jump to a location.
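I don't know of a way to read the current position back out of less, but if you had saved a line number somewhere, reopening at that spot is straightforward; a hedged sketch ($saved_line is a hypothetical variable):
# -N shows line numbers; the +<n>g command jumps to line n on startup
git diff | less -N "+${saved_line}g"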
About iTerm's coprocesses: tmux has a command, pipe-pane, that can be used to connect the input and output of a shell command to the output and input of a target pane specified by -t.
So if I have a shell program, ~/script.sh for example:
#!/usr/bin/env bash
while IFS= read -r line; do
    if [[ "$line" = "are you there?"* ]]; then
        echo "echo yes"
    fi
done
Note that read line will read every line printed to the pane, i.e. the prompt as well.
I can connect its stdin and stdout to pane 0 in my-session:my-window like so:
tmux pipe-pane -IO -t my-session:my-window.0 "~/script.sh"
and then at the prompt, type echo are you there?, and it responds:
$ echo are you there?
are you there?
$ echo yes
yes
Be careful using -I and -O together, as it can easily cause a feedback loop (I had to kill the tmux server a few times while experimenting with this feature, lol).
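If you only need to read what the pane prints (with no feedback risk), -O on its own pipes just the pane's output; a hedged sketch:
# Log everything the pane outputs; nothing is fed back in as input
tmux pipe-pane -O -t my-session:my-window.0 'cat >> ~/pane-output.log'
# Running pipe-pane again with no command turns the pipe off
tmux pipe-pane -t my-session:my-window.0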
