Wildcard that executes command once for each match - bash

Alternate title: How to loop without a loop or xargs.
Recently, I switched to zsh because of its many features. I'm curious: is there a feature which expands wildcards such that the command is executed once for each match, instead of only once for all matches at once?
Example
The command ebook-convert input_file output_file [options] accepts just one input file. When I want to convert multiple files, I have to execute the command multiple times manually or use a loop, for instance:
for i in *.epub; do
ebook-convert "$i" .mobi
done
What I'd like is a wildcard that functions like the loop so that I can save a few keystrokes. Let said wildcard be ⁂. The command
ebook-convert ⁂.epub .mobi
should expand to
ebook-convert 1stMatch.epub .mobi
ebook-convert 2ndMatch.epub .mobi
ebook-convert 3rdMatch.epub .mobi
...
Still interested in other answers
I accepted an answer that works for me (thanks to Grisha Levit). But if you know other shells with such a feature, alternative commands which are shorter than writing a loop, or even a way to extend zsh with the desired wildcard, your answers are appreciated.

so that I can save a few keystrokes
OK, so let's say you typed out
ebook-convert *.epub .mobi
…and now you realized that this isn't going to work — you need to write a loop. What would you normally do? Probably something like:
add ; done to the end of the line
hit CtrlA to go to the beginning of the line
type for i in…
etc…
This looks like a good fit for a readline keyboard macro.
Let's write out the steps in terms of readline commands and regular keypresses:
end-of-line # (start from the end for consistency)
; done # type in the loop closing statement
character-search-backward * # go back to where the glob is
shell-backward-word # (in case the glob is in the mid-word)
shell-kill-word # "cut" the word with the glob
"$i" # type the loop variable
beginning-of-line # go back to the start of the line
for i in # type the beginning of the loop opening
yank # "paste" the word with the glob
; do # type the end of the loop opening
Creating the binding:
For any readline command used above that does not have a key-binding, we need to create one. We also need to create a binding for the new macro that we are creating.
Unless you've already done a lot of readline customization, running the commands below will set the bindings up for the current shell. This uses default bindings like \C-e ➙ end-of-line.
bind '"\eB": shell-backward-word'
bind '"\eD": shell-kill-word'
bind '"\C-i": "\C-e; done\e\C-]*\eB\eD \"$i\"\C-afor i in\C-y; do "'
The bindings can also go into the inputrc file for persistence.
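In inputrc syntax, that could look roughly like this (the same macro text as above, wrapped in a conditional so it only applies to bash):
$if Bash
"\eB": shell-backward-word
"\eD": shell-kill-word
"\C-i": "\C-e; done\e\C-]*\eB\eD \"$i\"\C-afor i in\C-y; do "
$endif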
Using the shortcut:
After setting things up:
Type in something like
ebook-convert *.epub .mobi
Press CtrlI
The line will transform into
for i in *.epub; do ebook-convert "$i" .mobi; done
If you want to run the command right away, you can modify the macro to append a \C-j as the last keypress, which will trigger accept-line (same as hitting Return).
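For instance, the macro binding above would become:
bind '"\C-i": "\C-e; done\e\C-]*\eB\eD \"$i\"\C-afor i in\C-y; do \C-j"'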

You could check out zargs in zsh.
This function has a similar purpose to GNU xargs. Instead of reading lines of arguments from the standard input, it takes them from the command line
zshcontrib(1): OTHER FUNCTIONS, zargs
So, we could write:
autoload -Uz zargs
zargs -I⁂ -- *.epub -- ebook-convert ⁂ .mobi
PS: you may find zmv handy if you need to capture some portions of patterns for building commands.
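A sketch of that idea (zmv invokes the given program as program -- oldname newname, so the program has to tolerate a leading --; the -n flag makes it a dry run that only prints the commands it would execute):
autoload -Uz zmv
zmv -n -p ebook-convert '(*).epub' '$1.mobi'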

The for loop has a shortened form that you might like:
for f (*.epub) ebook-convert $f .mobi

You could make yourself a script that does this :
#!/bin/bash
command="$1"
shift
if
[[ $# -lt 3 ]]
then
echo "Usage: command file/blog arg1, arg2..."
exit 1
fi
declare -a files=()
while [ "$1" != "--" ]
do
[ "$1" ] || continue
files+=("$1")
shift
done
if
[ "$1" != "--" ]
then
echo "Separator not found : end file list with --"
exit 1
fi
shift
for file in "${files[@]}"
do
"$command" "$file" "$@"
done
You can call it like this (assuming the script is called apply_to):
apply_to command /dir/* -- arg1 arg2 ...
EDIT
I modified the code to insert each filename right after the command, before the other arguments.
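For the original question's use case, a call might look like this (assuming the script is saved as apply_to somewhere on your PATH and made executable):
apply_to ebook-convert *.epub -- .mobi
# runs: ebook-convert 1stMatch.epub .mobi, ebook-convert 2ndMatch.epub .mobi, ...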

Related

Bash command completion with full path expansion injected into history for vim

i've spent a solid week searching online and trying many different ways to solve a tricky problem. basically i would like to use vim to edit custom commands / scripts that are in my $PATH without having to actually cd to their given directories first or manually type their full paths on the command line.
in essence, i'd love to be able to combine stock bash command completion (compgen -c) with simultaneous path expansion when specifying scripts in my $PATH as vim FILE ARGUMENTS. btw i'm using the caps to make clear what can be a tricky subject and not shouting.
it's probably easier to show you what i'm trying to do than explain it. let's say i have scripts in directories that are on my $PATH
~/bin/x/y/cmd1.sh
~/bin/a/b/cmd2.sh
/ppp/n/m/cmd3.sh
sometimes these scripts provide functionality on files that exist in other directories so i'd like to be able to edit them easily from anywhere in the file system. sometimes i just want to be able to edit those scripts from other directories because it's more convenient. lets say i'm currently in the following directory.
/completely/different/dir
but now i need to vim edit
~/bin/a/b/cmd2.sh
my options to achieve this solely with default bash functionality is to do one of the following which takes a long time
cd ~/bin/a/b/; vim cmd2.sh
vim ~/<tab-complete-my-way-to-file>
open a new terminal window plus some combination of the above
since i know the names of my custom scripts it would be soooo much easier to just do the following which requires no tab completion of the full path to the file or directory as well as no cd'ing to a different directory to change my context!!!
vim cmd2.sh
but this won't work by default b/c vim needs the full path to the script
my first thought was to write a vim wrapper function which basically uses which to do the $PATH expansion for me and then tie bash command completion to my vc function like this:
vc () { vim $(which "$@"); }
complete -c vc
i can run the following in the shell to complete partial script names that start with "c" from the choices of cmd1.sh, cmd2.sh, cmd3.sh
vc c<tab>
until i get what i want here which is great
vc cmd2.sh
when i hit enter and execute the command it all works fine BUT it doesn't inject the expanded path into the READLINE command line and thus the FULL EXPANDED PATH of 'cmd2.sh' never winds up in my command history! my history will show this
vc cmd2.sh
instead of
vc ~/bin/a/b/cmd2.sh
or
vim ~/bin/a/b/cmd2.sh
i want that expanded path in my command history because it makes future operations on that script file super easy when reusing command history. ie i can ls, file, diff, mv, cp that expanded path much easier reusing history than writing more wrapper scripts for ls, file, diff, mv, cp etc.. like i had to do with vc above.
QUESTIONS :
OPTION 1
is there a way to reinject the full expanded path provided by which in my vc function directly back into the original vc READLINE or just inject the entire "vim " command that actually gets executed in vc as a replacement for the original vc command into READLINE? any method that allows me to get the expanded vim command into the history even if it is in addition to the original vc command is ok by me.
basically how do you access and edit the current READLINE programmatically in bash?
OPTION 2
note i can also do something like this DIRECTLY on the command line in real-time
vim $(which cmd2.sh) C-x-e
which gives me what i want (it expands the path which will then put it into history) but i have to always type the extra subshell and which text as well as the C-x-e (to expand the line) on every iteration of the command while losing the command completion functionality which basically makes this useless. put another way, is there any way to automate the above using a bind key so that
vc cmd2.sh
is automatically transformed first into
vim $(which cmd2.sh)
and then automatically follows up with C-x-e so that it gets expanded to
vim ~/bin/a/b/cmd2.sh
but have all the editing movement, text insertion and final command line expansion happen all in the same bindkey macro? this might be the best solution of all.
OPTION 3
alternatively, since bash command completion automatically winds up in the READLINE and thus the history, a custom completion function would solve my problem. is there a way to make vc use a completion function that would BOTH complete commands in $PATH when used as vim arguments as described above AND ALSO SIMULTANEOUSLY EXPAND THEM TO THEIR FULL PATHS?
i know how to write a basic completion function. countless hours of attempts (which i am choosing not to put here to keep confusion / post length down) are failing for the simple reason that i'm not sure command completion is compatible with simultaneous full path expansion b/c it breaks traditional completion.
with a custom completion function, here's what happens when i try to find one of my scripts "cmd2.sh" living in "vim ~/bin/a/b/cmd2.sh" but start with a "c" and hit <tab>.
vim c<tab>
instead of getting me these completions to choose from
cmd1.sh
cmd2.sh
cmd3.sh
it completes the first one it finds in the $PATH and inserts it into the READLINE which might be
/ppp/n/m/cmd3.sh
when i really want
~/bin/a/b/cmd2.sh
this effectively kills the completion lookup because the word before my cursor in the READLINE now starts with /ppp/n/m/cmd3.sh and there's no way of getting back to cmd2.sh
i hope that's clear.
thanks
This requires some boilerplate in your .bashrc file, but might work for you. It makes use of the directory stack (some might say it abuses the directory stack, but if you aren't using it for anything else, it might be OK).
In your .bashrc, add each directory of interest to your directory stack. End the list with your home directory, as pushd also changes your current working directory.
pushd ~/bin/x/y
pushd ~/bin/a/b
pushd /ppp/n/m
pushd ~
Yes, it duplicates your PATH entry a bit, but I contend you don't really need access to every directory in your PATH, just the ones where you have files you intend to edit. (Are you really going to try to edit anything in /bin or /usr/bin?)
Now, in your interactive shell, you can run dirs -v to see, along with its index, the directories in your stack:
$ dirs -v
0 ~
1 /ppp/n/m
2 ~/bin/a/b
3 ~/bin/x/y
4 ~
Now, no matter where you are, if you want to edit ~/bin/x/y/cmd1.sh, you can use
$ vi ~3/cmd1.sh
As long as you don't use popd or pushd elsewhere to modify the stack, the indices will stay the same. (Using pushd will add a new directory to the top of the stack, increasing each index; popd will decrease each index after it removes the top directory.)
A much simpler approach is to define some variables whose values are the desired directories:
binab=~/bin/a/b
binxy=~/bin/x/y
ppp=/ppp/n/m
and simply expand them
$ vi $ppp/cmd3.sh
The shell performs parameter name completion, so the variable names don't have to be particularly short, but the dirstack approach guarantees you only need 2 or 3 characters. (Also, it doesn't pollute the global namespace with additional variables.)
Interestingly, I found myself wanting to do something similar a while back and hacked together the following bash script. It's pretty self-explanatory. If I want to edit one of my scripts (this one, for example, is ~/bin/vm), I just run vm vm. I can open several files in my path, either in buffers, or in vertical/horizontal splits, etc.
Do with it what you like; I'm pasting it here because it's ready to use:
#!/usr/bin/env bash
Usage() {
cat <<-__EOF_
${0##*/} Opens scripts in PATH from any location (vim -O)
Example: ${0##*/} ${0##*/}
opens this script in vim editor
-o: Change default behaviour (vim -O) to -o
-b: Change default behaviour to open in buffers (vim file1 file2)
-h: Display this message
__EOF_
}
flag="O"
vimopen() {
local wrapped
local located
local found
found=false
[ $# -lt 1 ] && echo "No script given" && return
wrapped=""
for arg in "$@"; do
if located=$(which "${arg}" 2> /dev/null); then
found=true
wrapped="${wrapped} ${located}"
else
echo "${arg} not found!"
fi
done
$found || return
# We WANT word splitting to occur here
# shellcheck disable=SC2086
case ${flag} in
O)
vim $wrapped -O
;;
o)
vim $wrapped -o
;;
*)
vim $wrapped
esac
}
while getopts :boh f; do
case $f in
h)
Usage
exit 0
;;
o)
flag="o"
shift
;;
b)
flag=""
shift
;;
*)
echo "Unknown option ${f}-${OPTARG}"
Usage
exit 1
;;
esac
done
vimopen "$@"
Let me share something that answers the OPTION 3 part of your question:
Behavior of this solution
The solutions I will show offer up command basenames (i.e. what compgen -c ${cur} returns, where cur is the last word on the command line) until there is only one candidate, at which point it is replaced by the full path of the command.
$ vc c<TAB><TAB>
Display all 216 possibilities? (y or n)
$ vc cm<TAB>
cmake cmake-gui cmcprompt cmd1.sh cmd2.sh cmd3.sh cmp cmpdylib cmuwmtopbm
$ vc cmd<TAB>
cmd1.sh cmd2.sh cmd3.sh
$ vc cmd1<TAB>
$ vc /Users/pcarphin/vc/bin/cmd1.sh
which I think is what you want.
And for your vc function, you can still do
vc(){
vim "$(which "${1}")
}
since which /Users/pcarphin/vc/bin/cmd3.sh returns /Users/pcarphin/vc/bin/cmd3.sh, it will work whether you do vc cmd3.sh<ENTER> or vc cmd3.sh<TAB><ENTER>
Basic solution
So here it is: it's as simple as using compgen -c to get command basename candidates, checking whether there is only a single candidate, and if so, replacing it with the full path.
_vc(){
local cur prev words cword
_init_completion || return;
COMPREPLY=( $(compgen -c ${cur}) )
#
# If there is only one candidate for completion, replace it with the
# full path returned by which.
#
if ((${#COMPREPLY[@]} == 1)) ; then
COMPREPLY[0]=$(which ${COMPREPLY[0]})
fi
}
complete -F _vc vc
Solution that filters out shell functions
The compgen -c command will include the names of shell functions and if you want to leave those out (maybe because your vc function would fail which would be inelegant for an argument supplied by a completion function), here is what you can do:
_vc(){
local cur prev words cword
_init_completion || return;
local candidates=($(compgen -c ${cur}))
#
# Put in COMPREPLY only the command names that are files in PATH
# and leave out shell functions
#
local i=0
for cmd in "${candidates[@]}" ; do
if which "$cmd" >/dev/null 2>&1 ; then
COMPREPLY[i++]=${cmd}
fi
done
#
# If there is only one candidate for completion, replace it with the
# full path returned by which.
#
if ((${#COMPREPLY[@]} == 1)) ; then
COMPREPLY[0]=$(which ${COMPREPLY[0]})
fi
}
Solution that handles shell functions
If we want to handle shell functions, then we can get rid of the part that filters them out and enhance the part that replaces the command name by a full path when COMPREPLY contains only one candidate. This is based on turning on extdebug which causes declare -F shell_function to output the file where shell_function was defined:
cmd_location(){
local location
if location=$(which "${1}" 2>/dev/null) ; then
echo "${location}"
else
# If extdebug is off, remember that and turn it on
local user_has_extdebug
if ! shopt -q extdebug ; then
user_has_extdebug=no
shopt -s extdebug
fi
info=$(declare -F "${1}")
if [[ -n "${info}" ]] ; then
echo ${info} | cut -d ' ' -f 3
fi
# Turn extdebug back off if it was off before
if [[ "${user_has_extdebug}" == no ]] ; then
shopt -u extdebug
fi
fi
}
_vc(){
local cur prev words cword
_init_completion || return;
COMPREPLY=( $(compgen -c ${cur}) )
if ((${#COMPREPLY[@]} == 1)) ; then
COMPREPLY[0]=$(cmd_location ${COMPREPLY[0]})
fi
}
And in this case, your vc function would need the same kind of logic or you could just remember to always use the shell completion to end up calling it with a full path.
That's why I factored out the cmd_location function:
vc(){
if [[ "${1}" == /* ]] ; then
vim "${1}"
else
vim $(cmd_location "${1}")
fi
}
I was looking for something else, but I found this question, which inspired me to do this for myself, so thank you! Now I'll have a neat vc function with a cool completion function. Personally, I'm going to use the last version, which handles shell functions.
The declare -F command with extdebug prints out the function name, the line number, and the file, so I'll see if I can adapt the solution so that in the case of shell functions, it opens the file at the location.
For that, I'd have to get rid of the part that puts a full path on the command line. So what I'm going to do for myself won't be an answer to your question. Note the use of parentheses for open_shell_function which makes it run in a subshell so I don't have to do the whole thing with user_has_extdebug.
open_shell_function()(
# Use subshell so as not to turn on extdebug in the user's shell
# and avoid doing this remembering stuff
shopt -s extdebug
local info=$(declare -F ${1})
if [[ -z "${info}" ]] ; then
echo "No info from 'declare -F' for '${1}'"
return 1
fi
local lineno
if ! lineno=$(echo ${info} | cut -d ' ' -f 2) ; then
echo "Error getting line number from info '${info}' on '${1}'"
return 1
fi
local file
if ! file=$(echo ${info} | cut -d ' ' -f 3) ; then
echo "Error getting filename from info '${info}' on '${1}'"
return 1
fi
vim ${file} +${lineno}
)
vc(){
local file
if file=$(which ${1} 2>/dev/null) ; then
vim ${file}
else
echo "no '${1}' found in path, looking for shell function"
open_shell_function "${1}"
fi
}
complete -c vc

How to open a file in a specific application from FZF

I'd like to use FZF to search for files and then have them open in an editor of my choice, e.g. Sublime or Atom. I'm not sure how to configure my shell for this; I've tried the below, but I can't get it to work.
Can you help?
Thanks!
fe() {
local files
IFS=$'\n' files=($(fzf-tmux --query="$1" --multi --select-1 --exit-0))
[[ -n "$files" ]] && ${EDITOR:-atom} "${files[#]}"
}
Based on your comments, it is possible the only problem comes from this part :
${EDITOR:-atom}
This expands to the content of the variable EDITOR if it has a non-null value, and to atom if it is null or unset. It is likely you have that variable initialized to something other than atom. Try simply using atom instead, like this:
fe() {
local files
IFS=$'\n' files=($(fzf-tmux --query="$1" --multi --select-1 --exit-0))
[[ -n "$files" ]] && atom "${files[#]}"
}
Of course, you can also keep the function as it already is, but make sure your environment contains something like EDITOR=atom.
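If you go the EDITOR route, a single line in your shell startup file is enough (assuming atom is on your PATH):
# in ~/.bashrc (or ~/.zshrc)
export EDITOR=atom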
I wrote a function that I keep in my .bashrc. You can use it to select any files through fzf and have them passed to whatever program you want (so not only Sublime, but any GUI program you add to the function's list), and it also works with command-line tools like cd, cat, tail, head and so on. You can also cycle back through your history and find the command as it was expanded after fzf did its thing. If you configure fzf to look in many common places on the file system by default (or see here), this function really shines. I use it many times every day, mainly to change directory (f cd) or open files.
In your case you would just type in the terminal:
f sublime
and fzf would launch, after you select your file(s) sublime would open them.
I put the function below; I got inspiration for it here:
#!/bin/bash
# Run command/application and choose paths/files with fzf.
# Always return control of the terminal to user (e.g. when opening GUIs).
# The full command that was used will appear in your history just like any
# other (N.B. to achieve this I write the shell's active history to
# ~/.bash_history)
#
# Usage:
# f cd [OPTION]... (hit enter, choose path)
# f cat [OPTION]... (hit enter, choose files)
# f vim [OPTION]... (hit enter, choose files)
# f vlc [OPTION]... (hit enter, choose files)
f() {
# if no arguments passed, just launch fzf
if [ $# -eq 0 ]
then
fzf | sort
return 0
fi
# store the program
program="$1"
# remove first argument off the list
shift
# store any option flags
options="$@"
# store the arguments from fzf
arguments=$(fzf --multi)
# if no arguments passed (e.g. if Esc pressed), return to terminal
if [ -z "${arguments}" ]; then
return 1
fi
# sanitise the command:
# put an extra single quote next to any pre-existing single quotes
# put single quotes around each argument
# put them all on one line.
for arg in "${arguments[@]}"; do
arguments=$(echo "$arg" | sed "s/'/''/g;
s/.*/'&'/g;
s/\n//g"
)
done
# if the program is on the GUI list, add a '&'
if [[ "$program" =~ ^(nautilus|zathura|evince|vlc|eog|kolourpaint)$ ]]; then
arguments="$arguments &"
fi
# write the shell's active history to ~/.bash_history.
history -w
# add the command with the sanitised arguments to .bash_history
echo $program $options $arguments >> ~/.bash_history
# reload the ~/.bash_history into the shell's active history
history -r
# execute the last command in history
fc -s -1
}
For those seeking a general way to open results in macOS:
Define this alias to open a file with the OS default app (based on the selected file type):
alias f='open "$(fzf)"'
Then type the f command, find your file, and hit ENTER.
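One caveat: if you cancel fzf (e.g. with Esc), open is still run with an empty argument. A small function sketch that guards against that:
# open the selection with the macOS default app, but only if something was selected
f() {
    local file
    file=$(fzf) && [ -n "$file" ] && open "$file"
}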

Passing more than one argument through to a command in a shell wrapper

I am trying to write a custom command to copy files into a specific directory.
I am not sure of the best way to do this. Right now, the script is this:
#!/bin/sh
cp -rf $1 /support/save/
I called this command filesave; it works great for one file, but if you pass *.sh or something similar it only copies the first file. This makes sense, as that is the point of $1. Is there an input variable that will collect all inputs, not just a specific one?
#!/bin/sh
cp -rf -- "$#" /support/save
Use "$#" to expand to your entire argument list. It is essential that this be placed in double-quotes, or else it will behave identically to $* (which is to say, incorrectly).
The -- is a widely implemented extension which ensures that all following arguments are treated as literal arguments rather than parsed as options, thus making filenames starting with - safe.
To demonstrate the difference, name the following script quotdemo.
#!/bin/sh
printf '$@: '; printf '<%s>\n' "$@"
printf '$*: '; printf '[%s]\n' $*
...and try running:
touch foo.txt bar.txt "file with spaces.txt" # create some matching files
quotdemo *.txt # ...then test this...
quotdemo "*.txt" # ...and this too!
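Assuming only the three files above match, the output looks roughly like this. Note how the unquoted $* re-splits "file with spaces.txt" in the first run and re-globs the literal pattern in the second:
$ quotdemo *.txt
$@: <bar.txt>
<file with spaces.txt>
<foo.txt>
$*: [bar.txt]
[file]
[with]
[spaces.txt]
[foo.txt]
$ quotdemo "*.txt"
$@: <*.txt>
$*: [bar.txt]
[file with spaces.txt]
[foo.txt]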

How to customise bash completion to pick only a custom set of commands?

Is there a way to control bash completion so it picks only a few commands instead of everything in the path, aliases and functions? We can set a default handler for an empty command line, but when the first letter is typed, bash completes it from PATH, aliases and functions. Is there a way to customise the completion for the command search?
Example:
$m[tab]
mycmd1 mycmd2 mycmd3
instead of the commands that match in PATH, aliases and functions.
The following should remove the "word" from the completion list. Completion functions just return a bash array and you can manipulate it to contain whatever you like:
_b() {
local word=${COMP_WORDS[COMP_CWORD]}
COMPREPLY=($(compgen -f -- "${word}"))
if [[ "$word" ]]; then
local w
local i=0
local n=${#COMPREPLY[*]}
while [[ $i -lt $n ]]
do
w=${COMPREPLY[$i]}
COMPREPLY[$i]="${w:${#word}}"
let i++
done
fi
}
I don't see any way to get a completion to work for all commands but I suppose you could always do something like this if you really need to:
for c in /bin/* /usr/bin/* ~/bin/*
do
complete -F _b $(basename $c)
done
Now, you can tweak the above code sections to get what you are trying to find (i.e. only some commands). Hint.
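If your bash is new enough (5.0 added complete -I for the initial word of the line), a sketch that addresses the original question more directly is to complete the command word itself from a fixed list only:
# complete the command name only from a custom set of commands
_mycmds() {
    COMPREPLY=($(compgen -W "mycmd1 mycmd2 mycmd3" -- "${COMP_WORDS[COMP_CWORD]}"))
}
complete -I -F _mycmds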

Programmatically dereference/resolve aliases in bash

I need to determine which command a shell alias resolves to in bash, programmatically; i.e., I need to write a bash function that will take a name potentially referring to an alias and return the "real" command it ultimately refers to, recursing through chains of aliases where applicable.
For example, given the following aliases:
alias dir='list -l'
alias list='ls'
where my function is dereference_alias,
dereference_alias list # returns "ls"
dereference_alias dir # also returns "ls"
Is there some builtin I don't know about that does this neatly, or shall I resign myself to scraping the output of alias?
Here's a version I wrote that does not rely on any external commands and also handles recursive aliases without creating an infinite loop:
# Arguments:
#
# $1 Command to compact using aliases
#
function command-to-alias()
{
local alias_key
local expansion
local guess
local command="$1"
local search_again="x"
local shortest_guess="$command"
while [[ "${search_again:-}" ]]; do
unset search_again
for alias_key in "${!BASH_ALIASES[@]}"; do
expansion="${BASH_ALIASES[$alias_key]}"
guess="${command/#"$expansion"/$alias_key}"
test "${#guess}" -lt "${#shortest_guess}" || continue
shortest_guess="$guess"
search_again="x"
done
command="$shortest_guess"
done
echo "$command"
}
Here's how I'm doing it, though I'm not sure it's the best way:
dereference_alias () {
# recursively expand alias, dropping arguments
# output == input if no alias matches
local p
local a="$1"
if [[ "alias" -eq $(type -t $a) ]] && p=$(alias "$a" 2>&-); then
dereference_alias $(sed -re "s/alias "$a"='(\S+).*'$/\1/" <<< "$p")
else
echo $a
fi
}
The major downsides here are that I rely on sed, and my means of dropping any arguments in the alias stops at the first space, expecting that no alias shall ever point to a program which, for some reason, has spaces in its name (i.e. alias badprogram='A\ Very\ Bad\ Program --some-argument'), which is a reasonable enough assumption, but still. I think that at least the whole sed part could be replaced by maybe something leveraging bash's own parsing/splitting/tokenization of command lines, but I wouldn't know where to begin.
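One way to avoid sed entirely is to read the expansions straight out of bash's BASH_ALIASES associative array. Here is a sketch (assuming bash 4+); like the version above it drops any arguments, and it also guards against alias cycles:
dereference_alias () {
    local name=$1 seen=" "
    while [[ -n ${BASH_ALIASES[$name]-} && $seen != *" $name "* ]]; do
        seen+="$name "
        # let the shell split the expansion and keep only its first word
        set -- ${BASH_ALIASES[$name]}
        name=$1
    done
    echo "$name"
}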
