Hidden features of Bash - bash

Shell scripts are often used as glue, for automation and simple one-off tasks. What are some of your favorite "hidden" features of the Bash shell/scripting language?
One feature per answer
Give an example and short description of the feature, not just a link to documentation
Label the feature with a bold title as the first line
See also:
Hidden features of C
Hidden features of C#
Hidden features of C++
Hidden features of Delphi
Hidden features of Python
Hidden features of Java
Hidden features of JavaScript
Hidden features of Ruby
Hidden features of PHP
Hidden features of Perl
Hidden features of VB.Net

Insert the preceding line's final parameter
Alt-. is the most useful key combination ever; try it and see. For some reason no one knows about this one.
Press it again and again to select older last parameters.
Great when you want to do something else to something you used just a moment ago.

If you want to keep a process running after you log out:
disown -h <pid>
is a useful bash built-in. Unlike nohup, you can run disown on an already-running process.
First, stop your job with control-Z, get the pid from ps (or use echo $!), use bg to send it to the background, then use disown with the -h flag.
Don't forget to background your job or it will be killed when you logout.
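For example, a minimal walkthrough of that sequence, using sleep as a stand-in for a real job:
$ sleep 10000            # long-running job in the foreground
^Z                       # Ctrl-Z stops it
[1]+  Stopped            sleep 10000
$ bg %1                  # resume it in the background
$ disown -h %1           # detach it from the shell; it now survives logout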

Almost everything listed under EXPANSION section in the manual
In particular, parameter expansion:
$ I=foobar
$ echo ${I/oo/aa} #replacement
faabar
$ echo ${I:1:2} #substring
oo
$ echo ${I%bar} #remove suffix
foo
$ echo ${I#foo} #remove prefix
bar
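A few more expansions from the same section, continuing with the variable above (the case modifier needs bash 4 or later):
$ echo ${#I}          #length
6
$ echo ${I:-default}  #default value if unset or empty
foobar
$ echo ${I^^}         #uppercase (bash 4+)
FOOBAR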

My favorite:
sudo !!
Rerun the previous command with sudo.

More magic key combinations:
Ctrl + r begins a “reverse incremental search” through your command history. As you continue to type, it retrieves the most recent command that contains all the text you enter.
Tab completes the word you've typed so far if it's unambiguous.
Tab Tab lists all completions for the word you've typed so far.
Alt + * inserts all possible completions, which is particularly helpful, say, if you've just entered a potentially destructive command with wildcards:
rm -r source/d*.c Alt + *
rm -r source/delete_me.c source/do_not_delete_me.c
Ctrl + Alt + e performs alias, history, and shell expansion on the current line. In other words, the current line is redisplayed as it will be processed by the shell:
ls $HOME/tmp Ctrl + Alt + e
ls -N --color=tty -T 0 /home/cramey

Get back history commands and arguments
It's possible to selectively access previous commands and arguments using the ! operator. It's very useful when you are working with long paths.
You can check your last commands with history.
You can re-run previous commands with !<n>, where n is the index of the command in history; negative numbers count backwards from the last command in history.
ls -l foo bar
touch foo bar
!-2
You can use previous arguments with !:<n>, where index 0 is the command itself and indices >= 1 are its arguments.
ls -l foo
touch !:2
cp !:1 bar
And you can combine both with !<n>:<m>
touch foo bar
ls -l !:1 !:2
rm !-2:1 !-2:2
!-2
You can also use argument ranges !<n>:<x>-<y>
touch boo far
ls -l !:1-2
Other ! special modifiers are:
* for all the arguments
ls -l foo bar
ls !*
^ for the first argument (!:1 == !^)
$ for the last argument
ls -l foo bar
cat !$ > /dev/null

I like the -x feature, which lets you see what's going on in your script.
bash -x script.sh
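You can also turn tracing on for just part of a script with set -x / set +x, for example:
#!/bin/bash
set -x        # start printing each command before it runs
cp "$1" "$1.bak"
set +x        # stop tracing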

SECONDS=0; sleep 5 ; echo "that took approximately $SECONDS seconds"
SECONDS
Each time this parameter is referenced, the number of seconds since shell invocation is returned. If a value is assigned to SECONDS, the value returned upon subsequent references is the number of seconds since the assignment plus the value assigned. If SECONDS is unset, it loses its special properties, even if it is subsequently reset.

Here is one of my favorites. This sets tab completion to not be case sensitive. It's really great for quickly typing directory paths, especially on a Mac where the file system is not case sensitive by default. I put this in .inputrc in my home folder.
set completion-ignore-case on

The special variable $RANDOM:
if [[ $(($RANDOM % 6)) = 0 ]]
then echo "BANG"
else
echo "Try again"
fi

Regular expression handling
Recent bash releases feature regular expression matching, so you can do:
if [[ "mystring" =~ REGEX ]] ; then
echo match
fi
where REGEX is a raw regular expression in the format described by man re_format.
Matches from any bracketed parts are stored in the BASH_REMATCH array, starting at element 1 (element 0 is the matched string in its entirety), so you can use this to do regex-powered parsing too.
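For example, here is a small sketch that pulls the pieces out of a version string (the variable names are just illustrative):
ver="bash-5.1.16"
if [[ $ver =~ ^([a-z]+)-([0-9]+)\.([0-9]+) ]] ; then
    echo "name=${BASH_REMATCH[1]} major=${BASH_REMATCH[2]} minor=${BASH_REMATCH[3]}"
fi
# prints: name=bash major=5 minor=1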

Ctrl-x Ctrl-e
This will load the current command into the editor defined in the variable VISUAL. This is really useful for long commands like some of those listed here.
To use vi as your editor:
export VISUAL=vi

Quick & Dirty correction of typos (especially useful for long commands over slow connections where using the command history and scrolling through it would be horrible):
$ cat /proc/cupinfo
cat: /proc/cupinfo: No such file or directory
$ ^cup^cpu
Also try !:s/old/new which substitutes old with new in the previous command once.
If you want to substitute many occurrences you can do a global substitution with !:gs/old/new.
You can use the gs and s commands with any history event, e.g.
!-2:s/old/new
To substitute old with new (once) in the second to last command.

Here two of my favorites:
To check the syntax w/o really executing the script use:
bash -n script.sh
Go back to the last directory (yes I know pushd and popd, but this is quicker)
cd -

Using Infix Boolean Operators
Consider the simple if:
if [ 2 -lt 3 ]
then echo "Numbers are still good!"
fi
That -lt looks kinda ugly. Not very modern. If you use double brackets around your boolean expression you can use the normal comparison operators!
if [[ 2 < 3 ]]
then echo "Numbers are still good!"
fi
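One caveat: inside [[ ]], < and > compare strings lexicographically, so [[ 10 < 9 ]] is true. For genuinely numeric comparisons, arithmetic evaluation with (( )) is the safer habit (a sketch of my own, not part of the original answer):
if (( 2 < 3 ))
then echo "Numbers are still good!"
fi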

Arrays:
#!/bin/bash
array[0]="a string"
array[1]="a string with spaces and \"quotation\" marks in it"
array[2]="a string with spaces, \"quotation marks\" and (parenthesis) in it"
echo "There are ${#array[*]} elements in the array."
for n in "${array[@]}"; do
echo "element = >>${n}<<"
done
More details on arrays (and other advanced bash scripting stuff) can be found in the Advanced Bash-Scripting Guide.
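A couple of related idioms, sketched here as an aside: += appends to an array, and "${!array[@]}" expands to its indices:
array+=("yet another string")
for i in "${!array[@]}"; do
    echo "element $i = >>${array[$i]}<<"
done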

Running a command before displaying the bash prompt
Set a command in the "PROMPT_COMMAND" env variable and it will be run automatically before each prompt.
Example:
[lsc@home]$ export PROMPT_COMMAND="date"
Fri Jun 5 15:19:18 BST 2009
[lsc@home]$ ls
file_a file_b file_c
Fri Jun 5 15:19:19 BST 2009
[lsc@home]$ ls
For the next April Fools' Day, add "export PROMPT_COMMAND=cd" to someone's .bashrc, then sit back and watch the confusion unfold.
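A more practical use (a common idiom, offered here as a sketch) is appending each command to the history file as soon as it runs, so other open shells can see it immediately:
export PROMPT_COMMAND="history -a"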

Magic key combinations from the bash man pages:
Ctrl + a and Ctrl + e move the cursor to the beginning and end of the current line, respectively.
Ctrl + t and Alt + t transpose the character and word before the cursor with the current one, then move the cursor forward.
Alt + u and Alt + l convert the current word (from the cursor to the end) to uppercase and lowercase.
Hint: Press Alt + - followed by either of these commands to convert the beginning of the current word.
Bonus man tips:
While viewing man pages, use / to search for text within the pages. Use n to jump ahead to the next match or N for the previous match.
Speed your search for a particular command or sub-section within the man pages by taking advantage of their formatting:
Instead of typing /history expansion to find that section, try /^history, using the caret (^) to find only lines that begin with "history."
Try / read, with a few leading spaces, to search for that builtin command. Builtins are always indented in the man pages.

export TMOUT=$((15*60))
Terminate bash after 15 minutes of idle time; set to 0 to disable. I usually put this in ~/.bashrc on my root accounts. It's handy when you're administering your boxes and might forget to log out before walking away from the terminal.

Undo
C-S-- (Control-Shift-Minus) undoes typing actions.
Kill / Yank
Any delete operation, such as C-w (delete previous word), C-k (delete to end of line), or C-u (delete to start of line), copies its deleted text to the kill ring. You can paste the most recent kill with C-y and cycle through (and paste from) the ring of deleted items with Alt-y.

You can ignore certain files while tab completing by setting the FIGNORE variable.
For example, if you have a Subversion repo and you want to navigate more easily, do
export FIGNORE=".svn"
Now you can cd without being blocked by .svn directories.
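FIGNORE is a colon-separated list, so you can ignore several suffixes at once, e.g.:
export FIGNORE=".svn:.o:~"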

Using arithmetic:
if [[ $((2+1)) = $((1+2)) ]]
then echo "still ok"
fi

Brace expansion
Standard expansion with {x,y,z}:
$ echo foo{bar,baz,blam}
foobar foobaz fooblam
$ cp program.py{,.bak} # very useful with cp and mv
Sequence expansion with {x..y}:
$ echo {a..z}
a b c d e f g h i j k l m n o p q r s t u v w x y z
$ echo {a..f}{0..3}
a0 a1 a2 a3 b0 b1 b2 b3 c0 c1 c2 c3 d0 d1 d2 d3 e0 e1 e2 e3 f0 f1 f2 f3
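Brace expansion also pairs nicely with mkdir -p, and bash 4+ accepts a step in sequence expansion:
$ mkdir -p project/{src,doc,test}
$ echo {0..10..2}
0 2 4 6 8 10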

I recently read Csh Programming Considered Harmful which contained this astounding gem:
Consider the pipeline:
A | B | C
You want to know the status of C, well, that's easy: it's in $?, or $status in csh. But if you want it from A, you're out of luck -- if you're in the csh, that is. In the Bourne shell, you can get it, although doing so is a bit tricky.
Here's something I had to do where I ran dd's stderr into a grep -v pipe to get rid of the records in/out noise, but had to return the dd's exit status, not the grep's:
device=/dev/rmt8
dd_noise='^[0-9]+\+[0-9]+ records (in|out)$'
exec 3>&1
status=`((dd if=$device ibs=64k 2>&1 1>&3 3>&- 4>&-; echo $? >&4) |
egrep -v "$dd_noise" 1>&2 3>&- 4>&-) 4>&1`
exit $status;
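In bash (as opposed to the plain Bourne shell), the PIPESTATUS array makes this far less painful. A minimal sketch, using a harmless dd purely for illustration:
dd if=/dev/zero of=/dev/null count=10 2>&1 | grep -v 'records'
echo "dd exited with ${PIPESTATUS[0]}, grep with ${PIPESTATUS[1]}"
PIPESTATUS has to be read right after the pipeline, before any other command overwrites it.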

Truncate content of a file (zeroing file)
> file
Specifically, this is very good for truncating log files, when the file is open by another process, which still may write to the file.

Not really a feature but rather a recommendation: I found many "hidden features", secrets and various bash usefulness at commandlinefu.com. Many of the highest-rated answers to this question I learned on that site :)

Another small one:
Alt+#
comments out the current line and moves it into the history buffer.
So when you're assembling a command line and you need to issue an interim command to e.g. find a file, you just hit alt+#, issue the other command, go up in the history, uncomment and proceed.

Braces in lieu of do and done in for loop
The body of a for loop usually goes between do and done (just an example):
for f in *;
do
ls "$f";
done
But we can use a C style using braces:
for f in *; {
ls "$f";
}
I think this looks better than do...done and I prefer it. I have not yet found this in any Bash documentation, so it really is a hidden feature.

C style numeric expressions:
let x="RANDOM%2**8"
echo -n "$x = 0b"
for ((i=7; i>=0; i--)); do
    let n="2**i"
    if (( (x&n) == n )); then echo -n "1"
    else echo -n "0"
    fi
done
echo ""

These properties are another one of my favorites.
export HISTCONTROL=erasedups
export HISTSIZE=1000
The first one makes sure bash doesn't log duplicate commands, which really improves history's usefulness. The other raises the history size to 1000 from the default of 500. I actually set this to 10000 on my machines.

Related

Bash command completion with full path expansion injected into history for vim

i've spent a solid week searching online and trying many different ways to solve a tricky problem. basically i would like to use vim to edit custom commands / scripts that are in my $PATH without having to actually cd to their given directories first or manually type their full paths on the command line.
in essence, i'd love to be able to combine stock bash command completion (compgen -c) with simultaneous path expansion when specifying scripts in my $PATH as vim FILE ARGUMENTS. btw i'm using the caps to make clear what can be a tricky subject and not shouting.
it's probably easier to show you what i'm trying to do than explain it. let's say i have scripts in directories that are on my $PATH
~/bin/x/y/cmd1.sh
~/bin/a/b/cmd2.sh
/ppp/n/m/cmd3.sh
sometimes these scripts provide functionality on files that exist in other directories so i'd like to be able to edit them easily from anywhere in the file system. sometimes i just want to be able to edit those scripts from other directories because it's more convenient. lets say i'm currently in the following directory.
/completely/different/dir
but now i need to vim edit
~/bin/a/b/cmd2.sh
my options to achieve this solely with default bash functionality is to do one of the following which takes a long time
cd ~/bin/a/b/; vim cmd2.sh
vim ~/<tab-complete-my-way-to-file>
open a new terminal window plus some combination of the above
since i know the names of my custom scripts it would be soooo much easier to just do the following which requires no tab completion of the full path to the file or directory as well as no cd'ing to a different directory to change my context!!!
vim cmd2.sh
but this won't work by default b/c vim needs the full path to the script
my first thought was to write a vim wrapper function which basically uses which to do the $PATH expansion for me and then tie bash command completion to my vc function like this:
vc () { vim $(which "$@"); }
complete -c vc
i can run the following in the shell to complete partial script names that start with "c" from the choices of cmd1.sh, cmd2.sh, cmd3.sh
vc c<tab>
until i get what i want here which is great
vc cmd2.sh
when i hit enter and execute the command it all works fine BUT it doesn't inject the expanded path into the READLINE command line and thus the FULL EXPANDED PATH of 'cmd2.sh' never winds up in my command history! my history will show this
vc cmd2.sh
instead of
vc ~/bin/a/b/cmd2.sh
or
vim ~/bin/a/b/cmd2.sh
i want that expanded path in my command history because it makes future operations on that script file super easy when reusing command history. ie i can ls, file, diff, mv, cp that expanded path much easier reusing history than writing more wrapper scripts for ls, file, diff, mv, cp etc.. like i had to do with vc above.
QUESTIONS :
OPTION 1
is there a way to reinject the full expanded path provided by which in my vc function directly back into the original vc READLINE or just inject the entire "vim " command that actually gets executed in vc as a replacement for the original vc command into READLINE? any method that allows me to get the expanded vim command into the history even if it is in addition to the original vc command is ok by me.
basically how do you access and edit the current READLINE programmatically in bash?
OPTION 2
note i can also do something like this DIRECTLY on the command line in real-time
vim $(which cmd2.sh) C-x-e
which gives me what i want (it expands the path which will then put it into history) but i have to always type the extra subshell and which text as well as the C-x-e (to expand the line) on every iteration of the command while losing the command completion functionality which basically makes this useless. put another way, is there anyway to automate the above using a bind key so that
vc cmd2.sh
is automatically transformed first into
vim $(which cmd2.sh)
and then automatically follows up with C-x-e so that it gets expanded to
vim ~/bin/a/b/cmd2.sh
but have all the editing movement, text insertion and final command line expansion happen all in the same bindkey macro? this might be the best solution of all.
OPTION 3
alternatively, since bash command completion automatically winds up in the READLINE and thus the history, a custom completion function would solve my problem. is there a way to make vc use a completion function that would BOTH complete commands in $PATH when used as vim arguments as described above AND ALSO SIMULTANEOUSLY EXPAND THEM TO THEIR FULL PATHS?
i know how to write a basic completion function. countless hours of attempts (which i am choosing not to put here to keep confusion / post length down) are failing for the simple reason that i'm not sure command completion is compatible with simultaneous full path expansion b/c it breaks traditional completion.
with a custom completion function, here's what happens when i try to find one of my scripts "cmd2.sh" (living at "~/bin/a/b/cmd2.sh") but start with a "c" and hit "<tab>".
vim c<tab>
instead of getting me these completions to choose from
cmd1.sh
cmd2.sh
cmd3.sh
it completes the first one it finds in the $PATH and inserts it into the READLINE which might be
/ppp/n/m/cmd3.sh
when i really want
~/bin/a/b/cmd2.sh
this effectively kills the completion lookup because the word before my cursor in the READLINE now starts with /ppp/n/m/cmd3.sh and there's no way of getting back to cmd2.sh
i hope that's clear.
thanks
This requires some boilerplate in your .bashrc file, but might work for you. It makes use of the directory stack (some might say it abuses the directory stack, but if you aren't using it for anything else, it might be OK).
In your .bashrc, add each directory of interest to your directory stack. End the list with your home directory, as pushd also changes your current working directory.
pushd ~/bin/x/y
pushd ~/bin/a/b
pushd /ppp/n/m
pushd ~
Yes, it duplicates your PATH entry a bit, but I contend you don't really need access to every directory in your PATH, just the ones where you have files you intend to edit. (Are you really going to try to edit anything in /bin or /usr/bin?)
Now, in your interactive shell, you can run dirs -v to see, along with its index, the directories in your stack:
$ dirs -v
0 ~
1 /ppp/n/m
2 ~/bin/a/b
3 ~/bin/x/y
4 ~
Now, no matter where you are, if you want to edit ~/bin/x/y/cmd1.sh, you can use
$ vi ~3/cmd1.sh
As long as you don't use popd or pushd elsewhere to modify the stack, the indices will stay the same. (Using pushd will add a new directory to the top of the stack, increasing each index; popd will decrease each index after it removes the top directory.)
A much simpler process would be to simply define some variables whose values are the desired directories:
binab=~/bin/a/b
binxy=~/bin/x/y
ppp=/ppp/n/m
and simply expand them
$ vi $ppp/cmd3.sh
The shell performs parameter name completion, so the variable names don't have to be particularly short, but the dirstack approach guarantees you only need 2 or 3 characters. (Also, it doesn't pollute the global namespace with additional variables.)
Interestingly, I've found myself wanting to do something similar a while back. I hacked together the following bash script. It's pretty self-explanatory. If I want to edit one of my scripts (this one, for example is ~/bin/vm), I just run vm vm. I can open several files in my path, either in buffers, or vertical/horizontal splits etc...
Do with it what you like, pasting it here because it's all ready to use:
#!/usr/bin/env bash
Usage() {
cat <<-__EOF_
${0##*/} Opens scripts in PATH from any location (vim -O)
Example: ${0##*/} ${0##*/}
opens this script in vim editor
-o: Change default behaviour (vim -O) to -o
-b: Change default behaviour to open in buffers (vim file1 file2)
-h: Display this message
__EOF_
}
flag="O"
vimopen() {
local wrapped
local located
local found
found=false
[ $# -lt 1 ] && echo "No script given" && return
wrapped=""
for arg in "$@"; do
if located=$(which "${arg}" 2> /dev/null); then
found=true
wrapped="${wrapped} ${located}"
else
echo "${arg} not found!"
fi
done
$found || return
# We WANT word splitting to occur here
# shellcheck disable=SC2086
case ${flag} in
O)
vim $wrapped -O
;;
o)
vim $wrapped -o
;;
*)
vim $wrapped
esac
}
while getopts :boh f; do
case $f in
h)
Usage
exit 0
;;
o)
flag="o"
shift
;;
b)
flag=""
shift
;;
*)
echo "Unknown option ${f}-${OPTARG}"
Usage
exit 1
;;
esac
done
vimopen "$@"
Let me share something that answers OPTION3 part of your answer:
Behavior of this solution
The solutions that I will show will offer up basenames of commands (i.e. what compgen -c ${cur} returns, where cur is the last word on the command line) until there is only one candidate, in which case it will be replaced by the full path of the command.
$ vc c<TAB><TAB>
Display all 216 possibilities? (y or n)
$ vc cm<TAB>
cmake cmake-gui cmcprompt cmd1.sh cmd2.sh cmd3.sh cmp cmpdylib cmuwmtopbm
$ vc cmd<TAB>
cmd1.sh cmd2.sh cmd3.sh
$ vc cmd1<TAB>
$ vc /Users/pcarphin/vc/bin/cmd1.sh
which I think is what you want.
And for your vc function, you can still do
vc(){
vim "$(which "${1}")
}
since which /Users/pcarphin/vc/bin/cmd3.sh returns /Users/pcarphin/vc/bin/cmd3.sh and so it will work whether you do vc cmd3.sh<ENTER> or if you do vc cmd3.sh<TAB><ENTER>
Basic solution
So here it is. It's as simple as using compgen -c to get command basename candidates, checking whether there is only a single candidate, and if so, replacing it with the full path.
_vc(){
local cur prev words cword
_init_completion || return;
COMPREPLY=( $(compgen -c ${cur}) )
#
# If there is only one candidate for completion, replace it with the
# full path returned by which.
#
if ((${#COMPREPLY[@]} == 1)) ; then
COMPREPLY[0]=$(which ${COMPREPLY[0]})
fi
}
complete -F _vc vc
Solution that filters out shell functions
The compgen -c command will include the names of shell functions. If you want to leave those out (maybe because your vc function would fail on them, which would be inelegant for an argument supplied by a completion function), here is what you can do:
_vc(){
local cur prev words cword
_init_completion || return;
local candidates=($(compgen -c ${cur}))
#
# Put in COMPREPLY only the command names that are files in PATH
# and leave out shell functions
#
local i=0
for cmd in "${candidates[@]}" ; do
if which "$cmd" >/dev/null 2>&1 ; then
COMPREPLY[i++]=${cmd}
fi
done
#
# If there is only one candidate for completion, replace it with the
# full path returned by which.
#
if ((${#COMPREPLY[@]} == 1)) ; then
COMPREPLY[0]=$(which ${COMPREPLY[0]})
fi
}
Solution that handles shell functions
If we want to handle shell functions, then we can get rid of the part that filters them out and enhance the part that replaces the command name by a full path when COMPREPLY contains only one candidate. This is based on turning on extdebug which causes declare -F shell_function to output the file where shell_function was defined:
cmd_location(){
local location
if location=$(which "${1}" 2>/dev/null) ; then
echo "${location}"
else
# If extdebug is off, remember that and turn it on
local user_has_extdebug
if ! shopt extdebug ; then
user_has_extdebug=no
shopt -s extdebug
fi
info=$(declare -F "${1}")
if [[ -n "${info}" ]] ; then
echo ${info} | cut -d ' ' -f 3
fi
# Turn extdebug back off if it was off before
if [[ "${user_has_extdebug}" == no ]] ; then
shopt -u extdebug
fi
fi
}
_vc(){
local cur prev words cword
_init_completion || return;
COMPREPLY=( $(compgen -c ${cur}) )
if ((${#COMPREPLY[@]} == 1)) ; then
COMPREPLY[0]=$(cmd_location ${COMPREPLY[0]})
fi
}
And in this case, your vc function would need the same kind of logic, or you could just remember to always use the shell completion so that it ends up being called with a full path.
That's why I factored out the cmd_location function:
vc(){
if [[ "${1}" == /* ]] ; then
vim "${1}"
else
vim $(cmd_location "${1}")
fi
}
I was looking for something else but I found this question, which inspired me to do this for myself, so thank you. Now I'll have a neat vc function with a cool completion function. Personally, I'm going to use the last version, which handles shell functions.
The declare -F command with extdebug prints out the function name, the line number, and the file, so I'll see if I can adapt the solution so that in the case of shell functions, it opens the file at the location.
For that, I'd have to get rid of the part that puts a full path on the command line. So what I'm going to do for myself won't be an answer to your question. Note the use of parentheses for open_shell_function which makes it run in a subshell so I don't have to do the whole thing with user_has_extdebug.
open_shell_function()(
# Use subshell so as not to turn on extdebug in the user's shell
# and avoid doing this remembering stuff
shopt -s extdebug
local info=$(declare -F ${1})
if [[ -z "${info}" ]] ; then
echo "No info from 'declare -F' for '${1}'"
return 1
fi
local lineno
if ! lineno=$(echo ${info} | cut -d ' ' -f 2) ; then
echo "Error getting line number from info '${info}' on '${1}'"
return 1
fi
local file
if ! file=$(echo ${info} | cut -d ' ' -f 3) ; then
echo "Error getting filename from info '${info}' on '${1}'"
return 1
fi
vim ${file} +${lineno}
)
vc(){
local file
if file=$(which ${1} 2>/dev/null) ; then
vim ${file}
else
echo "no '${1}' found in path, looking for shell function"
open_shell_function "${1}"
fi
}
complete -c vc

Can zsh or bash expand history expressions referring to directories?

For example, let's suppose I just copied something:
mv foo_file.txt ~/to/some/long/path/that/i/do/not/want/to/retype
and I'd like to use history substitution like so:
mv bar_file.txt !!:2
I'm surprised that zsh is not expanding the !!:2 for me when I hit [tab]. In a more complex reference to a historical argument I might really want the expansion before I hit return, just so I know with certainty that I referred to the correct argument. Is there any way to make it do that? (I would expect that to be the default behavior. Is it the default behavior, that I have somehow inadvertently disabled or broken?)
If zsh can't do it, can bash?
UPDATE: zsh will expand the history expression if it refers to a file, but not a directory:
mv foo_file.txt foo_bar_file.txt
mv bar_file.txt !!:2[TAB]
It will expand it if it is just an arbitrary string:
echo one two three four
echo !!:1[TAB]
But not if you're trying to move something to a directory. It looks more and more like this must be a bug.
I am using zsh in cygwin:
$ zsh --version
zsh 4.3.12 (i686-pc-cygwin)
$ setopt
interactive
monitor
shinstdin
zle
I just tried the following:
$ touch foo_file.txt bar_file.txt
$ mkdir -p ~/to/some/long/path/that/i/do/not/want/to/retype
$ mv foo_file.txt ~/to/some/long/path/that/i/do/not/want/to/retype
I then tried the tab completion mentioned above:
$ mv bar_file.txt !!:2[TAB]
and it worked fine, the last argument being expanded as follows:
$ mv bar_file.txt ~/to/some/long/path/that/i/do/not/want/to/retype
You can pseudo-hack it in bash:
$ shopt -s histreedit
$ shopt -s histverify
Then, to actually try an expansion:
$ echo !!:2 [now hit enter]
With histverify set, the expanded line is redisplayed for you to check and edit before it runs, instead of executing immediately.
You still can't do tab expansion of history designators in bash, though. Unequivocally no. That's because of the order in which bash expansion is processed.
Works perfectly for me with zsh 4.3.17. Sounds like you probably have a bug which might be worth reporting on the zsh-user mailing list. However there are at least five other keybindings which should accomplish what you want: C-x * which is by default bound to expand-word, and Esc Space or Meta-Space or Esc ! or Meta-! which are all bound to expand-history by default. (Meta means the Alt key for many people, although it depends on your terminal setup.)
Having said that, Esc . (or Meta-. or Alt-.) is a nicer way of retrieving the last word from the previous line in the history, since it provides instant visual feedback. You can also choose the last word from older lines by repeatedly pressing the keyboard shortcut, or even the nth-last word on a previous line by prefixing the shortcut with Alt-n (or Meta-n or Esc n). So for example to retrieve the penultimate word from the 3rd newest line of history, the sequence would be:
Meta-. (goes back one line of history, selecting the last word from that line)
Meta-. (goes back another, again selecting the last word)
Meta-2 Meta-. (goes back another, but this time selects the penultimate word from that line)
I've tried what you've described, and I don't think bash supports this either.
In bash, Alt-. is typically bound to yank-last-arg, which will give you what you want.
Here is a link to the whole list of history related commands that can be bound to keystrokes
http://www.gnu.org/software/bash/manual/bashref.html#Commands-For-History
For example,
ls /etc/passwd /etc/group
cat
# press Alt+. (Alt dot) here, bash adds /etc/group
cat /etc/group
# press space, then press Alt+1 Alt+.
# bash adds the first argument of the previous command /etc/passwd
cat /etc/group /etc/passwd

Best(?) way to make a popup menu for semi-portable shell scripts?

Basically I would love to say:
echo `grep ^foo /usr/share/dict/words | popup_menu`
...and have some type of keyboard navigable menu popup or selection tool, very similar to how vim's ":Explore" mechanism works.
Extreme bonus points for "easy and works pretty much everywhere with standard tools"
Also acceptable is "needs some sort of extra config file or 5-10 line shell script"
Less acceptable is "go download this perl library or 100 line python script, etc..." at that point, I would rather just try to find some actual program / package to install and list it as a hard dependency. But if you can come up with a 2-5 line perl / python script that doesn't require tracking down libraries that'd probably work too.
I have investigated:
Dialog - appears more geared towards "shell application" instead of ad-hoc scripting (looks like there might be a way to make it do what I want, though), drawback is that it overwrites the current screen state
Curses - seems like it targets "C" or would need to be used as part of a perl / python library, would have to write my own menu program using this
bash "select" builtin - works via number selection, not keyboard navigation, is a little awkward to use but fairly close
Vim - "grep ^foo /usr/share/dict/words | vim -" ... this gets you surprisingly close, just missing "bind the enter key to print current line to terminal and exit"
...so, how do I go about making or finding a decent, simple, ad-hoc menu maker for use in bash scripts and when I'm being lazy on the command line?
... git checkout -b `git branch -a | menu`
... ssh `grep foo /etc/hosts | menu`
... rm `ls | menu` # ignore obvious quoting issues with this...
Edit: thanks for the answers so far, but want to re-emphasize that I'm looking for ASCII / text menus (not xwindows). I'm trying a few things out locally but nothing is hitting the sweet spot yet.
zenity is very simple to use in scripts but it creates GUI windows. Not sure if it applicable for you...
You can use the --keep-tite option to dialog if your version has it and your terminal supports alternate screens. For example, xterm and some xterm-compatible terminals do.
With this option, output is switched to the alternate screen, the dialog is displayed and when it's closed, the screen switches back showing the previous contents. This is the way vim works, for example.
The dialog will still completely occupy the screen however.
The "ti" and "te" in "keep-tite" represent the termcap codes that are used to bracket a program that uses cursor motion. The corresponding terminfo codes are smcup and rmcup. See man 5 termcap, man 5 terminfo and man xterm (Other Features).
You could also do it yourself something like this:
tput smcup
# bash select menu
tput rmcup
After thorough investigation, the winner of best(?) way to make a popup menu is as follows:
select f in aaa bbb ccc ddd ; do echo $f ; break ; done
It isn't actually a popup menu per-se but you get the best bang for your buck as far as using standard unix-isms and it is pretty much universally available. Wrapping it in a simple shell script is easy to do wherever you are and means you can reliably integrate its benefits into your workflow.
$ cat ~/bin/menu.sh
#!/bin/sh
ALL=`cat`
select FOO in $ALL ; do echo $FOO ; break ; done
$ ls /usr | ~/bin/menu.sh
1) bin 3) include 5) lib64 7) sbin 9) src
2) games 4) lib 6) local 8) share
#? 2
games
In actuality though, you want to use the "select f in ..." idiom as a fallback for when the dialog command isn't available. The following shell / dialog script is kindof ugly but gets the job done as far as providing the same inputs and outputs as above but with a more comfortable user interface.
$ cat ~/bin/gui-menu.sh
#!/bin/sh
# get stdin
ALL=`cat`
# number the lines
SPLITTED=$( echo $ALL | sed 's/ /\n/g' | awk -- '{print NR, $0 }' )
# prompt via dialog (output-fd=1 is so that dialog gui doesn't go to subshell)
OUT=$( dialog --output-fd 1 --ok-label Select --menu Choose 0 50 22 $SPLITTED )
EXIT_CODE=$?
# handle escape / cancel buttons
if [ "1" = "$EXIT_CODE" ] ; then exit 1 ; fi
if [ "255" = "$EXIT_CODE" ] ; then exit 1 ; fi
# extract text corresponding to user's numeric selection
CHOSEN=$( echo $ALL | sed 's/ /\n/g' | awk -- "NR==$OUT {print \$0 }" )
# print result
echo $CHOSEN
...it is used exactly like "menu.sh" above but prompts with a GUI instead of numerically. It's relatively easy to expand the above to allow multiple checkbox selections with dialog (very inefficiently, probably n^2-ish in the implementation below), which is shown here:
$ cat ~/bin/gui-multiselect.sh
#!/bin/sh
# get stdin
ALL=`cat`
# number the lines
SPLITTED=$( echo $ALL | sed 's/ /\n/g' | awk -- '{print NR, $0, 0 }' )
# prompt via dialog (output-fd=1 is so that dialog gui doesn't go to subshell)
OUT=$(dialog --output-fd 1 --ok-label Select --separate-output --checklist Choose 0 50 22 $SPLITTED)
EXIT_CODE=$?
# handle escape / cancel buttons
if [ "1" = "$EXIT_CODE" ] ; then exit 1 ; fi
if [ "255" = "$EXIT_CODE" ] ; then exit 1 ; fi
# loop through selected numbers
for X in $OUT ; do
# inefficiently print out the text corresponding to the selections
CHOSEN=$( echo $ALL | sed 's/ /\n/g' | awk -- "NR==$X {print \$0 }" )
echo $CHOSEN
done;
And third place goes to Joey Hess's "vipe" interactive pipeline editor (from "moreutils" package), which lets you edit a pipeline and pass its output back out.
echo `ls | vipe`
The above command isn't quite a dialog box (can't just use up / down arrows and press enter, actually have to delete all the lines you don't want) but it is useful because it handles both interactive single and multi-select use cases and is just an all around interesting tool.
For GUI selection, zenity as referenced by Jack looks like a winner as far as ease of use compared to dialog ... dialog unfortunately doesn't "ad-hoc" very well but combining dialog with a "select f in ..." fallback is what best matches my needs.
If you know that your script will run on a specific distribution of linux then you can use the programs they've already developed for doing notification popups. I was looking for the same thing here: linux command line popup

Handle special characters in bash for...in loop

Suppose I've got a list of files
file1
"file 1"
file2
a for...in loop breaks it up between whitespace, not newlines:
for x in $( ls ); do
echo $x
done
results:
file
1
file1
file2
I want to execute a command on each file. "file" and "1" above are not actual files. How can I do that if the filenames contains things like spaces or commas?
It's a little trickier than I think find -print0 | xargs -0 could handle, because I actually want the command to be something like "convert input/file1.jpg .... output/file1.jpg", so I need to transform the filename in the process.
Actually, Mark's suggestion works fine without even doing anything to the internal field separator. The problem is that running ls in a subshell, whether via backticks or $( ), splits the output on whitespace, so the for loop can't distinguish spaces within a name from the separators between names. Simply using
for f in *
instead of the ls solves the problem.
#!/bin/bash
for f in *
do
echo "$f"
done
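Applied to the convert example from the question, that looks like this (a sketch; it assumes the JPEGs live in input/ and that output/ already exists):
#!/bin/bash
for f in input/*.jpg
do
    convert "$f" "output/$(basename "$f")"
done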
UPDATE BY OP: this answer sucks and shouldn't be on top ... @Jordan's post below should be the accepted answer.
one possible way:
ls -1 | while read x; do
echo $x
done
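If you need to recurse into subdirectories, or cope with filenames containing newlines, a null-delimited read is the robust variant of the same idea (my own sketch, reusing the convert example from the question; the subshell caveat discussed in the next answer applies here as well):
find input -name '*.jpg' -print0 | while IFS= read -r -d '' f; do
    convert "$f" "output/$(basename "$f")"
done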
I know this one is LONG past "answered", and with all due respect to eduffy, I came up with a better way and I thought I'd share it.
What's "wrong" with eduffy's answer isn't that it's wrong, but that it imposes what for me is a painful limitation: there's an implied creation of a subshell when the output of the ls is piped and this means that variables set inside the loop are lost after the loop exits. Thus, if you want to write some more sophisticated code, you have a pain in the buttocks to deal with.
My solution was to take the "readline" function and write a program out of it in which you can specify any specific line number that you may want that results from any given function call. ... As a simple example, starting with eduffy's:
ls_output=$(ls -1)
# The cut at the end of the following line removes any trailing new line character
declare -i line_count=$(echo "$ls_output" | wc -l | cut -d ' ' -f 1)
declare -i cur_line=1
while [ $cur_line -le $line_count ] ;
do
# NONE of the values in the variables inside this do loop are trapped here.
filename=$(echo "$ls_output" | readline -n $cur_line)
# Now filename contains a filename from the preceding ls command
cur_line=cur_line+1
done
Now you have wrapped up all the subshell activity into neat little contained packages and can go about your shell coding without having to worry about the scope of your variable values getting trapped in subshells.
I wrote my version of readline in GNU C if anyone wants a copy; it's a little big to post here, but maybe we can find a way...
Hope this helps,
RT

What is your single most favorite command-line trick using Bash? [closed]

We all know how to use <ctrl>-R to reverse search through history, but did you know you can use <ctrl>-S to forward search if you set stty stop ""? Also, have you ever tried running bind -p to see all of your keyboard shortcuts listed? There are over 455 on Mac OS X by default.
What is your single most favorite obscure trick, keyboard shortcut or shopt configuration using bash?
Renaming/moving files with suffixes quickly:
cp /home/foo/realllylongname.cpp{,-old}
This expands to:
cp /home/foo/realllylongname.cpp /home/foo/realllylongname.cpp-old
cd -
It's the command-line equivalent of the back button (takes you to the previous directory you were in).
Another favorite:
!!
Repeats your last command. Most useful in the form:
sudo !!
My favorite is '^string^string2' which takes the last command, replaces string with string2 and executes it
$ ehco foo bar baz
bash: ehco: command not found
$ ^ehco^echo
foo bar baz
Bash command line history guide
rename
Example:
$ ls
this_has_text_to_find_1.txt
this_has_text_to_find_2.txt
this_has_text_to_find_3.txt
this_has_text_to_find_4.txt
$ rename 's/text_to_find/been_renamed/' *.txt
$ ls
this_has_been_renamed_1.txt
this_has_been_renamed_2.txt
this_has_been_renamed_3.txt
this_has_been_renamed_4.txt
So useful
I'm a fan of the !$, !^ and !* expandos, returning, from the most recent submitted command line: the last item, first non-command item, and all non-command items. To wit (Note that the shell prints out the command first):
$ echo foo bar baz
foo bar baz
$ echo bang-dollar: !$ bang-hat: !^ bang-star: !*
echo bang-dollar: baz bang-hat: foo bang-star: foo bar baz
bang-dollar: baz bang-hat: foo bang-star: foo bar baz
This comes in handy when you, say ls filea fileb, and want to edit one of them: vi !$ or both of them: vimdiff !*. It can also be generalized to "the nth argument" like so:
$ echo foo bar baz
$ echo !:2
echo bar
bar
Finally, with pathnames, you can get at parts of the path by appending :h and :t to any of the above expandos:
$ ls /usr/bin/id
/usr/bin/id
$ echo Head: !$:h Tail: !$:t
echo Head: /usr/bin Tail: id
Head: /usr/bin Tail: id
When running commands, sometimes I'll want to run a command with the previous one's arguments. To do that, you can use this shortcut:
$ mkdir /tmp/new
$ cd !!:*
Occasionally, in lieu of using find, I'll break-out a one-line loop if I need to run a bunch of commands on a list of files.
for file in *.wav; do lame "$file" "$(basename "$file" .wav).mp3" ; done;
Configuring the command-line history options in my .bash_login (or .bashrc) is really useful. The following is a set of settings that I use on my MacBook Pro.
Setting the following makes bash erase duplicate commands in your history:
export HISTCONTROL="erasedups:ignoreboth"
I also jack my history size up pretty high too. Why not? It doesn't seem to slow anything down on today's microprocessors.
export HISTFILESIZE=500000
export HISTSIZE=100000
Another thing that I do is ignore some commands from my history. No need to remember the exit command.
export HISTIGNORE="&:[ ]*:exit"
You definitely want to set histappend. Otherwise, bash overwrites your history when you exit.
shopt -s histappend
Another option that I use is cmdhist. This lets you save multi-line commands to the history as one command.
shopt -s cmdhist
Finally, on Mac OS X (if you're not using vi mode), you'll want to reset <CTRL>-S from being scroll stop. This prevents bash from being able to interpret it as forward search.
stty stop ""
How to list only subdirectories in the current one ?
ls -d */
It's a simple trick, but you wouldn't believe how much time it took me to find that one!
ESC.
Inserts the last arguments from your last bash command. It comes in handy more than you think.
cp file /to/some/long/path
cd ESC.
Sure, you can "diff file1.txt file2.txt", but Bash supports process substitution, which allows you to diff the output of commands.
For example, let's say I want to make sure my script gives me the output I expect. I can just wrap my script in <( ) and feed it to diff to get a quick and dirty unit test:
$ cat myscript.sh
#!/bin/sh
echo -e "one\nthree"
$
$ ./myscript.sh
one
three
$
$ cat expected_output.txt
one
two
three
$
$ diff <(./myscript.sh) expected_output.txt
1a2
> two
$
As another example, let's say I want to check if two servers have the same list of RPMs installed. Rather than sshing to each server, writing each list of RPMs to separate files, and doing a diff on those files, I can just do the diff from my workstation:
$ diff <(ssh server1 'rpm -qa | sort') <(ssh server2 'rpm -qa | sort')
241c240
< kernel-2.6.18-92.1.6.el5
---
> kernel-2.6.18-92.el5
317d315
< libsmi-0.4.5-2.el5
727,728d724
< wireshark-0.99.7-1.el5
< wireshark-gnome-0.99.7-1.el5
$
There are more examples in the
Advanced Bash-Scripting Guide at http://tldp.org/LDP/abs/html/process-sub.html.
My favorite command is "ls -thor"
It summons the power of the gods to list the most recently modified files in a conveniently readable format.
More of a novelty, but it's clever...
Top 10 commands used:
$ history | awk '{print $2}' | awk 'BEGIN {FS="|"}{print $1}' | sort | uniq -c | sort -nr | head
Sample output:
242 git
83 rake
43 cd
33 ss
24 ls
15 rsg
11 cap
10 dig
9 ping
3 vi
^R reverse search. Hit ^R, type a fragment of a previous command you want to match, and hit ^R until you find the one you want. Then I don't have to remember recently used commands that are still in my history. Not exclusively bash, but also: ^E for end of line, ^A for beginning of line, ^U and ^K to delete before and after the cursor, respectively.
I often have aliases for vi, ls, etc., but sometimes you want to escape the alias. Just prefix the command with a backslash:
Eg:
$ alias vi=vim
$ # To escape the alias for vi:
$ \vi # This doesn't open VIM
Cool, isn't it?
Here's a couple of configuration tweaks:
~/.inputrc:
"\C-[[A": history-search-backward
"\C-[[B": history-search-forward
This works the same as ^R but using the arrow keys instead. This means I can type (e.g.) cd /media/ then hit up-arrow to go to the last thing I cd'd to inside the /media/ folder.
(I use Gnome Terminal, you may need to change the escape codes for other terminal emulators.)
Bash completion is also incredibly useful, but it's a far more subtle addition. In ~/.bashrc:
if [ -f /etc/bash_completion ]; then
. /etc/bash_completion
fi
This will enable per-program tab-completion (e.g. attempting tab completion when the command line starts with evince will only show files that evince can open, and it will also tab-complete command line options).
Works nicely with this also in ~/.inputrc:
set completion-ignore-case on
set show-all-if-ambiguous on
set show-all-if-unmodified on
I use the following a lot:
The :p modifier to print a history result. E.g.
!!:p
Will print the last command so you can check that it's correct before running it again. Just enter !! to execute it.
In a similar vein:
!?foo?:p
Will search your history for the most recent command that contained the string 'foo' and print it.
If you don't need to print,
!?foo
does the search and executes it straight away.
I have got a secret weapon: shell-fu.
There are thousand of smart tips, cool tricks and efficient recipes that most of the time fit on a single line.
One that I love (but I cheat a bit since I use the fact that Python is installed on most Unix systems now):
alias webshare='python -m SimpleHTTPServer'
Now every time you type "webshare", the current directory will be available through port 8000. Really nice when you want to share files with friends on a local network without a USB key or remote dir. Streaming video and music will work too. (On Python 3, the equivalent is python3 -m http.server.)
And of course the classic fork bomb that is completely useless but still a lot of fun :
$ :(){ :|:& };:
Don't try that in a production server...
You can use the watch command in conjunction with another command to look for changes. An example of this was when I was testing my router, and I wanted to get up-to-date numbers on stuff like signal-to-noise ratio, etc.
watch --interval=10 lynx -dump http://dslrouter/stats.html
type -a PROG
in order to find all the places where PROG is available, usually somewhere in ~/bin
rather than the one in /usr/bin/PROG that might have been expected.
I like to construct commands with echo and pipe them to the shell:
$ find dir -name \*~ | xargs echo rm
...
$ find dir -name \*~ | xargs echo rm | ksh -s
Why? Because it allows me to look at what's going to be done before I do it. That way if I have a horrible error (like removing my home directory), I can catch it before it happens. Obviously, this is most important for destructive or irrevocable actions.
When downloading a large file I quite often do:
while ls -la <filename>; do sleep 5; done
And then just ctrl+c when I'm done (or if ls returns non-zero). It's similar to the watch program but it uses the shell instead, so it works on platforms without watch.
Another useful tool is netcat, or nc. If you do:
nc -l -p 9100 > printjob.prn
Then you can set up a printer on another computer but instead use the IP address of the computer running netcat. When the print job is sent, it is received by the computer running netcat and dumped into printjob.prn.
pushd and popd almost always come in handy
One preferred way of navigating when I'm using multiple directories in widely separate places in a tree hierarchy is to use acd_func.sh (listed below). Once defined, you can do
cd --
to see a list of recent directories, with a numerical menu
cd -2
to go to the second-most recent directory.
Very easy to use, very handy.
Here's the code:
# do ". acd_func.sh"
# acd_func 1.0.5, 10-nov-2004
# petar marinov, http:/geocities.com/h2428, this is public domain
cd_func ()
{
local x2 the_new_dir adir index
local -i cnt
if [[ $1 == "--" ]]; then
dirs -v
return 0
fi
the_new_dir=$1
[[ -z $1 ]] && the_new_dir=$HOME
if [[ ${the_new_dir:0:1} == '-' ]]; then
#
# Extract dir N from dirs
index=${the_new_dir:1}
[[ -z $index ]] && index=1
adir=$(dirs +$index)
[[ -z $adir ]] && return 1
the_new_dir=$adir
fi
#
# '~' has to be substituted by ${HOME}
[[ ${the_new_dir:0:1} == '~' ]] && the_new_dir="${HOME}${the_new_dir:1}"
#
# Now change to the new dir and add to the top of the stack
pushd "${the_new_dir}" > /dev/null
[[ $? -ne 0 ]] && return 1
the_new_dir=$(pwd)
#
# Trim down everything beyond 11th entry
popd -n +11 2>/dev/null 1>/dev/null
#
# Remove any other occurrence of this dir, skipping the top of the stack
for ((cnt=1; cnt <= 10; cnt++)); do
x2=$(dirs +${cnt} 2>/dev/null)
[[ $? -ne 0 ]] && return 0
[[ ${x2:0:1} == '~' ]] && x2="${HOME}${x2:1}"
if [[ "${x2}" == "${the_new_dir}" ]]; then
popd -n +$cnt 2>/dev/null 1>/dev/null
cnt=cnt-1
fi
done
return 0
}
alias cd=cd_func
if [[ $BASH_VERSION > "2.05a" ]]; then
# ctrl+w shows the menu
bind -x "\"\C-w\":cd_func -- ;"
fi
Expand complicated lines before hitting the dreaded enter
Alt+Ctrl+e — shell-expand-line (may need to use Esc, Ctrl+e on your keyboard)
Ctrl+_ — undo
Ctrl+x, * — glob-expand-word
$ echo !$ !-2^ * Alt+Ctrl+e
$ echo aword someotherword * Ctrl+_
$ echo !$ !-2^ * Ctrl+x, *
$ echo !$ !-2^ LOG Makefile bar.c foo.h
&c.
I've always been partial to:
ctrl-E # move cursor to end of line
ctrl-A # move cursor to beginning of line
I also use shopt -s cdable_vars, then you can create bash variables to common directories. So, for my company's source tree, I create a bunch of variables like:
export Dcentmain="/var/localdata/p4ws/centaur/main/apps/core"
then I can change to that directory by cd Dcentmain.
pbcopy
This copies to the Mac system clipboard. You can pipe commands to it...try:
pwd | pbcopy
$ touch {1,2}.txt
$ ls [12].txt
1.txt 2.txt
$ rm !:1
rm [12].txt
$ history | tail -10
...
10007 touch {1,2}.txt
...
$ !10007
touch {1,2}.txt
$ for f in *.txt; do mv $f ${f/txt/doc}; done
Using 'set -o vi' from the command line, or better, in .bashrc, puts you in vi editing mode on the command line. You start in 'insert' mode so you can type and backspace as normal, but if you make a 'large' mistake you can hit the esc key and then use 'b' and 'f' to move around as you do in vi. cw to change a word. Particularly useful after you've brought up a history command that you want to change.
Similar to many above, my current favorite is the keystroke [alt]. (Alt and "." keys together). This is the same as !$ (inserts the last argument from the previous command) except that it's immediate and, for me, easier to type. (It just can't be used in scripts.)
eg:
mkdir -p /tmp/test/blah/oops/something
cd [alt].
String multiple commands together using the && operator:
./run.sh && tail -f log.txt
or
kill -9 1111 && ./start.sh
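The || operator is the complement: the right-hand command runs only if the left-hand one fails. You can also chain several steps so each runs only if the previous one succeeded (a small sketch):
./configure && make && make install
ping -c 1 example.com || echo "network seems down"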
