Recall all arguments from a previously executed command in bash shell

This link explains how to recall the arguments of the last successfully executed command on the command line.
I wanted to access all the arguments of a particular command from the shell history.
For me, only this syntax works:
ls f1 f2 f3
file !ls:1-3 --> file f1 f2 f3
And if I use !*, which should give all the arguments of the previous command, it throws an error:
file !ls:!*
-bash: !: unrecognized history modifier
I can only use this syntax, i.e. all arguments of the last executed command:
file !*
The problem with the above methods is that if I had executed ls with an option, e.g. ls -l, the file command would have produced different output, because the option of ls would be treated as the first argument in that case.

Try leaving out the second !:
$ ls foo bar baz
foo bar baz
$ echo !ls:*
echo foo bar baz
foo bar baz
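If the most recent ls carried an option, as in the question's ls -l example, you can start the word range at 2 to skip it; a sketch (the option counts as word 1, so 2* means words 2 through the last):
$ ls -l foo bar baz
(long-format listing omitted)
$ echo !ls:2*
echo foo bar baz
foo bar baz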

!ls recalls the most recent ls command. If you want the second-to-last or an even older ls invocation, you can use history to retrieve the command you want:
history | grep ls
355 ls foo bar baz
446 ls -a
447 ls -ah
Then use @Jon's solution, or yours, to get the arguments:
echo !355:*

Related

When I reload a bash terminal using the source command, why can I not chain commands using &&?

I just modified ~/.bash_profile to include the following alias:
alias ngrep='grep -v grep'
I then went to an already-open terminal session and ran the following:
source ~/.bash_profile && ps aux | grep mysql | ngrep
The output was:
-bash: ngrep: command not found
However, I then immediately ran ngrep and it ran without errors.
I'm looking to understand Terminal better. Why can I not chain an alias I just added after sourcing the bash profile using &&?
On a Mac running Mojave, with the standard terminal and bash.
Aliases are simple prefix substitutions that take place before syntax is parsed. This gives them powers other constructs don't have (albeit powers which are rarely needed or appropriate): you can alias something to content that's subsequently parsed as syntax. But it also constrains them: because a compound command needs to be parsed before it can be executed, the ngrep command is parsed before the source command is executed, so the alias is not yet loaded at the point in time when it would need to be in order to take effect.
As a simple demonstration (thanks to a comment by @chepner):
alias foo=echo; foo hi
foo bye
...will emit:
-bash: foo: command not found
bye
...because the alias was not in place when the first line (alias foo=echo; foo hi) was parsed, but is in place for the line foo bye. (The alias is in place when foo hi is run, but the command has already been split out into the command foo with the argument hi; there's no remaining opportunity to change foo to echo, so the fact that the alias is defined at this time has no impact on execution).
You wouldn't have this problem with a function:
# note that you can't run this in a shell that previously had ngrep defined as an alias
# ...unless you unalias it first!
ngrep() { grep -v grep "$@"; }
A function doesn't require recognition at parse time, so you can use it in a one-liner as shown in the question.
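To see the contrast with the alias demonstration above, the same one-liner pattern succeeds with a function, because functions are looked up when the command runs rather than when the line is parsed; a minimal sketch, run in a fresh shell:
$ fn() { echo "$@"; }; fn hi    # fn is a made-up name for illustration
hi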
As an indirect solution, consider editing your match pattern. e.g.:
$: ps -fu $LOGNAME
UID PID PPID TTY STIME COMMAND
P2759474 6704 10104 pty0 14:26:54 /usr/bin/ps
P2759474 10104 9968 pty0 07:59:11 /usr/bin/bash
P2759474 9968 1 ? 07:59:10 /usr/bin/mintty
$: ps -fu $LOGNAME | grep '/mintty$'
P2759474 9968 1 ? 07:59:10 /usr/bin/mintty
You don't have to grep -v grep if your grep already is specific enough to exclude itself.

Ignore "$" at the beginning of the command in Bash

On many websites, "$" is written at the beginning when introducing the Linux command.
But of course, this will result in a "$: command not found" error.
To avoid this, it is necessary to delete or replace the "$" every time, which is troublesome.
So if the input command begins with "$", I think it would be good if the "$" could be ignored. Is that possible?
If you really need this, you can create a file in a directory that is in your $PATH. The file will be named $ and will contain
#!/bin/bash
exec "$#"
Make it executable, then you can do
$ echo foo bar
foo bar
$ $ echo foo bar
foo bar
$ $ $ echo foo bar
foo bar
$ $ $ $ echo foo bar
foo bar
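For reference, one way to put the script in place (a sketch, assuming ~/bin is a directory already on your PATH):
cat > ~/bin/'$' <<'EOF'
#!/bin/bash
exec "$@"
EOF
chmod +x ~/bin/'$'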
Note that this does not affect variable expansion in any way. It only makes a standalone $ at the start of a command line a valid command.
I just noticed a problem with this: It works for calling commands, but not for shell-specific constructs:
$ foo=bar
$ echo $foo
bar
$ $ foo=qux
/home/jackman/bin/$: line 2: exec: foo=qux: not found
and
$ { echo hello; }
hello
$ $ { echo hello; }
bash: syntax error near unexpected token `}'
In summary, everyone else is right: use your mouse better.
Yes, it is possible to ignore the command prompt when copying commands from web sites: use the Shift and arrow keys to select the text without the prompt. This also helps you skip the # sign, which is used to indicate commands that need administrative privileges.

Bash alias using !$

I found out today that I can write !$ to get the last argument from the last command executed.
Now I'm trying to create an alias using that shortcut and it isn't working at all.
These are the ones I'm trying to create.
alias gal='git add !$'
alias gcl='git checkout !$'
alias sl='sublime !$'
And this is the result output when calling gal or gcl
fatal: pathspec '!$' did not match any files
So it seems like !$ just isn't being replaced by the last argument from the last command in this context.
Is it possible?
Instead of fiddling with Bash's history, you may want to use Bash's $_ variable. The relevant part of the manual states:
$_: […] expands to the last argument to the previous command, after expansion. […]
For example:
$ touch one two three
$ echo "$_"
three
$ ls
$ echo "$_"
ls
$ a='hello world'
$ echo $a
hello world
$ echo "$_"
world
$ echo "$a"
hello world
$ echo "$_"
hello world
$
In your case, your aliases would look like:
alias gal='git add "$_"'
alias gcl='git checkout "$_"'
alias sl='sublime "$_"'
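With those defined, the intended workflow looks something like this (a sketch; the file name is made up, and it assumes the file you want is the last argument of the previous command):
$ touch notes.txt
$ gal        # expands to git add "$_", i.e. git add notes.txt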
You can use the bash builtin history command fc. An example:
$ alias re_echo='echo $(fc -ln -2 | awk '\''NR==1 {print $NF}'\'')'
$ echo foo
foo
$ re_echo bar
foo bar
$ re_echo baz
bar baz
$ re_echo qux
baz qux

Redirection and pipe behavior in bash vs. zsh

The following command outputs different results depending if it is run in bash or zsh:
ls -l > x | wc -l
If executed in a non-empty directory, bash always gives 0, while zsh gives the right number of files. x contains the output of ls -l, as expected.
Why doesn't it work in bash?
Read the MULTIOS documentation in the zshmisc man page. It's a feature of zsh that redirects the output to multiple files at the same time, and one of those destinations can also be a pipe.
e.g.
ls >a >b
will get both a and b populated with the content of the directory.
from the zshmisc documentation:
If the user tries to open a file descriptor for writing more than once, the shell opens the file descriptor as a pipe to a process that copies its input to all the specified outputs, similar to tee, provided the MULTIOS option is set, as it is by default. Thus:
date >foo >bar
writes the date to two files, named foo and bar. Note that a pipe is an implicit redirection; thus
date >foo | cat
writes the date to the file foo, and also pipes it to cat.
To turn it on, run setopt multios; to turn it off, run setopt nomultios:
$ setopt nomultios
$ ls -l > x | wc -l
0
$ setopt multios
$ ls -l > x | wc -l
36
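In bash, which has no MULTIOS, you can get a similar effect explicitly with tee; a sketch:
$ ls -l | tee x | wc -l
36
Here tee writes the listing to the file x and also passes it along the pipe to wc.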
The output from ls -l is redirected into a file called 'x'. There is no output left to go into the pipe (it's all going into 'x'). This is the way nearly every standard shell works.
The question here isn't why bash doesn't work; the question is why zsh does what it does.

Pipe data into shell command expecting a filename

Suppose I have a shell command foo that expects a file such as bar.txt as an argument, and I want to pass a one-line file to foo and then erase it like so:
echo "This is a one line file" > bar.txt
foo bar.txt
rm bar.txt
Is there a way to do this all in a single line of shell script without ever creating the file bar.txt?
You can use Process Substitution:
foo <(echo "This is a one line file")
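For example, with cat standing in as a hypothetical foo that takes a filename argument:
$ cat <(echo "This is a one line file")
This is a one line file
The process substitution expands to a path like /dev/fd/63, which cat opens and reads as an ordinary file.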
I'm assuming that foo doesn't try to read stdin. If so, as an alternative to the Process Substitution suggested by Cyrus, you can also do
echo "This is a one line file" | foo /dev/stdin
Try command substitution, like this:
cat $(ls)
Here the result of ls is substituted as the arguments for cat to execute.
