Please consider the following file inject.sh with the following line:
#!/bin/bash
bind '"\e[0n": "echo test"'; printf '\e[5n'
When running source inject.sh, it injects the text 'echo test' onto a new command line (rather than just echoing it). This works correctly, as per one of the suggestions here: https://unix.stackexchange.com/a/213821
I want to replace the "echo test" part with all the command line arguments that might be provided to the script, so basically with $@. However, I am having a hard time adding it into the command. I have tried:
#!/bin/bash
bind '"\e[0n": "'$#'"'; printf '\e[5n'
But it only works if only one argument is passed to the command. So for example:
source inject.sh ls --> bash-3.2$ ls| OK (| is the cursor)
source inject.sh echo foo --> bash-3.2$ echo| NOT OK (does not print 'foo' and additionally it messes up the terminal, can't print some letters anymore)
Not sure where the problem is... Maybe wrong string concatenation?
Note this is a bash-specific problem, not zsh, fish or something else. But for reference, I am trying to emulate the zsh behavior of print -z "$@" (e.g. print -z echo foo).
You can use:
#!/bin/bash
bind '"\e[0n": "'"$*"'"'; printf '\e[5n'
When you call source inject.sh foo bar baz, you want to concatenate the arguments and put them all into a single argument to bind:
bind '"\e[0n": "foo bar baz"'
But instead, you were splitting it across three invalid arguments:
bind '"\e[0n": "foo' 'bar' 'baz"'
Ways to debug this include ShellCheck, which spots both problems:
In inject.sh line 2:
bind '"\e[0n": "'$#'"'; printf '\e[5n'
^-- SC2068: Double quote array expansions to avoid re-splitting elements.
^-- SC2145: Argument mixes string and array. Use * or separate argument.
And set -x which shows how the command is being mangled:
++ bind '"\e[0n": "foo' bar 'baz"' # Invalid attempt
++ bind '"\e[0n": "foo bar baz"' # Valid command
Related
On many websites, a "$" is written at the beginning of the line when introducing a Linux command.
But of course, pasting it as-is results in a "$: command not found" error.
To avoid this, it is necessary to delete or replace the "$" every time, which is troublesome.
So, if the first word of the typed command is "$", I think it would be good if the shell could just ignore it. Is that possible?
If you really need this, you can create a file in a directory that is in your $PATH. The file will be named $ and will contain
#!/bin/bash
exec "$#"
Make it executable, then you can do
$ echo foo bar
foo bar
$ $ echo foo bar
foo bar
$ $ $ echo foo bar
foo bar
$ $ $ $ echo foo bar
foo bar
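For concreteness, here is one way to set it up, as a sketch; it assumes you want the file in ~/bin and that ~/bin is already in your $PATH:
mkdir -p ~/bin                                        # any directory that is in $PATH will do
printf '%s\n' '#!/bin/bash' 'exec "$@"' > ~/bin/'$'   # the two-line script shown above
chmod +x ~/bin/'$'                                    # make it executable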
Note that this does not affect variable expansion in any way. It only makes a standalone $ used as the first word of the command line a valid command.
I just noticed a problem with this: It works for calling commands, but not for shell-specific constructs:
$ foo=bar
$ echo $foo
bar
$ $ foo=qux
/home/jackman/bin/$: line 2: exec: foo=qux: not found
and
$ { echo hello; }
hello
$ $ { echo hello; }
bash: syntax error near unexpected token `}'
In summary, everyone else is right: use your mouse better.
Yes, it is possible to ignore the command prompt when copying commands from web sites: use the Shift and arrow keys to select the command without the prompt. This will also help you skip the # sign, which is used to indicate commands that need administrative privileges.
I have a script foo.sh
CMD='export FOO="BAR"'
$CMD
echo $FOO
It works as expected
>./foo.sh
"BAR"
Now I want to change the FOO variable to BAR BAR, so the script becomes:
CMD='export FOO="BAR BAR"'
$CMD
echo $FOO
When I run it I expect to get "BAR BAR", but I get
./foo.sh: line 2: export: `BAR"': not a valid identifier
"BAR
How can I deal with that?
You should not use a variable as a command by just expanding it (like your $CMD). Instead, use eval to evaluate a command stored in a variable. Only then is a true evaluation step, with all the shell's parsing logic, performed:
eval "$CMD"
(And use double quotes to pass the command to eval.)
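Applied to your foo.sh, a sketch of the fix looks like this:
#!/bin/bash
CMD='export FOO="BAR BAR"'
eval "$CMD"    # the quotes inside CMD are now parsed by the shell
echo "$FOO"    # prints: BAR BAR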
Just don't do that.
And read Bash FAQ #50
I'm trying to save a command so I can run it later without having to repeat it each time
If you want to put a command in a container for later use, use a
function. Variables hold data, functions hold code.
pingMe() {
ping -q -c1 "$HOSTNAME"
}
[...]
if pingMe; then ..
The proper way to do that is to use an array instead:
CMD=(export FOO="BAR BAR")
"${CMD[#]}"
I'm trying to write a "phone home" script, which will log the exact command line (including any single or double quotes used) into a MySQL database. As a backend, I have a cgi script which wraps the database. The scripts themselves call curl on the cgi script and include as parameters various arguments, including the verbatim command line.
Obviously I have quite a variety of quote escaping to do here and I'm already stuck at the bash stage. At the moment, I can't even get bash to print verbatim the arguments provided:
Desired output:
$ ./caller.sh -f -hello -q "blah"
-f -hello -q "blah"
Using echo:
caller.sh:
echo "$#"
gives:
$ ./caller.sh -f -hello -q "blah"
-f -hello -q blah
(I also tried echo $@ and echo $*)
Using printf %q:
caller.sh:
printf %q $@
printf "\n"
gives:
$ ./caller.sh -f -hello -q "blah"
-f-hello-qblah
(I also tried printf %q "$@")
I would welcome not only help to fix my bash problem, but any more general advice on implementing this "phone home" in a tidier way!
There is no possible way you can write caller.sh to distinguish between these two commands invoked on the shell:
./caller.sh -f -hello -q "blah"
./caller.sh -f -hello -q blah
They are exactly equivalent.
If you want to make sure the command receives special characters, surround the argument with single quotes:
./caller.sh -f -hello -q '"blah"'
Or if you want to pass just one argument to caller.sh:
./caller.sh '-f -hello -q "blah"'
You can get this info from the shell history:
function myhack {
line=$(history 1)
line=${line#* }
echo "You wrote: $line"
}
alias myhack='myhack #'
Which works as you describe:
$ myhack --args="stuff" * {1..10} $PATH
You wrote: myhack --args="stuff" * {1..10} $PATH
However, quoting is just the user's way of telling the shell how to construct the program's argument array. Asking to log how the user quotes their arguments is like asking to log how hard the user punched the keys and what they were wearing at the time.
To log a shell command line which unambiguously captures all of the arguments provided, you don't need any interactive shell hacks:
#!/bin/bash
line=$(printf "%q " "$@")
echo "What you wrote would have been indistinguishable from: $line"
I understand you want to capture the arguments given by the caller.
Firstly, the quotes used by the caller protect the arguments during the interpretation of the call, but they do not exist as part of the arguments themselves.
An example: if someone calls your script with one argument "Hello  World!" that has two spaces between Hello and World, then you ALWAYS have to quote $1 in your script so as not to lose this information.
If you want to log all arguments correctly escaped (in case they contain, for example, consecutive spaces...), you HAVE to use "$@" with the double quotes. "$@" is equivalent to "$1" "$2" "$3" "$4" etc.
So, to log arguments, I suggest the following at the start of the caller:
i=1
for arg in "$@"; do
  echo "arg$i=$arg"
  let ++i
done
## Example of calls to the previous script
#caller.sh '1' "2" 3 "4 4" "5 5"
#arg1=1
#arg2=2
#arg3=3
#arg4=4 4
#arg5=5 5
@Flimm is correct, there is no way to distinguish between arguments "foo" and foo, simply because the quotes are removed by the shell before the program receives them. What you need is "$@" (with the quotes).
I am trying to pass arguments from a bash script to an executable, and one of them contains spaces. I have been searching for how to solve this, but I cannot find the right way to do it. Minimal example with a script called first and a script called second.
first script:
#!/bin/bash
# first script
ARGS="$#"
./second $ARGS
second script:
#!/bin/bash
# second script
echo "got $# arguments"
Now if I run it like this, I get the following results:
$ ./first abc def
got 2 arguments
$ ./first "abc def"
got 2 arguments
$ ./first 'abc def'
got 2 arguments
How can I make it so, that the second script also only receives one argument?
You can't do it using an intermediate string variable. If you quote it, it will always pass a single argument; if you don't, the value is split on whitespace and the original quoting is lost.
However, you can pass the arguments directly, without the intermediate variable, like this:
./second "$#"
$ ./first abc def
got 2 arguments
$ ./first "abc def"
got 1 arguments
Alternately, you can use an array to store the arguments like this:
#!/bin/bash
# first script
ARGS=("$#")
./second "${ARGS[#]}"
IFS is your friend.
#!/bin/bash
# first script
ARGS="$#"
IFS=$(echo -en "\n\b")
./second $ARGS
IFS stands for Internal Field Separator ...
I need my bash script to cat all of its parameters into a file. I tried to use cat for this because I need to add a lot of lines:
#!/bin/sh
cat > /tmp/output << EOF
I was called with the following parameters:
"$#"
or
$@
EOF
cat /tmp/output
Which leads to the following output
$ ./test.sh "dsggdssgd" "dsggdssgd dgdsdsg"
I was called with the following parameters:
"dsggdssgd dsggdssgd dgdsdsg"
or
dsggdssgd dsggdssgd dgdsdsg
I want neither of these two things: I need the exact quoting which was used on the command line. How can I achieve this? I always thought "$@" does everything right with regard to quoting.
Well, you are right that "$@" preserves the whitespace inside each argument. However, since the shell performs quote removal before executing a command, you can never know exactly how the args were quoted (e.g. whether with single or double quotes, or backslashes, or any combination thereof). But you shouldn't need to know, since all you should care about are the argument values.
Placing "$#" in a here-document is pointless because you lose the information about where each arg starts and ends (they're joined with a space inbetween). Here's a way to see just this:
$ cat test.sh
#!/bin/sh
printf 'I was called with the following parameters:\n'
printf '"%s"\n' "$#"
$ ./test.sh "dsggdssgd" "dsggdssgd dgdsdsg"
I was called with the following parameters:
"dsggdssgd"
"dsggdssgd dgdsdsg"
Try:
#!/bin/bash
for x in "$#"; do echo -ne "\"$x\" "; done; echo
To see what's interpreted by Bash, use:
bash -x ./script.sh
or add this to the beginning of your script:
set -x
You might want to add this to the parent script.