I have a question regarding the use of functions in a bash command. getRegex is my function; it is defined at the end of the file. The command that I want to use is the following:
COUNT=`grep -rnE 'getRegex' $HOME/new`
Now I have tried a lot of different variants but I cannot make it work, even if I split it in two. The function works correctly if I call it on its own: getRegex. Any idea what I am missing? TIA
The key words here are "bash command substitution", which you can look up in man bash or on Google.
By the way, double quotes are really important here.
#!/bin/bash
function my_func () {
echo "no"
}
string="no you don't
no you don't
no you don't
no you don't
no you don't"
COUNT="$( echo "${string}" | grep "$( my_func )" -c )"
echo "${COUNT}"
And running it:
$> ./ok.sh
5
If you're trying to call a command within another bash command, the inner command (here getRegex) needs to be enclosed in backticks (`...`) or else it will be interpreted as literal text. Since you would then have backticks inside backticks, you'll need to escape the inner ones, and the single quotes have to become double quotes so the substitution actually happens. Try this:
COUNT=`grep -rnE "\`getRegex\`" $HOME/new`
But, through the wonders of POSIX, we can use a different syntax. Anywhere you use backticks, you can also use $(). So to avoid backslash emesis, you could write:
COUNT=$(grep -rnE "$(getRegex)" $HOME/new)
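To see why the quoting matters, here is a minimal sketch with a stand-in getRegex that just prints a pattern (the function body and the pattern foo|bar are made up for illustration):
getRegex() { printf '%s' 'foo|bar'; }
echo '$(getRegex)'     # single quotes: prints the literal text $(getRegex)
echo "$(getRegex)"     # double quotes: prints the function's output, foo|bar
COUNT=$(grep -rnE "$(getRegex)" "$HOME/new")
echo "$COUNT"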
Related
I would like my bash script to check the name of the directory where it is run. Something like:
#!/bin/bash
path=eval 'pwd'
dirname=eval 'basename $path'
But it doesn't work: I get
./foo.sh: line 5: basename $path: command not found
How can I fix it? Also, once I get dirname to contain the correct dirname, I'd like to convert it to lowercase, to test it. I'm able to do this on the command line with awk:
echo $dirname | awk '{print tolower($0)}'
but how do I capture the return value into a variable?
Why not use:
#!/bin/bash
path=`pwd`
dirname=`basename $path | awk '{print tolower($0)}'`
Or if you want to do it as a one-liner:
dirname=`pwd | xargs basename | awk '{print tolower($0)}'`
You can rewrite it as
dirname=$(eval "basename $path")
Two things changed: the command is wrapped in $(...) so its output is actually captured, and the single quotes became double quotes. With single quotes you don't get shell expansion, but you want $path to be expanded.
BTW: I'd suggest using
path=$(basename $path)
It's more generic and more readable if you do something like
path=$(basename $(pwd))
or to get the lowercase result
path=$(basename $(pwd) | awk '{print tolower($0)}')
or
path=$(basename $(pwd) | tr 'A-Z' 'a-z' )
The form
x=y cmd
means to temporarily set environment variable x to value y and then run cmd, which is how these lines are interpreted:
path=eval 'pwd'
dirname=eval 'basename $path'
That is, they aren't doing what you seem to expect at all, instead setting an environment variable to the literal value eval and then running (or failing to find) a command. As others have said, the way to interpolate the results of a command into a string is to put it inside $(...) (preferred) or `...` (legacy). And, as a general rule, it's safer to wrap those in double quotes (as it is safer to wrap any interpolated reference in quotes).
path="$(pwd)"
dirname="$(basename "$path")"
(Technically, in this case the outer quotes aren't strictly necessary. However, I'd say it's still a good habit to have.)
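A quick sketch to make the difference visible (the variable name foo is just for illustration):
foo=bar env | grep '^foo='        # foo=bar is an environment assignment for env only
echo "${foo:-unset}"              # prints "unset": the shell variable foo was never set
path="$(pwd)"                     # command substitution actually captures the output
echo "$path"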
B=$(echo "Some text that has CAPITAL letters " | awk '{print tolower($0)}')
eval executes the command passed to it, but it only returns the command's exit status, so you cannot really use it in an assignment. To embed a command's output in an assignment, use either backquotes or $().
So the script will look like this:
#!/bin/bash
curr_path=$(pwd)
echo $curr_path
curr_dir=$(basename $curr_path)
echo $curr_dir
echo $curr_dir | awk '{print tolower($0)}'
Part of your problem is that you use single quotes rather than double quotes. Single quotes prevent variable expansion, thus $path is not expanded into the path you want to use and is taken as it is, as if it were a literal string.
The other part is that var=eval 'cmd' does not capture the command's output at all. I would suggest using grave accents (backquotes) instead; there's no reason to use eval in this case. Plus, you can use the same technique to collect the return value you are interested in:
#!/bin/bash
path=`pwd`
dirname=`basename $path`
variable=`echo $dirname | awk '{print tolower($0)}'`
Here's an excerpt from my answer to What platform independent way to find directory of shell executable in shell script? which, in itself, fully answers your question aside from the lowercase part, which, in my opinion, has been duly addressed many times in other answers here.
What's unique about my answer is that when I was attempting to write it for the other question I ran into your exact problem - how do I store a function's result in a variable? Well, with some help, I hit upon a pretty simple and very powerful solution:
I pass the function a sort of messenger variable: the function receives the variable's name as $1, and, upon completing its routine, it uses eval and a backslash-quoting trick to assign the desired value to that variable without ever having to know its name.
In full disclosure, though this was the solution to my problem, it was not by any means my own invention. I've had several occasions to visit Rich's sh tricks before, but some of his descriptions, though probably brilliant, are a little out of my league, so I thought others might benefit if I included my own explanation of how this works in the previous paragraph. Though it was very simple to understand once I did, for this one especially, I had to think long and hard to figure out how it might work. Anyway, you can find that and more at Rich's sh tricks, and I have also excerpted the relevant portion of his page below my own answer's excerpt.
...
EXCERPT:
...
Though not strictly POSIX yet, realpath has been a GNU core app since 2012. Full disclosure: I had never heard of it before I noticed it in the info coreutils TOC and immediately thought of [the linked] question, but using the following function as demonstrated should reliably, (soon POSIXLY?), and, I hope, efficiently provide its caller with an absolutely sourced $0:
% _abs_0() {
> o1="${1%%/*}"; ${o1:="${1}"}; ${o1:=`realpath "${1}"`}; eval "$1=\${o1}";
> }
% _abs_0 ${abs0:="${0}"} ; printf %s\\n "${abs0}"
/no/more/dots/in/your/path2.sh
EDIT: It may be worth highlighting that this solution uses POSIX parameter expansion to first check whether the path actually needs expanding and resolving at all before attempting to do so. This should return an absolutely sourced $0 via a messenger variable (with the notable exception that it will preserve symlinks) as efficiently as I can imagine it being done, whether or not the path is already absolute.
...
(minor edit: before finding realpath in the docs, I had at least pared down my version of [the version below] not to depend on the time field [as it does in the first ps command], but, fair warning, after testing some I'm less convinced ps is fully reliable in its command path expansion capacity)
On the other hand, you could do this:
ps ww -fp $$ | grep -Eo '/[^:]*'"${0#*/}"
eval "abs0=${`ps ww -fp $$ | grep -Eo ' /'`#?}"
...
And from Rich's sh tricks:
...
Returning strings from a shell function
As can be seen from the above pitfall of command substitution, stdout is not a good avenue for shell functions to return strings to their caller, unless the output is in a format where trailing newlines are insignificant. Certainly such practice is not acceptable for functions meant to deal with arbitrary strings. So, what can be done?
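(As a quick aside, the pitfall referred to is that command substitution strips trailing newlines, which is easy to demonstrate:)
f() { printf 'two trailing newlines\n\n'; }
out=$(f)
printf '%s|END\n' "$out"    # prints: two trailing newlines|END -- the newlines are gone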
Try this:
func () {
body here
eval "$1=\${foo}"
}
Of course ${foo} could be replaced by any sort of substitution. The key trick here is the eval line and the use of escaping. The “$1” is expanded when the argument to eval is constructed by the main command parser. But the “${foo}” is not expanded at this stage, because the “$” has been quoted. Instead, it’s expanded when eval evaluates its argument. If it’s not clear why this is important, consider how the following would be bad:
foo='hello ; rm -rf /'
dest=bar
eval "$dest=$foo"
But of course the following version is perfectly safe:
foo='hello ; rm -rf /'
dest=bar
eval "$dest=\$foo"
Note that in the original example, "$1" was used to allow the caller to pass the destination variable name as an argument to the function. If your function needs to use the shift command, for instance to handle the remaining arguments as "$@", then it may be useful to save the value of "$1" in a temporary variable at the beginning of the function.
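A minimal, self-contained sketch of the messenger-variable pattern described above (the function and variable names are made up for illustration):
get_greeting() {
    foo='hello ; rm -rf /'      # deliberately nasty value to show it stays inert
    eval "$1=\${foo}"           # assign to whatever variable name the caller passed in
}
get_greeting result
printf '%s\n' "$result"         # prints: hello ; rm -rf /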
I found this example:
echo -e "This is red->\e[00;31mRED\e[00m"
It works if I execute it directly from the command line, but if I create a file like:
#! /usr/bin/sh
echo -e "This is red->\e[00;31mRED\e[00m"
it doesn't work. How can I fix this? Or would it be possible to output in bold instead?
Please don't suggest Lua; it isn't installed.
Edit: This is likely your problem:
#!/bin/bash
echo -e "This is red->\e[00;31mRED\e[00m"
The reason is that sh doesn't have a builtin echo command that supports these escapes.
Alternatively you might invoke your script like
bash ./myscript.sh
Background
ANSI escape sequences are interpreted by the terminal.
If you run in a pipe/with IO redirected, output won't go to a terminal, hence the escapes don't get interpreted (see the sketch after the hints below).
Hints:
see ansifilter for a tool that can filter ANSI escape sequences (and optionally translate to HTML and others)
use GNU less, e.g. to get ANSI escapes working in a pager:
grep something --colour=always files.* | less -R
Or simply, as I do
# also prevent wrapping long lines
alias less='less -SR'
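Since the escapes only help when output actually reaches a terminal, a common pattern is to test for one before emitting colour; a minimal sketch (not part of the original question):
if [ -t 1 ]; then                                    # stdout is a terminal
    printf 'This is red->\033[00;31mRED\033[00m\n'
else
    printf 'This is red->RED\n'                      # plain text when piped or redirected
fi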
Use an echo program, not an echo built-in command:
#!/bin/sh
MYECHO="`which echo`"
if <test-whether-MYECHO-empty-and-act-accordingly> ...
...
$MYECHO -e "This is red->\e[00;31mRED\e[00m"
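One way the placeholder test above might be filled in (just a sketch; it assumes a standalone echo exists somewhere in $PATH):
#!/bin/sh
MYECHO="`which echo`"
if [ -z "$MYECHO" ]; then
    echo "no standalone echo found" >&2
    exit 1
fi
"$MYECHO" -e "This is red->\e[00;31mRED\e[00m"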
I'm trying to build a command string to pass a "-e" flag and another variable into another base script being called as a subroutine, and I've run into a strange problem: I'm losing the "-e" portion of the string when I pass it into the subroutine. I created a couple of examples which illustrate the issue; any help?
This works as you would expect:
$echo "-e $HOSTNAME"
-e ops-wfm
This does NOT work; we lose the "-e" because it is interpreted as an option to echo.
$myFlag="-e $HOSTNAME"; echo $myFlag
ops-wfm
Adding the "\" escape character doesn't work either; I get the string back with the "\" in front:
$myFlag="\-e $HOSTNAME"; echo $myFlag
\-e ops-wfm
How can I prevent -e being swallowed?
Use double-quotes:
$ myFlag="-e $HOSTNAME"; echo "${myFlag}"
-e myhost.local
I use ${var} rather than $var out of habit as it means that I can add characters after the variable without the shell interpreting them as part of the variable name.
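For example (illustrative):
file=report
echo "${file}_v2.txt"   # prints report_v2.txt
echo "$file_v2.txt"     # the shell looks up a variable named file_v2, so this prints just .txt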
echo may not be the best example here. Most Unix commands will accept -- to mark no more switches.
$ var='-e .bashrc' ; ls -l -- "${var}"
ls: -e .bashrc: No such file or directory
Well, you could put your variable in quotes:
echo "$myFlag"
...making it equivalent to your first example, which, as you say, works just fine.
I have a bash script that receives a set of files from the user. These files are sometimes under directories with spaces in their names. Unfortunately, unlike in this question, all the filenames are passed via the command line interface. Let's assume the paths are correctly quoted as they are passed in by the user, so spaces (save for quoted spaces) are delimiters between paths. How would I forward these parameters to a subroutine within my bash script in a way that preserves the quoted spaces?
#! /bin/bash
for fname in "$#"; do
process-one-file-at-a-time "$fname"
done
Note the excessive use of quotes. It's all necessary.
Passing all the arguments to another program is even simpler:
process-all-together "$@"
The tricky case is when you want to split the arguments in half. That requires a lot more code in a simple POSIX shell. But maybe Bash has some special features.
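In Bash specifically, slicing the positional parameters does make the split fairly painless; a sketch (not from the original answer):
half=$(( $# / 2 ))
first_half=( "${@:1:half}" )
second_half=( "${@:half+1}" )
printf 'first:  %s\n' "${first_half[@]}"
printf 'second: %s\n' "${second_half[@]}"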
You want "$#", which has the special syntax of expanding $# but preserving the white-space quoting of the caller (it does not create a single giant string with all the arguments in it). So someone can call your script like:
bash-script.sh AFile "Another File With Spaces"
Then in your script you can do things like:
for f in "$#"; do
echo "$f";
done
and get two lines of output (not 5).
Read the paragraph about the Special Parameter "@" here: http://www.gnu.org/s/bash/manual/bash.html#Special-Parameters
Bravo @Roland. Thanks a lot for your solution. It has really worked!
I wrote a simple script function that opens a given path with nautilus.
And I've just nested a function with this "helper"-for-loop into the main function:
fmp () {
fmp2() {
nautilus "$#";
};
for fname in "$#";
do
fmp2 "$fname";
done;
}
Now I'm able to make all my scripts handle paths correctly just by turning them into nested functions wrapped by a function with this helper for-loop.
"$var"
For example,
$ var='foo bar'
$ perl -E'say "<<$_>>" for @ARGV' $var
<<foo>>
<<bar>>
$ perl -E'say "<<$_>>" for @ARGV' "$var"
<<foo bar>>
I have a Bash script where I want to keep quotes in the arguments passed.
Example:
./test.sh this is "some test"
then I want to use those arguments, and re-use them, including quotes and quotes around the whole argument list.
I tried using \"$#\", but that removes the quotes inside the list.
How do I accomplish this?
using "$#" will substitute the arguments as a list, without re-splitting them on whitespace (they were split once when the shell script was invoked), which is generally exactly what you want if you just want to re-pass the arguments to another program.
Note that this is a special form and is only recognized as such if it appears exactly this way. If you add anything else inside the quotes, the extra text is joined onto the first (or last) argument rather than kept separate.
What are you trying to do and in what way is it not working?
There are two safe ways to do this:
1. Shell parameter expansion: ${variable@Q}:
When expanding a variable via ${variable@Q}:
The expansion is a string that is the value of parameter quoted in a format that can be reused as input.
Example:
$ expand-q() { for i; do echo "${i@Q}"; done; }  # Same as: for i in "$@"; do ...
$ expand-q word "two words" 'new
> line' "single'quote" 'double"quote'
word
'two words'
$'new\nline'
'single'\''quote'
'double"quote'
2. printf %q "$quote-me"
printf supports quoting internally. The manual's entry for printf says:
%q Causes printf to output the corresponding argument in a format that can be reused as shell input.
Example:
$ cat test.sh
#!/bin/bash
printf "%q\n" "$#"
$
$ ./test.sh this is "some test" 'new
>line' "single'quote" 'double"quote'
this
is
some\ test
$'new\nline'
single\'quote
double\"quote
$
Note the 2nd way is a bit cleaner if displaying the quoted text to a human.
Related: For bash, POSIX sh and zsh: Quote string with single quotes rather than backslashes
Yuku's answer only works if you're the only user of your script, while Dennis Williamson's is great if you're mainly interested in printing the strings, and expect them to have no quotes-in-quotes.
Here's a version that can be used if you want to pass all arguments as one big quoted-string argument to the -c parameter of bash or su:
#!/bin/bash
C=''
for i in "$#"; do
i="${i//\\/\\\\}"
C="$C \"${i//\"/\\\"}\""
done
bash -c "$C"
So, all the arguments get a quote around them (harmless if it wasn't there before, for this purpose), but we also escape any escapes and then escape any quotes that were already in an argument (the syntax ${var//from/to} does global substring substitution).
You could of course only quote stuff which already had whitespace in it, but it won't matter here. One utility of a script like this is to be able to have a certain predefined set of environment variables (or, with su, to run stuff as a certain user, without that mess of double-quoting everything).
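For instance, the reassembled string might be run under a predefined environment roughly like this (the variable name MYAPP_ENV and the user name are made up):
MYAPP_ENV=prod bash -c "$C"
or handed to su:
su -c "$C" someuser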
Update: I recently had reason to do this in a POSIX way with minimal forking, which led to this script (the last printf there outputs the command line used to invoke the script, which you should be able to copy-paste in order to invoke it with equivalent arguments):
#!/bin/sh
C=''
for i in "$#"; do
case "$i" in
*\'*)
i=`printf "%s" "$i" | sed "s/'/'\"'\"'/g"`
;;
*) : ;;
esac
C="$C '$i'"
done
printf "$0%s\n" "$C"
I switched to '' since shells also interpret things like $ and !! in ""-quotes.
If it's safe to make the assumption that an argument that contains white space must have been (and should be) quoted, then you can add them like this:
#!/bin/bash
whitespace="[[:space:]]"
for i in "$#"
do
if [[ $i =~ $whitespace ]]
then
i=\"$i\"
fi
echo "$i"
done
Here is a sample run:
$ ./argtest abc def "ghi jkl" $'mno\tpqr' $'stu\nvwx'
abc
def
"ghi jkl"
"mno pqr"
"stu
vwx"
You can also insert literal tabs and newlines using Ctrl-V Tab and Ctrl-V Ctrl-J within double or single quotes instead of using escapes within $'...'.
A note on inserting characters in Bash: If you're using Vi key bindings (set -o vi) in Bash (Emacs is the default - set -o emacs), you'll need to be in insert mode in order to insert characters. In Emacs mode, you're always in insert mode.
I needed this for forwarding all arguments to another interpreter.
What ended up right for me is:
bash -c "$(printf ' %q' "$#")"
Example (when named as forward.sh):
$ ./forward.sh echo "3 4"
3 4
$ ./forward.sh bash -c "bash -c 'echo 3'"
3
(Of course the actual script I use is more complex, involving in my case nohup and redirections etc., but this is the key part.)
Like Tom Hale said, one way to do this is with printf using %q to quote-escape.
For example:
send_all_args.sh
#!/bin/bash
if [ "$#" -lt 1 ]; then
quoted_args=""
else
quoted_args="$(printf " %q" "${#}")"
fi
bash -c "$( dirname "${BASH_SOURCE[0]}" )/receiver.sh${quoted_args}"
send_fewer_args.sh
#!/bin/bash
if [ "$#" -lt 2 ]; then
quoted_last_args=""
else
quoted_last_args="$(printf " %q" "${#:2}")"
fi
bash -c "$( dirname "${BASH_SOURCE[0]}" )/receiver.sh${quoted_last_args}"
receiver.sh
#!/bin/bash
for arg in "$#"; do
echo "$arg"
done
Example usage:
$ ./send_all_args.sh
$ ./send_all_args.sh a b
a
b
$ ./send_all_args.sh "a' b" 'c "e '
a' b
c "e
$ ./send_fewer_args.sh
$ ./send_fewer_args.sh a
$ ./send_fewer_args.sh a b
b
$ ./send_fewer_args.sh "a' b" 'c "e '
c "e
$ ./send_fewer_args.sh "a' b" 'c "e ' 'f " g'
c "e
f " g
Just use:
"${#}"
For example:
# cat t2.sh
for I in "${#}"
do
echo "Param: $I"
done
# cat t1.sh
./t2.sh "${#}"
# ./t1.sh "This is a test" "This is another line" a b "and also c"
Param: This is a test
Param: This is another line
Param: a
Param: b
Param: and also c
Changed unhammer's example to use array.
printargs() { printf "'%s' " "$@"; echo; }; # http://superuser.com/a/361133/126847
C=()
for i in "$#"; do
C+=("$i") # Need quotes here to append as a single array element.
done
printargs "${C[#]}" # Pass array to a program as a list of arguments.
My problem was similar and I used mixed ideas posted here.
We have a server with a PHP script that sends e-mails. And then we have a second server that connects to the 1st server via SSH and executes it.
The script name is the same on both servers and both are actually executed via a bash script.
On server 1 (local) bash script we have just:
/usr/bin/php /usr/local/myscript/myscript.php "$@"
This resides on /usr/local/bin/myscript and is called by the remote server. It works fine even for arguments with spaces.
But then at the remote server we can't use the same logic because the 1st server will not receive the quotes from "$@". I used the ideas from JohnMudd and Dennis Williamson to recreate the options and parameters array with the quotations. I like the idea of adding escaped quotations only when the item has spaces in it.
So the remote script runs with:
CSMOPTS=()
whitespace="[[:space:]]"
for i in "$#"
do
if [[ $i =~ $whitespace ]]
then
CSMOPTS+=(\"$i\")
else
CSMOPTS+=($i)
fi
done
/usr/bin/ssh "$USER@$SERVER" "/usr/local/bin/myscript ${CSMOPTS[@]}"
Note that I use "${CSMOPTS[@]}" to pass the options array to the remote server.
Thanks to everyone that posted here! It really helped me! :)
Quotes are interpreted by bash and are not stored in command line arguments or variable values.
If you want to use quoted arguments, you have to quote them each time you use them:
val="$3"
echo "Hello World" > "$val"
As Gary S. Weaver showed in his source code tips, the trick is to call bash with the parameter '-c' and then quote the next argument.
e.g.
bash -c "<your program> <parameters>"
or
docker exec -it <my docker> bash -c "$SCRIPT $quoted_args"
If you need to pass all arguments to bash from another programming language (for example, if you'd want to execute bash -c or emit_bash_code | bash), use this:
escape all single quote characters you have with '\''.
then, surround the result with single quotes
The argument abc'def will thus be converted to 'abc'\''def'. The characters '\'' are interpreted as follows: the already existing quoting is terminated with the first single quote, then the escaped single quote \' comes, and then the new quoting starts.
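A small sketch of doing that conversion in the shell itself (the helper name shell_quote is made up; trailing newlines in the input aside):
shell_quote() {
    # wrap in single quotes, turning each embedded ' into '\''
    printf "'%s'" "$(printf '%s' "$1" | sed "s/'/'\\\\''/g")"
}
shell_quote "abc'def"    # prints 'abc'\''def'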
Yes, it seems that it is not possible to ever preserve the quotes, but for the issue I was dealing with it wasn't necessary.
I have a bash function that searches down a folder recursively and greps for a string; the problem is passing a string that has spaces, such as "find this string". Passing this to the bash script will take the base argument $n and pass it to grep, which leads grep to believe these are different arguments. The way I solved this is by using the fact that when you quote the argument on the command line, bash groups the items in the quotes into a single argument. I just needed to decorate that argument with quotes again and pass it to the grep command.
If you know which argument you are receiving in bash that needs quotes for its next step, you can just decorate it with quotes.
Just use single quotes around the string with the double quotes:
./test.sh this is '"some test"'
That way, the double quotes inside the single quotes are also treated as part of the string.
But I would recommend to put the whole string between single quotes:
./test.sh 'this is "some test" '
In order to understand how the shell interprets arguments in scripts, you can write a little script like this:
#!/bin/bash
echo $@
echo "$@"
Then you can see and test what's going on when you call the script with different strings.
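To make the difference between the unquoted and quoted forms visible, printf with one %s per argument is handy; a small sketch (the script name args.sh is made up):
#!/bin/bash
printf 'unquoted: <%s>\n' $@
printf 'quoted:   <%s>\n' "$@"
A sample run:
$ ./args.sh one "two three"
unquoted: <one>
unquoted: <two>
unquoted: <three>
quoted:   <one>
quoted:   <two three>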