I'm trying to make a little script as a wrapper around this command:
$ egrep target /usr/lusers/me/test/*test.txt
/usr/lusers/me/test/1test.txt:target
That directory has files called 1test.txt and 2test.txt, one of which contains some text I want to find.
Here is my whole script, called mygrep.sh:
set -v
set -x
egrep "$1" '/usr/lusers/me/test/*test.txt'
Here's the output:
$ ./mygrep.sh target
set -x
egrep "$1" '/usr/lusers/me/test/*test.txt'
++ egrep target '/usr/lusers/me/test/*test.txt'
egrep: /usr/lusers/me/test/*test.txt: No such file or directory
Note the 's around the file path in the set -x output, and that the command fails.
Now compare this variation of the script:
set -v
set -x
egrep "$1" '/usr/lusers/me/test/1test.txt'
Note that the only difference is the asterisk vs the literal file name.
Output:
$ ./mygrep.sh target
set -x
egrep "$1" '/usr/lusers/me/test/1test.txt'
++ egrep target /usr/lusers/me/test/1test.txt
target
No single quotes after expansion, and the command works.
So why are those single quotes added when there's an asterisk, and why is the command failing in that case?
The output produced by set -x is for debugging. No quotes are actually added to the argument; they appear only in the trace, to show you where each word begins and ends.
The correct command is egrep "$1" /usr/lusers/me/test/*test.txt, because the shell must expand the pattern (if possible) before passing the resulting filenames to egrep. You don't have an actual file literally named *test.txt, which is why the quoted, unexpanded pattern fails.
The globbing character must be outside (single or double) quotes, as quotes disable globbing.
Use this instead:
egrep "$1" '/usr/lusers/me/test/'*'test.txt'
Or this:
egrep "$1" "/usr/lusers/me/test/"*"test.txt"
Or, since nothing inside this specific pattern would trigger word splitting (though this is not a generally safe approach unless the path is known in advance to be safe):
egrep "$1" /usr/lusers/me/test/*test.txt
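Putting the fix together, here is a minimal self-contained sketch of both behaviors side by side. It uses a temporary directory in place of /usr/lusers/me/test, and grep -E (the modern spelling of egrep):

```shell
#!/bin/bash
# Demo: create two sample files, then run grep with the pattern
# quoted and unquoted. The temp dir stands in for /usr/lusers/me/test.
dir=$(mktemp -d)
printf 'target\n' > "$dir/1test.txt"
printf 'other\n'  > "$dir/2test.txt"

pattern=target
# Quoted: the literal string *test.txt reaches grep -> "No such file".
q_status=0
q_out=$(grep -E "$pattern" "$dir/"'*test.txt' 2>&1) || q_status=$?
# Unquoted: the shell expands the glob to the matching filenames first,
# so grep receives real file paths.
u_status=0
u_out=$(grep -E "$pattern" "$dir/"*test.txt) || u_status=$?
echo "quoted:   exit $q_status ($q_out)"
echo "unquoted: exit $u_status ($u_out)"
rm -rf "$dir"
```

The quoted run exits non-zero with a "No such file or directory" error; the unquoted run exits 0 and prints the matching line with its filename prefix.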
I am trying to find and replace a specific text content using the sed command and to run it via a shell script.
Below is the sample script that I am using:
fp=/asd/filename.txt
fd="sed -i -E 's ($2).* $2:$3 g' ${fp}"
eval $fd
and executing the same by passing the arguments:
./test.sh update asd asdfgh
But if the argument string contains $, it breaks the command and substitutes the wrong values, like
./test.sh update asd $apr1$HnIF6bOt$9m3NzAwr.aG1Yp.t.bpIS1.
How can I make sure that the values inside the variables are not expanded because of the $?
Updated
sh file test.sh
set -xv
fp="/asd/filename.txt"
sed -iE "s/(${2//'$'/'\$'}).*/${2//'$'/'\$'}:${3//'$'/'\$'}/g" "$fp"
text file filename.txt
hello:world
Outputs
1)
./test.sh update hello WORLD
sed -iE "s/(${2//'$'/'\$'}).*/${2//'$'/'\$'}:${3//'$'/'\$'}/g" "$fp"
++ sed -iE 's/(hello).*/hello:WORLD/g' /asd/filename.txt
2)
./test.sh update hello '$apr1$hosgaxyv$D0KXp5dCyZ2BUYCS9BmHu1'
sed -iE "s/(${2//'$'/'\$'}).*/${2//'$'/'\$'}:${3//'$'/'\$'}/g" "$fp"
++ sed -iE 's/(hello).*/hello:'\''$'\''apr1'\''$'\''hosgaxyv'\''$'\''D0KXp5dCyZ2BUYCS9BmHu1/g' /asd/filename.txt
In both cases, it does not replace the content.
You don't need eval here at all:
fp=/asd/filename.txt
sed -i -E "s/(${2//'$'/'\$'}).*/\1:${3//'$'/'\$'}/g" "$fp"
The whole sed command is in double quotes so variables can expand.
I've replaced the blank as the s separator with / (doesn't really matter in the example).
I've used \1 to reference the first capture group instead of repeating the variable in the substitution.
Most importantly, I've used ${2//'$'/'\$'} instead of $2 (and similarly for $3). This escapes every $ sign as \$; this is required because of the double quoting, otherwise the $ signs get eaten by the shell before sed gets to see them.
When you call your script, you must escape any $ in the input, or the shell tries to expand them as variable names:
./test.sh update asd '$apr1$HnIF6bOt$9m3NzAwr.aG1Yp.t.bpIS1.'
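To see the fix in action, here is a sketch of the accepted answer run against a temp file instead of /asd/filename.txt, with set -- standing in for the script's real command-line arguments (assumes GNU sed for -i with no suffix):

```shell
#!/bin/bash
# Sketch of the fix: escape $ in user-supplied values before sed.
fp=$(mktemp)
printf 'hello:world\n' > "$fp"

set -- update hello '$apr1$HnIF6bOt'   # stand-ins for the script arguments
# ${2//'$'/'\$'} replaces every $ in the value with \$, so sed treats it
# literally; \1 reuses the first capture group in the replacement.
sed -i -E "s/(${2//'$'/'\$'}).*/\1:${3//'$'/'\$'}/g" "$fp"
result=$(cat "$fp")
echo "$result"
rm -f "$fp"
```

The file ends up containing hello:$apr1$HnIF6bOt, with the dollar signs intact.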
Put the command-line arguments that contain special characters in single quotes:
./test.sh update 'asd' '$apr1$HnIF6bOt$9m3NzAwr.aG1Yp.t.bpIS1'
You must protect all the script arguments with quotes if they contain spaces or special shell characters, and escape any dollar sign $. Also use -Ei instead of -iE (GNU sed parses -iE as -i with backup suffix E); better still, drop -i while testing and add it back once you're really sure.
I admit I don't understand your regex, so let's just get to the gist of the solution; no eval needed:
fp=/asd/filename.txt
sed -Ei "s/($2).*/$2:$3/g" "$fp"
./test.sh update asd '\$apr1\$HnIF6bOt\$9m3NzAwr.aG1Yp.t.bpIS1.'
How can I nest operations in bash? E.g., I know that
$(basename $var)
will give me just the final part of the path and
${name%.*}
gives me everything before the extension.
How do I combine these two calls, I want to do something like:
${$(basename $var)%.*}
As @sid-m's answer states, you need to change the order of the two expansions, because one of them (the % stuff) can only be applied to variables (by giving their name):
echo "$(basename "${var%.*}")"
Other things to mention:
You should use double quotes around every expansion, otherwise you run into trouble if you have spaces in the variable values. I already did that in my answer.
In case you know or expect a specific file extension, basename can strip that off for you as well: basename "$var" .txt (This will print foo for foo.txt in $var.)
You can do it like
echo $(basename ${var%.*})
it is just the order that needs to be changed.
Assuming you want to split the file name, here is a simple pattern :
$ var=/some/folder/with/file.ext
$ basename "$var" | cut -d "." -f1
file
If you know the file extension in advance, you can tell basename to remove it, either as a second argument or via the -s option. Both these yield the same:
basename "${var}" .extension
basename -s .extension "${var}"
If you don't know the file extension in advance, you can try to grep the proper part of the string.
### grep any non-slash followed by anything ending in dot and non-slash
grep -oP '[^/]*(?=\.[^/]*$)' <<< "${var}"
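For comparison, the approaches from the answers above all give the same result on the question's example path:

```shell
#!/bin/bash
# Three ways to get the bare filename without its extension.
var=/some/folder/with/file.ext

a=$(basename "${var%.*}")           # strip extension first, then the path
b=$(basename "$var" .ext)           # let basename strip a known suffix
c=$(basename "$var" | cut -d. -f1)  # split the filename on the first dot
echo "$a $b $c"
```

Note the approaches differ on names with multiple dots: ${var%.*} removes only the last extension (file.tar.gz becomes file.tar), while cut -d. -f1 keeps only the part before the first dot (file).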
I'm running the below script to run some checks on file paths within a function.
Some of the paths contain single quotes, and the temporary files contain $.
If I enclose the variable (name, below) in single quotes, the string is truncated if there is a ' in the file path. If I use double quotes around the file path, it truncates if there is a $ in the path.
Is there any way out of this circular conundrum?
File=/root/fed/~$reader.txt
echoes as follows when there is a $ in the file path:
/root/fed/eager.txt
If there is a ' in the file path and I enclose it in single quotes (to stop the above from happening), then
File='/root/fed/reader's'
(this won't echo)
Code is :
find / -type f -print0 | xargs -0 -I name bash -c "fnchash 'name'"
fnchash () {
echo "$1"
}
Single-quoted strings may not contain single quotes. Not even by escaping them. Unquoted or double-quoted and unescaped $ introduces a variable reference that expands to the value of the referenced shell variable, or to nothing if no such variable is defined.
One solution involves double quotes and escape characters. For example:
File=/root/fed/~\$reader.txt
File=/root/fed/reader\'s
File="/root/fed/\$reader's.txt"
Note, too, that quotes are not necessarily string delimiters -- they are substring delimiters. Thus, these work, too:
File=/root/fed/~'$'reader.txt
File=/root/fed/reader"'"s
If you need to perform automatic quoting of data read at runtime, then you should also be aware of bash's built-in printf command (which is more featureful than what you may have as a standalone command). Note in particular that the %q field descriptor performs all needed quoting on its arguments to make them re-usable as shell input.
printf '%q' "$File"
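A quick sketch of %q in action, using the (hypothetical) path from the question. The round trip through eval demonstrates that the quoted form reproduces the original value exactly:

```shell
#!/bin/bash
# printf %q re-quotes a value so it can safely be reused as shell input.
File='/root/fed/~$reader.txt'
quoted=$(printf '%q' "$File")
echo "$quoted"
# Round trip: evaluating the quoted form yields the original value.
roundtrip=$(eval "printf '%s' $quoted")
if [ "$roundtrip" = "$File" ]; then echo "round trip ok"; fi
```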
The way out of this conundrum is to stop trying to treat data as code. This snippet passes filenames as data (arguments) instead of trying to inject them into a code string, thereby preventing these issues entirely:
$ cat myscript
fnchash () {
for arg
do
printf "Now processing: %s\n" "$arg"
done
}
export -f fnchash
find . -type f -exec bash -c 'fnchash "$@"' _ {} +
$ ls -1
File with space and '.txt
myscript
some file.txt
$ bash myscript
Now processing: ./some file.txt
Now processing: ./myscript
Now processing: ./File with space and '.txt
This uses the fact that you can call bash -c 'command' argv0 argv1 argv2.. to run 'command' with the positional parameters $0, $1, $2.. set to arguments passed to bash and process them as if they were arguments to a script (man bash under -c).
-exec bash -c 'command' _ {} + is then used to run command with $0 set to a dummy value _ and the rest of the parameters set to filenames that find finds (man find under -exec).
The bash command can then process these arguments just like any script would.
The same technique can also be used with xargs, here to parallelize the process 5 ways in chunks of 20 files:
find . -type f -print0 | xargs -0 -P 5 -n 20 bash -c 'fnchash "$@"' _
You can solve this either by using single quotes and replacing all embedded single quotes with escaped ones, or by using double quotes and replacing all $ with escaped ones.
I am trying to write a bash script that I will use to replace my egrep command. I want to be able to take the exact same input that is given to my script and feed it to egrep.
i.e.
#!/bin/bash
PARAMS=$@
`egrep "$PARAMS"`
But I have noticed that if I echo what I am executing, that the quotes have been removed as follows:
./customEgrep -nr "grep my ish" *
returns
egrep -nr grep my ish (file list from the expanded *)
Is there a way that I can take the input literally so I can use it directly with egrep?
You want this:
egrep "$@"
The quotes you type are not passed to the script; they're used to determine word boundaries. Using "$@" preserves those word boundaries, so egrep will get the same arguments as it would if you ran it directly. But you still won't see quotation marks if you echo the arguments.
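To see the difference word boundaries make, here is a small stand-in for the wrapper (showargs is a hypothetical name) that prints each argument it receives on its own line:

```shell
#!/bin/bash
# "$@" forwards every argument as exactly one word each, so the
# multi-word pattern survives as a single argument.
showargs() {
  for a in "$@"; do
    printf '<%s>\n' "$a"
  done
}
out=$(showargs -nr "grep my ish" file.txt)
echo "$out"
```

This prints three lines: <-nr>, <grep my ish>, and <file.txt>. With an unquoted $@ (or $PARAMS), "grep my ish" would instead be split into three separate words.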
" is a special character. You need to use an escape character in order to pass a literal ".
use
./customEgrep -nr "\"grep my ish\"" *
If you don't need to do any parameter expansion in the argument, you can use
single quotes to avoid the need to escape the double quotes:
./customerEgrep -nr '"grep my ish"' *
$@ is special when quoted. Try:
value=$( egrep "$@" )
It's not clear to me why you are using backticks and ignoring the result, so I've used the $() syntax and assigned the value.
If for some reason you want to save the parameters to use later, you can also do things like:
for i; do args="$args '$i'"; done # Save the arguments
eval grep $args # Pass the arguments to grep without resetting $1,$2,...
eval set $args # Restore the arguments
grep "$#" # Use the restored arguments
I am trying to do something like "copy the newest file in a directory." I have come up with the following simple command using backticks, which works fine for filenames without embedded white space:
cp -rp `ls -1d searchstring | head -1` destination
As you can see, this works fine when the returned filename has no space in it. However, it will obviously fail when there is such a space.
I need either a way to handle the output of the backticks, or some alternate approach.
You can treat the result of the command substitution as a single word by adding double quotes around it:
cp -rp "`ls -t searchstring | head -n 1`" destination
The double quotes are not needed when assigning to a variable. a=`uptime` is equivalent to a="`uptime`".
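A self-contained sketch of the quoted substitution surviving a filename with spaces (temporary directory and filenames are made up for the demo):

```shell
#!/bin/bash
# Quoting the command substitution keeps a name with spaces as one word.
dir=$(mktemp -d)
touch -t 202001010000 "$dir/old file.txt"   # give the old file an old mtime
touch "$dir/new file.txt"

newest=$(ls -t "$dir" | head -n 1)          # newest first: "new file.txt"
cp -rp "$dir/$newest" "$dir/copy.txt"       # quotes keep the name whole
copied=no
if [ -f "$dir/copy.txt" ]; then copied=yes; fi
echo "newest=$newest copied=$copied"
rm -rf "$dir"
```

Note that parsing ls output is fragile in general (it breaks on filenames containing newlines); in a script, looping over a glob and comparing files with -nt is a more robust way to find the newest file.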