I created a bash script to thumbnail all images in a tree. It is the following:
#!/bin/bash
find -path "thumbnails/" -prune -or -iname "*.jpg" -exec \
bash -c 'convert "$0" -resize 256x256 thumbnails/`sha512sum "$0" | awk "{ print \\$1 }"`.jpg' {} \;
# ^^
In the awk command, there is a double \\. (I've marked it with ^^ on the commented line, but you'll probably need to scroll →) Why do I need two backslashes here? I need one to prevent the shell from attempting to expand $1, but otherwise, we are working within just a single set of single-quotes, which shouldn't be messing with the number of slashes. Yet, with just one backslash, awk { print } gets executed, which isn't correct.
Why the \\?
There are actually two shells here, each doing all the usual variable/path substitutions and expansions. One is
bash -c
the other is the backtick operator:
`command`
You need the extra backslash to prevent $1 from being expanded.
You want to run
convert "$0" -resize 256x256 thumbnails/`sha512sum "$0" | awk "{ print \$1 }"`.jpg {}
through bash -c for each file. When you pass the command as an argument to bash, you wrap it in single quotes '', in which case you need to escape the backslash with one more backslash. That is:
bash -c 'convert "$0" -resize 256x256 thumbnails/`sha512sum "$0" | awk "{ print \\$1 }"`.jpg ' {}
The doubled backslash survives the backtick substitution as a single backslash, so the inner shell sees \$1 inside the double quotes and passes a literal $1 through to awk, which then prints the first field as intended.
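You can watch the two layers strip backslashes by swapping echo in for the pipeline (a quick illustration only, not part of the original script; the trailing _ is just a dummy $0):
bash -c 'echo `echo "{ print \\$1 }"`' _    # prints: { print $1 }
bash -c 'echo `echo "{ print \$1 }"`' _     # prints: { print }  ($1 was expanded, empty here)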
I have been trying to write a bash script that can search recursively in a directory and replace multiple strings, e.g. #{DEMO_STRING_1} etc., with an environment variable, e.g. $sample1.
Full script:
#!/bin/sh
find /my/path/here -type f -name '*.js' -exec sed -i \
-e 's/#{DEMO_STRING_1}/'"$sample1"'/g' \
-e 's/#{DEMO_STRING_2}/'"$sample2"'/g' \
-e 's/#{DEMO_STRING_3}/'"$sample3"'/g' \
-e 's/#{DEMO_STRING_4}/'"$sample4"'/g' \
-e 's/#{DEMO_STRING_5}/'"$sample5"'/g' \
-e 's/#{DEMO_STRING_6}/'"$sample6"'/g' \
-e 's/#{DEMO_STRING_7}/'"$sample7"'/g' \
-e 's/#{DEMO_STRING_8}/'"$sample8"'/g' \
{} +
I cannot figure out how to replace the strings that start with a hash and curly brackets.
I tried this example: sed find and replace with curly braces or Environment variable substitution in sed, but I cannot figure out how to combine them.
What am I missing? I also searched for characters that need to be escaped, e.g. What characters do I need to escape when using sed in a sh script?, but again could not find the characters that I need.
This specific format throws the following error:
sed: bad option in substitution expression
Where am I going so wrong?
Update: Sample of environment variables:
https://www.example.com
/sample string/
12345-abcd-54321-efgh
base64 string
All the cases above are sample values of the environment variables that I would like to substitute in. All the environment variables are within double quotes.
It is important to understand that the environment variable references are expanded by the shell, as it prepares to execute the command, not by the command itself (sed in this case). The command sees only the results of the expansions.
In your case, that means that if any of the environment variables' values contain characters that are meaningful to sed in context, such as unescaped (to sed) slashes (/), then sed will attribute special significance to them instead of interpreting them as ordinary characters. For example, given a sed command such as
sed -e "s/X/${var}/" <<EOF
Replacement: X
EOF
If the value of $var is Y, then the output will be
Replacement: Y
But if the value of $var is /path/to/Y, then sed will fail with the same error you report. This happens because the sed command actually run is the same as if you had typed
sed -e s/X//path/to/Y
This contains an invalid s instruction. Probably the best alternative is to escape the replacement-string characters that would otherwise be significant to sed. You can do that by interposing a shell function:
escape_replacement() {
    # Replace all \ characters in the first argument with double backslashes.
    # Note that in order to do that here, we need to escape them from the shell.
    local temp=${1//\\/\\\\}
    # Replace all & characters with \&.
    temp=${temp//&/\\&}
    # Replace all / characters with \/, and write the result to standard out.
    # Use printf instead of echo to avoid edge cases in which the value to print
    # is interpreted to be or start with an option.
    printf -- "%s" "${temp//\//\\/}"
}
Then the script would use it like this:
find /my/path/here -type f -name '*.js' -exec sed -i \
-e 's/#{DEMO_STRING_1}/'"$(escape_replacement "$sample1")"'/g' \
...
Note that you probably also want to use a shebang line that explicitly specifies a shell that supports substitution references (${parameter/pattern/replacement}), because these are not required by POSIX, and you might run into a system where /bin/sh is a shell that does not support them. If you're willing to rely on Bash, then your shebang line should say so. Alternatively, you could prepare a version of the escape_replacement function that does not rely on substitution references.
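For instance, here is a rough POSIX-sh variant (a sketch only; it delegates the escaping to sed itself and, since sed is line-oriented, it assumes the value contains no newlines):
escape_replacement() {
    # Escape \, & and / so the value is safe in a sed s/.../.../  replacement.
    # The comma delimiter avoids putting an unescaped / next to the bracket expression.
    printf -- '%s' "$1" | sed -e 's,[\&/],\\&,g'
}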
If you use perl, you don't need to escape anything.
With your shell variable exported you can access it via $ENV{name} inside perl.
examples:
samples=(
    https://www.example.com
    '/sample string/'
    12345-abcd-54321-efgh
    'base64 string'
    $'multi\nline'
)
for sample in "${samples[@]}"
do
    echo '---'
    export sample
    echo 'A B #{DEMO_STRING_1} C' |
        perl -pe 's/#{DEMO_STRING_1}/$ENV{sample}/g'
done
echo '---'
Output:
---
A B https://www.example.com C
---
A B /sample string/ C
---
A B 12345-abcd-54321-efgh C
---
A B base64 string C
---
A B multi
line C
---
To edit the files in place you can add the -i option: perl -pi -e 's///'.
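Applied to the original find command, that might look like this (a sketch; it assumes $sample1 has been exported, and the braces are escaped defensively since newer perls reject an unescaped { in a regex):
export sample1
find /my/path/here -type f -name '*.js' -exec \
    perl -pi -e 's/#\{DEMO_STRING_1\}/$ENV{sample1}/g' {} +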
Below is a script and its output describing a problem I found today. Even though the ls output is quoted, bash still splits it at the whitespace. I changed the script to use for file in *.txt; I just want to know why bash behaves this way.
[chau@archlinux example]$ cat a.sh
#!/bin/bash
FILES=$(ls --quote-name *.txt)
echo "Value of \$FILES:"
echo $FILES
echo
echo "Loop output:"
for file in $FILES
do
    echo $file
done
[chau@archlinux example]$ ./a.sh
Value of $FILES:
"b.txt" "File with space in name.txt"
Loop output:
"b.txt"
"File
with
space
in
name.txt"
Why does bash ignore the quotation in the ls output?
Because word splitting happens on the result of variable expansion.
When evaluating a statement the shell goes through different phases, called shell expansions. One of these phases is "word splitting". Word splitting literally does split your variables into separate words, quoting from the bash manual:
The shell scans the results of parameter expansion, command substitution, and arithmetic expansion that did not occur within double quotes for word splitting.
The shell treats each character of $IFS as a delimiter, and splits the results of the other expansions into words using these characters as field terminators. If IFS is unset, or its value is exactly <space><tab><newline>, the default, then sequences of <space>, <tab>, and <newline> at the beginning and end of the results of the previous expansions are ignored, and any sequence of IFS characters not at the beginning or end serves to delimit words. ...
When the shell encounters a $FILES that is not within double quotes, it first does parameter expansion, expanding $FILES to the string "b.txt" "File with space in name.txt". Then word splitting occurs, so with the default IFS the resulting string is split on spaces, tabs, and newlines.
To prevent word splitting, the $FILES expansion itself has to be inside double quotes; quote characters inside the value of $FILES do not count.
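A small illustration (with a hypothetical $var, not the script above): by the time word splitting runs, the quote characters inside the value are ordinary data and group nothing:
var='"a b"'
printf '[%s]\n' $var      # unquoted expansion: split into ["a] and [b"]
printf '[%s]\n' "$var"    # quoted expansion: one word, ["a b"]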
Well, you could do this (unsafe):
ls -1 --quote-name *.txt |
while IFS= read -r file; do
    eval file="$file"
    ls -l "$file"
done
tell ls to output a newline-separated list with -1
read the list line by line
re-evaluate the variable to remove the quotes with evil. I mean eval
I use ls -l "$file" inside the loop to check if "$file" is a valid filename.
This will still not work for all filenames, because of ls. Filenames with unreadable characters are just ignored by my ls, like one created with touch "c.txt"$'\x01'. And filenames with embedded newlines will have problems, like ls $'\n'"c.txt".
That's why it's advisable to forget ls in scripts - ls is only for nice-pretty-printing in your terminal. In scripts use find.
If your filenames have no newlines embedded in them, you can:
find . -mindepth 1 -maxdepth 1 -name '*.txt' |
while IFS= read -r file; do
    ls -l "$file"
done
If your filenames are just anything, use a null-terminated stream:
find . -mindepth 1 -maxdepth 1 -name '*.txt' -print0 |
while IFS= read -r -d '' file; do
    ls -l "$file"
done
Many, many unix utilities (grep -z, xargs -0, cut -z, sort -z) come with support for handling zero-terminated strings/streams just for handling all the strange filenames you can have.
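For example, with GNU tools a whole pipeline can stay null-terminated until the final consumer (a sketch, not from the original question):
find . -mindepth 1 -maxdepth 1 -name '*.txt' -print0 |
    sort -z |
    xargs -0 ls -l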
You can try the following snippet:
#!/bin/bash
while read -r file; do
echo "$file"
done < <(ls --quote-name *.txt)
I'm running the script below to run some checks on file paths within a function.
Some of the paths contain single quotes, and the temporary files contain "$".
If I enclose the variable in single quotes (the name variable below), then the string is truncated if there is a ' in the file path. If I use double quotes around the file path, then it truncates if there is a "$" in the path.
Is there any way out of this circular conundrum?
File=/root/fed/~$reader.txt
echoes as the following when there is a $ in the file path:
/root/fed/eager.txt
If there is a ' in the file path and I enclose it in single quotes (to stop the above from happening), then
File='/root/fed/reader's'
(this won't echo)
Code is :
find / -type f -print0 | xargs -0 -I name bash -c "fnchash 'name'"
fnchash () {
echo "$1"
}
Single-quoted strings may not contain single quotes, not even escaped ones. An unquoted or double-quoted, unescaped $ introduces a variable reference that expands to the value of the referenced shell variable, or to nothing if no such variable is defined.
One solution involves double quotes and escape characters. For example:
File=/root/fed/~\$reader.txt
File=/root/fed/reader\'s
File="/root/fed/\$reader's.txt"
Note, too, that quotes are not necessarily string delimiters -- they are substring delimiters. Thus, these work, too:
File=/root/fed/~'$'reader.txt
File=/root/fed/reader"'"s
If you need to perform automatic quoting of data read at runtime, then you should also be aware of bash's built-in printf command (which is more featureful than what you may have as a standalone command). Note in particular that the %q field descriptor performs all needed quoting on its arguments to make them re-usable as shell input.
printf '%q\n' "$File"
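For example (the exact quoting style that %q chooses can vary between bash versions):
$ File="/root/fed/~\$reader's.txt"
$ printf '%q\n' "$File"
/root/fed/~\$reader\'s.txt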
The way out of this conundrum is to stop trying to treat data as code. This snippet passes filenames as data (arguments) instead of trying to inject them into a code string, thereby preventing these issues entirely:
$ cat myscript
fnchash () {
for arg
do
printf "Now processing: %s\n" "$arg"
done
}
export -f fnchash
find . -type f -exec bash -c 'fnchash "$@"' _ {} +
$ ls -1
File with space and '.txt
myscript
some file.txt
$ bash myscript
Now processing: ./some file.txt
Now processing: ./myscript
Now processing: ./File with space and '.txt
This uses the fact that you can call bash -c 'command' argv0 argv1 argv2.. to run 'command' with the positional parameters $0, $1, $2.. set to arguments passed to bash and process them as if they were arguments to a script (man bash under -c).
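For example (a trivial demonstration, unrelated to find):
$ bash -c 'printf "%s\n" "$0" "$1" "$2"' zero one two
zero
one
two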
-exec bash -c 'command' _ {} + is then used to run command with $0 set to a dummy value _ and the rest of the parameters set to filenames that find finds (man find under -exec).
The bash command can then process these arguments just like any script would.
The same technique can also be used with xargs, here to parallelize the process 5 ways in chunks of 20 files:
find . -type f -print0 | xargs -0 -P 5 -n 20 bash -c 'fnchash "$@"' _
You can solve this either by using single quotes and replacing every single quote in the data with an escaped one, or by using double quotes and replacing every $ with an escaped one.
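For the single-quote variant, that means turning every ' in the data into the sequence '\'' before embedding it. A minimal sketch (fnchash reduced to an echo, as in the question, and exported so the child shell can see it; the sample path is made up):
fnchash () { echo "$1"; }
export -f fnchash
name="/root/fed/reader's file.txt"
escaped=${name//\'/\'\\\'\'}          # each ' in the data becomes '\''
bash -c "fnchash '$escaped'"          # prints: /root/fed/reader's file.txt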
In the shell script below I try to print A2D(Vlog-Ams-#Cross) with its special characters escaped, for example replacing ( with \(, but the sed has no effect.
#! /bin/sh
parameter="A2D(Vlog-Ams-#Cross)"
echo $parameter
parameterz=`echo "$parameter" | sed 's/(/\\(/g'`
echo $parameterz
The output is
A2D(Vlog-Ams-#Cross)
A2D(Vlog-Ams-#Cross)
If I do the same on my c-shell terminal, it works fine.
Any ideas?
You use backslashes within a backtick command, and that's tricky. If the sed command didn't occur within backticks, it would work correctly. When the shell looks for the closing backtick, however, it removes one level of backslash quoting, so you get
sed 's/(/\(/g'
and that's a no-op. If your shell permits it, use $(...) instead of backticks; in this way you avoid these quoting problems.
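With $(...) the command works as written, since no extra level of backslash stripping occurs (the same script, with only the substitution syntax changed):
parameterz=$(echo "$parameter" | sed 's/(/\\(/g')
echo "$parameterz"    # A2D\(Vlog-Ams-#Cross)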
In your replacement \\(, the first \ escapes the second \, but you must escape the ( too:
$ echo 'A2D(Vlog-Ams-#Cross)' | sed -e 's/(/\\\(/g' -e 's/)/\\\)/g'
A2D\(Vlog-Ams-#Cross\)
I am trying to do something like "copy the newest file in a directory." I have come up with the following simple command using backticks, which works fine for filenames without embedded white space:
cp -rp `ls -1d searchstring | head -1` destination
As you can see, this should work fine when the returned filename has no space within it. However, it will obviously not work when there is such a space.
I need either a way to handle the output of the backticks, or some alternate approach.
You can treat the result of the command substitution as a single word by adding double quotes around it:
cp -rp "`ls -t searchstring | head -n 1`" destination
The double quotes are not needed when assigning to a variable. a=`uptime` is equivalent to a="`uptime`".
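A quick illustration of both points (transcript style; the embedded whitespace survives only where quoting prevents word splitting):
$ a=`printf '%s' 'one   two'`   # assignment context: no quotes needed
$ echo "$a"
one   two
$ echo $a
one two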