Echo without interpreting newline - makefile

I have a makefile where one of the actions requires writing the string \noslide to a text file called modeset.txt, so I'm using 'echo'. Although the -E flag should disable interpretation of backslash escapes, echo processes the flag as part of the string (see output below).
Here is a stripped down snippet from my makefile:
.PHONY: target all
all: target
target:
	echo -E "\noslide" > modeset.txt
clean:
	rm -f modeset.txt
The content of the modeset.txt file shows that echo gobbled the -E as part of the string.
-E
oslide
In a shell however, echo -e and echo -E work as expected, so what is the makefile environment doing to cause this?
OS: Vanilla Debian 8 from official sources only.

echo is not portable beyond simple text. Many different versions exist that behave differently, and even on the same system the behavior can depend on which shell you use (e.g., the shell's builtin echo vs the system's /bin/echo). In particular, make runs its recipes with /bin/sh (dash on Debian), whose builtin echo does not recognize -E and interprets backslash escapes by default, which is why the flag ends up in the output file.
If you want to display anything other than simple text followed by a newline, you should use printf not echo:
target:
	printf '\\noslide\n' > modeset.txt
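For reference, rebuilding the target and inspecting the file should then give something like the following (output as I'd expect it on a typical system; make echoes the recipe line before running it):
$ make target
printf '\\noslide\n' > modeset.txt
$ cat modeset.txt
\noslide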

Related

How to pass a Bash command to `entr`, quoting to guard against filenames with spaces?

My Goal
I'm writing a small Bash script that uses entr, a utility which re-runs arbitrary commands when it detects file-system events. My immediate goal is to pass entr a command which converts a given markdown file to HTML. entr will run this command every time the markdown file changes. A simplified but working script looks like:
# script 1
in="$1"
out="${in%.md}.html"
echo "$in" | entr pandoc "${in}" -o "${out}"
This works fine. The filename to be watched is supplied to entr on stdin. On detecting changes in that file, entr runs the command specified by its args. In this example that is pandoc, and all the args after it, to convert the markdown file to an HTML file.
For future reference, set -x shows that entr was invoked as we'd expect. (Throughout, lines starting with + show the output from set -x):
+ entr pandoc 'READ ME.md' -o 'READ ME.html'
The problem
I want to look up the command given to entr depending on the file type of the given input file. So the file-conversion command ends up in a variable, and I want to use that variable as the command-line args to entr. But I can't get the quoting right.
Again, simplified:
# script 2
in="$1"
out="${in%.md}.html"
cmd="pandoc \"${in}\" -o \"${out}\""
echo "$in" | entr "$cmd"
(shellcheck.net detects no issues on the above)
This fails. Because "$cmd" in the final line is in quotes, the entirety of $cmd
is treated as a single arg to entr:
+ entr 'pandoc "READ ME.md" -o "READ ME.html"'
entr tries to interpret the whole thing as the name of an executable, which
it cannot find:
entr: exec pandoc "READ ME.md" -o "READ ME.html": No such file or directory
So how should I modify script 2, to use the content of $cmd as the args to
entr?
What have I tried?
Check that $cmd is being formed as I expect? If I echo "$cmd" right after
it is defined in script 2, it looks exactly how I'd hope:
pandoc "READ ME.md" -o "READ ME.html"
I tried messing around with alternate ways of constructing cmd, such as:
cmd='pandoc "'"${in}"'" -o "'"${out}"'"'
but variations like this produce identical values of $cmd, and identical
behavior to script 2.
Try not quoting the use of $cmd?
Since the final line of script 2 erroneously treats the whole of "$cmd"
as a single arg, and we want it to split the words into separate args
instead, maybe removing the quotes and using a bare $cmd is a step in the
right direction?
echo "$in" | entr $cmd
Predictably enough though, this splits $cmd up on every space, even the
ones inside our double-quotes:
+ entr pandoc '"READ' 'ME.md"' -o '"READ' 'ME.html"'
This makes Pandoc try, and fail, to open a file called "READ:
pandoc: "READ: openBinaryFile: does not exist (No such file or directory)
Try constructing $cmd using printf?
I notice printf -v can store output in a variable. How about using that
instead of assigning to cmd?
printf -v cmd 'pandoc "%s" -o "%s"' "$in" "$out"
Predictably enough, this produces the same results as script 2. I tried some
speculative variations, such as %q in the format string, or using $in
and $out directly in the format string, but didn't stumble on anything
that seemed to help.
Try using the ${var@Q} form of parameter expansion.
echo "$in" | entr ${cmd@Q}
Tried with and without double quotes around the use of ${cmd@Q}. No joy,
I guess I'm misunderstanding what @Q is for.
+ entr ''\''pandoc' '"READ' 'ME.md"' -o '"READ' 'ME.html"'\'''
entr: exec 'pandoc: No such file or directory
Details
I'm using Bash v5.1.16, in Pop!_OS 22.04, derived from Ubuntu 22.04 (Jammy).
The current apt version of entr (v5.1) in Ubuntu Jammy (22.04) is too old
for my needs (e.g. the -z flag doesn't work), so I'm compiling my own from
the latest v5.3 source release.
I know there are a lot of questions about quoting in Bash, but I don't see any that seem to match this. Apologies if I'm wrong.
Assemble the command as an array, instead of a string.
I read somewhere that maybe $@ might do what I need, so I put the parts of $cmd into an array:
in="$1"
out="${in%.md}.html"
cmd=(pandoc "$in" -o "$out")
echo "$in" | entr "${cmd[#]}"
This correctly quotes the items in ${cmd[#]} which require it (e.g. have spaces in.)
+ entr pandoc 'READ ME.md' -o 'READ ME.html'
So ‘entr’ successfully calls ‘pandoc’, which successfully converts the documents. It works! I confess I did not expect that.
This approach seems viable for other similar situations, not just when invoking entr.
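If the conversion command needs to vary by file type, one way to keep using arrays is to pick the array in a case statement. This is only a sketch under my own assumptions (the .adoc branch and asciidoctor are hypothetical examples, not part of the original script):
in="$1"
case "$in" in
  *.md)   out="${in%.md}.html";   cmd=(pandoc "$in" -o "$out") ;;
  *.adoc) out="${in%.adoc}.html"; cmd=(asciidoctor -o "$out" "$in") ;;   # hypothetical second converter
  *)      echo "no converter known for $in" >&2; exit 1 ;;
esac
echo "$in" | entr "${cmd[@]}"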
So I have a solution. It doesn't seem completely ideal for my future plans. I had visions of these 'file conversion commands' being configurable, and hence defined in a text file somewhere, so that users (==me, probably) could override them and define their own, and I'm not fluent enough with Bash to be sure how to go about that when commands are defined as arrays instead of strings.
I can't help but feel I've overlooked something simpler.
Use a shell to interpret the value of "$cmd":
echo "$in" | entr sh -c "$cmd"
This approach seems viable for other similar situations, not just when invoking entr.
Similarly, entr has a -s option which invokes a shell for you (chosen using the first word in $SHELL):
echo "$in" | entr -s "$cmd"
These both work well, at the minor cost of spawning an extra shell process.
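As a refinement to these string-based variants, bash's printf %q (which the question already experimented with) can build $cmd with each filename shell-quoted before the string is handed to a shell. A sketch, assuming $SHELL is bash or another shell that accepts the quoting %q emits:
in="$1"
out="${in%.md}.html"
printf -v cmd 'pandoc %q -o %q' "$in" "$out"   # %q adds shell quoting around each value
echo "$in" | entr -s "$cmd"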

How do I ignore a byte order marker from a while read loop in zsh

I need to verify that all images mentioned in a csv are present inside a folder. I wrote a small shell script for that
#!/bin/zsh
red='\033[0;31m'
color_Off='\033[0m'
csvfile=$1
imgpath=$2
cat $csvfile | while IFS=, read -r filename rurl
do
    if [ -f "${imgpath}/${filename}" ]
    then
        echo -n
    else
        echo -e "$filename ${red}MISSING${color_Off}"
    fi
done
My CSV looks something like
Image1.jpg,detail-1
Image2.jpg,detail-1
Image3.jpg,detail-1
The csv was created by excel.
Now all 3 images are present in imgpath but for some reason my output says
Image1.jpg MISSING
Upon using zsh -x to run the script I found that my CSV file has a BOM at the very beginning, making the image name \ufeffImage1.jpg, which is causing the whole issue.
How can I ignore a BOM (byte-order mark) in a while read operation?
zsh provides a parameter expansion (also available in POSIX shells) to remove a prefix: ${var#prefix} will expand to $var with prefix removed from the front of the string.
zsh also, like ksh93 and bash, supports ANSI C-like string syntax: $'\ufeff' refers to the Unicode sequence for a BOM.
Combining these, one can use ${filename#$'\ufeff'} to refer to the content of $filename with the Unicode sequence for a BOM removed if it's present at the front.
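A quick way to see that expansion in isolation (hypothetical value; works in zsh and in bash 4.2+):
filename=$'\ufeff'Image1.jpg
printf '%s\n' "${filename#$'\ufeff'}"   # prints: Image1.jpg, with the BOM stripped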
The below also makes some changes for better performance, more reliable behavior with odd filenames, and compatibility with non-zsh shells.
#!/bin/zsh
red='\033[0;31m'
color_Off='\033[0m'
csvfile=$1
imgpath=$2
while IFS=, read -r filename rurl; do
    filename=${filename#$'\ufeff'}
    if ! [ -f "${imgpath}/${filename}" ]; then
        printf '%s %bMISSING%b\n' "$filename" "$red" "$color_Off"
    fi
done <"$csvfile"
Notes on changes unrelated to the specific fix:
Replacing echo -e with printf lets us pick which specific variables get escape sequences expanded: %s for filenames means backslashes and other escapes in them are unmodified, whereas %b for $red and $color_Off ensures that we do process highlighting for them.
Replacing cat $csvfile | with < "$csvfile" avoids the overhead of starting up a separate cat process, and ensures that your while read loop is run in the same shell as the rest of your script rather than a subshell (which may or may not be an issue for zsh, but is a problem with bash when run without the non-default lastpipe flag).
echo -n isn't reliable as a noop: some shells print -n as output, and the POSIX echo standard, by marking behavior when -n is present as undefined, permits this. If you need a noop, : or true is a better choice; but in this case we can just invert the test and move the else path into the truth path.
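To see the %s / %b distinction from the first note in isolation (the filename here is hypothetical, chosen because it contains a backslash sequence):
red='\033[0;31m'; color_Off='\033[0m'
printf '%s %bMISSING%b\n' 'new\night.jpg' "$red" "$color_Off"
# %s prints the filename's \n literally; %b expands the color escapes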

Makefile does not call shell script when setting variable

I am trying to follow the meaty skeleton tutorial on osdev. The Makefile is not running one of the shell scripts. I have set all of the permissions on each of the files to be executable.
In lib/Makefile, I have the below few lines set:
$(info DEFAULT_HOST!=../default-host.sh)
$(info HOST?=DEFAULT_HOST)
$(info HOSTARCH!=../target-triplet-to-arch.sh $(HOST))
After these lines have executed, neither DEFAULT_HOST nor HOSTARCH gets set.
default-host.sh:
#!/bin/sh
echo i686-elf
target-triplet-to-arch.sh:
#!/bin/sh
if echo "$1" | grep -Eq 'i[[:digit:]]86-'; then
touch here.txt
echo i386
else
touch there.txt
echo "$1" | grep -Eo '^[[:alnum:]_]*'
fi
Note, I added the touch statements in target-triplet-to-arch.sh. When run from the shell, one or other of those files is created, but not when the Makefile is run. This means that make seems not to be running the shell commands. How can I get make to run the shell commands?
As Beta says, info doesn't "allow you to see the value of that line being evaluated". info expands its argument then prints it to stdout. "Expands" means it resolves any variable references; it doesn't mean interpreting it as a makefile command. So if you run $(info hi) it prints "hi". If you run $(info foo = bar) it prints foo = bar but does not set the value of the variable foo to bar.
As for !=, note that this feature was added in GNU make 4.0. If your version is older than that, this assignment doesn't do what you expect. In particular, a line like FOO!=echo bar will be interpreted as if it were FOO! = echo bar... in other words it sets the make variable named FOO!.
Personally I always put whitespace around the assignment statements in my makefiles... this makes it clear that they are make assignments, not shell variable assignments (not that it shouldn't be clear anyway for anyone who knows makefile syntax, but...). In newer versions of GNU make, variable names cannot contain whitespace.
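Putting those points together, a minimal sketch of how the lines from lib/Makefile could be written as real assignments (assuming GNU make 4.0 or later; note the $(DEFAULT_HOST) reference, which the snippet in the question omits), with $(info ...) used only to print the results:
DEFAULT_HOST != ../default-host.sh
HOST ?= $(DEFAULT_HOST)
HOSTARCH != ../target-triplet-to-arch.sh $(HOST)
$(info HOST=$(HOST) HOSTARCH=$(HOSTARCH))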

Multi-line bash commands in makefile

Considering that every command is run in its own shell, what is the best way to run a multi-line bash command in a makefile? For example, like this:
for i in `find`
do
all="$all $i"
done
gcc $all
You can use backslash for line continuation. However note that the shell receives the whole command concatenated into a single line, so you also need to terminate some of the lines with a semicolon:
foo:
	for i in `find`; \
	do \
		all="$$all $$i"; \
	done; \
	gcc $$all
But if you just want to take the whole list returned by the find invocation and pass it to gcc, you actually don't necessarily need a multiline command:
foo:
	gcc `find`
Or, using a more shell-conventional $(command) approach (notice the $ escaping though):
foo:
	gcc $$(find)
As indicated in the question, every sub-command is run in its own shell. This makes writing non-trivial shell scripts a little bit messy -- but it is possible! The solution is to consolidate your script into what make will consider a single sub-command (a single line).
Tips for writing shell scripts within makefiles:
Escape the script's use of $ by replacing with $$
Convert the script to work as a single line by inserting ; between commands
If you want to write the script on multiple lines, escape end-of-line with \
Optionally start with set -e to match make's provision to abort on sub-command failure
This is totally optional, but you could bracket the script with () or {} to emphasize the cohesiveness of a multiple line sequence -- that this is not a typical makefile command sequence
Here's an example inspired by the OP:
mytarget:
	{ \
	set -e ;\
	msg="header:" ;\
	for i in $$(seq 1 3) ; do msg="$$msg pre_$${i}_post" ; done ;\
	msg="$$msg :footer" ;\
	echo msg=$$msg ;\
	}
The ONESHELL directive allows to write multiple line recipes to be executed in the same shell invocation.
all: foo
SOURCE_FILES = $(shell find . -name '*.c')
.ONESHELL:
foo: ${SOURCE_FILES}
	FILES=()
	for F in $^; do
		FILES+=($${F})
	done
	gcc "$${FILES[@]}" -o $@
There is a drawback though: special prefix characters (‘#’, ‘-’, and ‘+’) are interpreted differently.
https://www.gnu.org/software/make/manual/html_node/One-Shell.html
Of course, the proper way to write a Makefile is to actually document which targets depend on which sources. In the trivial case, the proposed solution will make foo depend on itself, but of course, make is smart enough to drop a circular dependency. But if you add a temporary file to your directory, it will "magically" become part of the dependency chain. Better to create an explicit list of dependencies once and for all, perhaps via a script.
GNU make knows how to run gcc to produce an executable out of a set of .c and .h files, so maybe all you really need amounts to
foo: $(wildcard *.h) $(wildcard *.c)
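If the built-in rules don't cover your layout, a minimal explicit variant along the same lines (a sketch; the target name and single-binary layout are assumed) would be:
foo: $(wildcard *.c) $(wildcard *.h)
	gcc $(filter %.c,$^) -o $@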
What's wrong with just invoking the commands?
foo:
	echo line1
	echo line2
....
And for your second question, you need to escape the $ by using $$ instead, i.e. bash -c '... echo $$a ...'.
EDIT: Your example could be rewritten to a single line script like this:
gcc $(for i in `find`; do echo $i; done)

Resources