Makefile recipe with a here-document redirection - bash

Does anyone know how to use a here-document redirection on a recipe?
test:
	sh <<EOF
	echo I Need This
	echo To Work
	ls
	EOF
I can't find a solution with the usual backslash-continuation method (which effectively collapses the recipe into a single command line).
Rationale:
I have a set of multi-line recipes that I want to proxy through another command (e.g., sh, docker).
onelinerecipe := echo l1
define twolinerecipe :=
echo l1
echo l2
endef
define threelinerecipe :=
echo l1
echo l2
echo l3
endef
# sh as proxy command and proof of concept
proxy := sh
test1:
	$(proxy) <<EOF
	$(onelinerecipe)
	EOF
test2:
	$(proxy) <<EOF
	$(twolinerecipe)
	EOF
test3:
	$(proxy) <<EOF
	$(threelinerecipe)
	EOF
The solution I would love to avoid: transform multiline macros into single lines.
define threelinerecipe :=
echo l1;
echo l2;
echo l3
endef
test3:
	$(proxy) <<< "$(strip $(threelinerecipe))"
This works (I use gmake 4.0 and bash as make's shell), but it requires changing my recipes, and I have a lot of them.
strip removes the newlines from the macro, so everything ends up on a single line.
My end goal is: proxy := docker run ...

Putting the line .ONESHELL: somewhere in your Makefile sends all of a recipe's lines to a single shell invocation; with that in place, you should find your original Makefile works as expected.
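A minimal sketch of that approach, reusing the recipe from the question (.ONESHELL needs GNU make 3.82 or newer; the asker's 4.0 qualifies):
.ONESHELL:

test:
	sh <<EOF
	echo I Need This
	echo To Work
	ls
	EOF
With .ONESHELL in effect, make hands the whole block to a single sh invocation, so the here-document is terminated normally instead of being split across per-line shells.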

When make sees a multi-line block in a recipe
(i.e., a block of lines all ending in \, apart from the last),
it passes that block unmodified to the shell.
This generally works in bash,
apart from here docs.
One way around this is to strip any trailing \s,
then pass the resulting string to bash's eval.
You do this in make by playing with ${.SHELLFLAGS} and ${SHELL}.
You can use both of these in target-specific form if you only want it to kick in for a few targets.
.PHONY: heredoc
heredoc: .SHELLFLAGS = -c eval
heredoc: SHELL = bash -c 'eval "$${@//\\\\/}"'
heredoc:
	@echo First
	@cat <<-there \
	here line1 \
	here anotherline \
	there
	@echo Last
giving
$ make
First
here line1
here anotherline
Last
Careful with that quoting, Eugene.
Note the cheat here:
I am removing all backslashes,
not just the ones at the ends of the line.
YMMV.

With GNU make, you can combine multi-line variables with the export directive to use a multi-line command without having to turn on .ONESHELL globally:
define script
cat <<'EOF'
here document in multi-line shell snippet
called from the "$#" target
EOF
endef
export script
run:; @eval "$$script"
will give
here document in multi-line shell snippet
called from the "run" target
You can also combine it with the value function to prevent its value from being expanded by make:
define _script
cat <<EOF
SHELL var expanded by the shell to $SHELL, pid is $$
EOF
endef
export script = $(value _script)
run:; @eval "$$script"
will give
SHELL var expanded by the shell to /bin/sh, pid is 12712

Not a here doc but this might be a useful workaround.
And it doesn’t require any GNU Make’isms.
Put the lines in a subshell with parens, prepend each line with echo.
You’ll need trailing sloshes, plus a semi-colon and slosh where appropriate.
test:
	( \
	echo echo I Need This ;\
	echo echo To Work ;\
	echo ls \
	) \
	| sh

Related

Is there any alternative to using eval in a shell script to achieve variable expansion

I have the following case where exec and eval will handle variables passed as arguments differently.
Here, eval seems to produce the intended output.
But is there any alternative to using that?
$ cat arg.sh
#!/bin/bash
eval ./argtest $*
$ ./arg.sh "arg1 'subarg1 subarg2'"
Args: 2
Arg1: arg1
Arg2: subarg1 subarg2
But at the same time, if I use an exec call instead of eval, the single quotes are not honored.
$ ./arg.sh "arg1 'subarg1 subarg2'"
Args: 3
Arg1: arg1
Arg2: 'subarg1
Arg3: subarg2'
You should do:
#!/bin/bash
./argtest "$#"
To properly pass unchanged arguments.
Then do:
$ ./arg.sh arg1 'subarg1 subarg2'
As you would do with any other command.
Research when to use quoting in shell, how the "$@" expansion of the positional parameters is handled specially inside double quotes, how $* and "$@" differ, and how word splitting works. Also research what variable expansion is, in which contexts it happens, and how single quotes differ from double quotes. And because eval is used, see the BashFAQ entry on the eval command and security issues. Remember to check your scripts with https://shellcheck.net .
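As a quick illustration of the difference (demo.sh is a hypothetical script, not the asker's argtest):
#!/bin/bash
# "$@" keeps each original argument intact; unquoted $* is re-split on whitespace
printf 'via "$@": <%s>\n' "$@"
printf 'via $*  : <%s>\n' $*
Called as ./demo.sh arg1 'subarg1 subarg2', the first printf reports two arguments (arg1 and subarg1 subarg2) while the second reports three.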
Is there any alternative to using eval in a shell script to achieve variable expansion
Yes - use envsubst for variable expansion, it's a tool just for that.
#!/bin/bash
arg=$(VARIABLE=something envsubst '$VARIABLE' <<<"$1")
./argtest "$arg"
$ bash -x ./arg.sh 'string with **not-expanded** $VARIABLE'
+ ./argtest 'string with **not-expanded** something'
Is there any alternative to using eval in a shell script to achieve single-quote parsing?
Yes - you would potentially write your own parser, probably in awk, that would split the string and then reload. A very very crude example:
#!/bin/bash
readarray -t args < <(sed "s/ *'\([^']*\)' */\n\1\n/; s/\n$//" <<<"$*")
./argtest "${args[@]}"
$ bash -x ./arg.sh "arg1 'subarg1 subarg2'"
+ ./argtest 'arg1' 'subarg1 subarg2'
With $*, the shell applies word splitting to the parameters and passes the result to eval or exec respectively. What happens afterwards differs between them:
exec simply replaces the current process with a new one, based on the first parameter it gets. It then passes the remaining parameters unmodified to that process.
eval, on the other hand, concatenates the parameters into a single string (using one space as the separator), then treats this resulting string as a new command to which the usual expansion and word-splitting mechanisms of bash are applied, and finally runs this command.
The mechanisms are completely different, which is not surprising, since these commands serve different purposes.
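A small sketch of that difference (count_args is a hypothetical stand-in for the asker's argtest):
#!/bin/bash
# count_args prints how many arguments it received, then each one on its own line
count_args() { echo "Args: $#"; printf 'Arg: %s\n' "$@"; }

input="arg1 'subarg1 subarg2'"

# plain invocation: word splitting only, the single quotes stay literal (3 args)
count_args $input

# eval: the split words are joined and re-parsed, so the quotes now group words (2 args)
eval count_args $input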

Could someone explain me what this shell bash command "echo{,}" means?

If I do this:
echo{,}
The result is:
echo
I don't really understand the {,} at the end, or the result.
Thanks for clarifying this.
I would start with something simpler to see how {} works: as @anubhava linked, it generates strings. Essentially, it expands all the elements inside it and combines them with whatever is before and after it (a space is the separator if you don't quote).
Example:
$ bash -xc 'echo before{1,2}after and_sth_else'
+ echo before1after before2after and_sth_else
before1after before2after and_sth_else
Note that there is a space between echo and the arguments. This is not the case in what you posted. So what happened there? Check the following:
$ bash -xc 'man{1,2}'
+ man1 man2
bash: man1: command not found
The result of the expansion is fed to bash and bash tries to execute it. In the above case, the command it looks for is man1 (which does not exist).
Finally, combine the above to your question:
echo{,}
{,} expands to two empty elements/strings.
These are then prefixed/concatenated with "echo", so we now have echo echo.
Expansion is finished and this is given to bash to execute.
The command is echo and its first argument is "echo"... so it echoes echo!
echo{,}
is printing just echo because it is equivalent to echo echo.
More examples to clarify:
bash -xc 'echo{,}'
+ echo echo
echo
echo foo{,}
foo foo
echo foo{,,}
foo foo foo
More about Brace Expansion
Brace expansion is a mechanism by which arbitrary strings may be generated. This mechanism is similar to pathname expansion, but the filenames generated
need not exist. Patterns to be brace expanded take the form of an optional preamble, followed by either a series of comma-separated strings or a sequence
expression between a pair of braces, followed by an optional postscript. The preamble is prefixed to each string contained within the braces, and the
postscript is then appended to each resulting string, expanding left to right.
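For instance, combining a preamble, a comma-separated list and a postscript (an extra illustration, not from the answers above):
$ echo img_{small,large}.png
img_small.png img_large.png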
The {item1,item2,...} is a brace expansion.
So echo{,} is expanded as echo echo because {,} has two (empty) elements, then echo prints its argument.
Try this :
$ set -x
$ echo{,}
+ echo echo
echo
$ set +x
+ set +x
$
It's also handy to generate "cross products" without nested loops:
$ ary=( {1,2,3}{a,b,c} )
$ declare -p ary
declare -a ary=([0]="1a" [1]="1b" [2]="1c" [3]="2a" [4]="2b" [5]="2c" [6]="3a" [7]="3b" [8]="3c")

Strange issue resolving bash environmental variable in nested double quotes

I have a setup script that needs to be run remotely on an arbitrary machine (which can be Windows). So I had something along the lines of bash -c "do things that need environmental variables".
I found some strange things happening with nested quotes + environmental variables that I don't understand (demonstrated below).
# This worked because my environment was polluted.
bash -c "NAME=me echo $NAME"
> me
# I think this was a weird cross platform issue with how I was running.
# I couldn't reproduce it locally.
bash -c "NAME=me echo "Hi $NAME""
> Hi $NAME
# This was my workaround, and I have no clue why this works.
# I get that "Start "" end" does string concatenation in bash,
# but I have no clue why that would make this print 'Hi me' instead
# of 'Hi'.
#
# This works because echo Hi name prints "Hi name". I thought echo only
# took the first argument passed in.
bash -c "NAME=me echo Hi "" $NAME"
> Hi me
# This is the same as the first case. NAME was just empty this time.
bash -c "NAME=me echo Hi $NAME"
> Hi
Edit: A bunch of people have pointed out that the variables get expanded in double quotes before bash -c gets run. This makes sense, but I feel like it doesn't explain why case 1 works.
shouldn't bash -c "NAME=me echo $NAME" be expanded to bash -c "NAME=me echo ", since NAME isn't set before we run this?
Edit 2: A bunch of this stuff worked because my environment was polluted. I've tried to describe what mistakes I made in my assumptions
There are at least three sources of confusion here: quotes don't (generally) nest, $variable references are expanded by the shell even if they're in double-quotes, and variable references are resolved before var=value assignments are done.
Let me look at the second problem first. Here's an interactive example showing the effect:
$ NAME=Gordon
$ bash -c "NAME=me echo $NAME"
Gordon
Here, the outer (interactive) shell expanded $NAME before passing it to bash -c, so the command essentially became bash -c "NAME=me echo Gordon". There are several ways to avoid this: you can escape the $ to remove its normal effect (but the escape gets removed, so the inner shell will see it and apply it normally), or use single-quotes instead of double (which remove the special effect of all characters, except for another single-quote which ends the single-quoted string). So let's try those:
$ bash -c "NAME=me echo \$NAME"
$ bash -c 'NAME=me echo $NAME'
(You can't really see it, but there's a blank line after the second command as well, because it didn't print anything either.) What happened here is that the inner shell (the one created by bash -c) indeed got the command NAME=me echo $NAME, but when executing it expands $NAME first (giving nothing, because it's not defined in that shell), and then executes NAME=me echo which runs the echo command with NAME set to "me" in its environment. Let's try that interactively:
$ NAME=me echo $NAME
Gordon
(Remember that I set NAME=Gordon in my interactive shell earlier.) To get the intended effect, you'd need to set NAME and then as a separate command use it in an echo command:
$ bash -c "NAME=me; echo \$NAME"
me
$ bash -c 'NAME=me; echo $NAME'
me
Ok, with that out of the way let's move on to the original question about quoting. As I said, quotes don't (generally) nest. To understand what's going on, let's analyze some of the example commands. You can get a better idea how the shell interprets things by using set -x, which makes the shell print each command's equivalent just before it's executed:
$ set -x
$ bash -c "NAME=me echo "Hi $NAME""
+ bash -c 'NAME=me echo Hi' Gordon
Hi
What happened here is that the shell parsed "NAME=me echo "Hi as a double-quoted string immediately followed by two unquoted characters; since there's no gap between them, they get merged into a single argument to bash -c. It may seem a little weird having only part of an argument quoted, but it's actually entirely normal in shell syntax. It's even normal to have part of a single argument be unquoted, part single-quoted, part double-quoted, and even part in ANSI-C mode ($'ANSI-c-escaped stuff goes here').
With set -x, bash will print something equivalent to the command being executed. All of these commands are equivalent in shell syntax:
bash -c "NAME=me echo "Hi Gordon
bash -c "NAME=me echo Hi" Gordon
bash -c 'NAME=me echo Hi' Gordon
bash -c NAME=me\ echo\ Hi Gordon
bash -c NAME=me' 'echo' 'Hi Gordon
bash -c 'NAME=me'\ "echo Hi" Gordon
...and lots more. With set -x, bash will print one of these equivalents, and it just happens to choose the one with single-quotes around the entire argument.
Just for completeness, what happened to $NAME""? It's treated as an unquoted variable reference (which expands to Gordon) immediately followed by a zero-length double-quoted string, which doesn't do anything at all.
But... why does that just print "Hi"? Well, bash -c treats the next argument as a command to run, and any further arguments as the argument vector ($0, $1, etc) for that command's environment. Here's an illustration:
$ bash -c 'echo "Args: $0 $1 $2"' zeroth first second third
+ bash -c 'echo "Args: $0 $1 $2"' zeroth first second third
Args: zeroth first second
("third" doesn't get printed because the command doesn't print $3.)
Thus, when you run bash -c 'NAME=me echo Hi' Gordon, it executes NAME=me echo Hi with $0 set to "Gordon".
Ok, here's the last example I'll look at:
$ bash -c "NAME=me echo Hi "" $NAME"
+ bash -c 'NAME=me echo Hi Gordon'
Hi Gordon
What's happening here is that there's a double-quoted section "NAME=me echo Hi " immediately followed by another one, " $NAME", so they get merged into a single long argument (which happens to contain two spaces in a row -- one part of the first quoted section, one part of the second). Essentially, the "" in the middle ends one double-quotes section and immediately starts another, thus having no overall effect. And again, the shell decided to print a single-quoted equivalent rather than any of the various other possible equivalents.
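A quick way to see two adjacent double-quoted sections merge into a single word (an extra illustration, not from the original answer):
$ printf '<%s>\n' "one part "" another"
<one part  another>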
So how do we actually get this to work right? Here's what I'd actually recommend:
$ bash -c 'NAME=me; echo "Hi $NAME"'
+ bash -c 'NAME=me; echo "Hi $NAME"'
Hi me
Since the entire command string is in single-quotes, none of these problems occur. The double-quotes are just normal characters being passed as part of the argument (so double-quotes sort of nest inside single-quotes -- and vice versa -- but it's really just 'cause they're ignored), and the $ doesn't get its special meaning to the outer shell either. Oh, and the ; makes this two separate commands, so the NAME=me part can take effect before the echo "Hi $NAME" part uses it.
Another equivalent would be:
$ bash -c "NAME=me; echo \"Hi \$NAME\""
+ bash -c 'NAME=me; echo "Hi $NAME"'
Hi me
Here the escapes remove the special meanings of the $ and enclosed double-quotes. Note that the shell prints exactly the same thing as last time for its set -x output, indicating that this really is equivalent to the single-quoted version.

GNU Make: shell cat file yields contents without newlines

Makefile:
.PHONY: all
SHELL:=/usr/bin/env bash
all:
	$(eval x=$(shell cat file))
	@echo "$x"
File:
foo
bar
Output:
foo bar
How do I get the contents of the file into the make variable without losing the newlines?
You can't do this with shell, as described in its documentation.
If you have a sufficiently new version of GNU make, you can use the file function however.
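A minimal sketch of that alternative (assuming GNU make 4.2 or newer, where the read form of $(file ...) was added), combined with export so the recipe's shell sees the newlines:
# read the file verbatim into a make variable; newlines are preserved
x := $(file < file)
export x

all:
	@echo "$$x"
Here echo "$$x" prints foo and bar on separate lines, because the value reaches the shell through the environment rather than through $(shell).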
Make converts newlines in the output of $(shell ...) to spaces, as the documentation of the shell function explains:
The shell function performs the same function that backquotes (‘`’)
perform in most shells: it does command expansion. This means that it
takes as an argument a shell command and evaluates to the output of
the command. The only processing make does on the result is to convert
each newline (or carriage-return / newline pair) to a single space. If
there is a trailing (carriage-return and) newline it will simply be
removed.
So, you cannot preserve newlines from the $(shell) command directly. That being said, make does allow multiline variables using define -- but beware, attempting to use such variables in a recipe is problematic. Consider:
define x
foo
bar
endef
all:
	@echo "$x"
Make expands the $x in place, and you end up with:
all:
	@echo " foo
bar"
(where the newline is considered the end of the recipe line..).
Depending on what you want this for, you may be able to get around this by using a bash variable:
all:
	@x=$$(cat file); \
	echo "$$x"
Or potentially storing your output in a file, and referencing that when necessary.
all:
	(cat file >> output.txt)
	cat output.txt
(and yes, the last one is convoluted as written, but I'm not sure what you're trying to do, and this allows the output of your command to be persistent across recipe lines).
If the file contents are ensured not to contain any binary data, and if you're willing to do extra processing each time you access the variable, then you could:
foo:=$(shell cat file | tr '\n' '\1')
all:
	@echo "$(shell echo "$(foo)" | tr '\1' '\n')"
Note that you cannot use nulls \0, and I suspect that probably means there's a buffer overflow bug in my copy of Make.

Makefile subst variable not affected?

I want to perform a string substitution in my Makefile. I can easily do this with a string literal like so:
foo:
	echo $(subst /,-,"hello/world")
Which yields the expected:
hello-world
But when I switch to using a variable, I can't seem to get the substitution to stick:
foo:
	x="hello/world" ; \
	echo $(subst /,-,$$x)
Instead of replacing the slash with a dash, I still get the original string printed back. Can someone explain what is going on here? Does the variable need to be explicitly converted to a string literal or something?
UPDATE:
The fix based on MadScientist's answer--this will allow me to reference the modified string as a variable.
foo:
	x="hello/world" ; \
	y=`echo $$x | tr / -` ; \
	echo $$y
But instead of echo $$y this could be something more useful.
You can't combine make functions with shell variables... all make functions are expanded first, then the resulting script is passed to the shell to be run. When the shell gets the script there are no more make functions in it (and if there were, the shell wouldn't know what to do with them!)
Your subst is running on the literal string $x, which has no / so nothing to replace and results in $x, which the shell expands to the string hello/world.
If you have to work on a shell variable value you must use shell commands such as sed or tr, not make's subst function:
foo:
	x="hello/world" ; \
	echo $$x | tr / -
You could define x as a make variable:
Makefile:
x = foo bar baz
t:
	@echo $(subst bar,qux,$(x))
Output:
make
foo qux baz
Version:
make --version
GNU Make 3.81
