I am trying to build a generic task that will execute other tasks. What I need it to do is loop over directories and use each directory name to run another task for it.
This is what I have:
# GENERIC TASKS
all-%:
for BIN in `ls cmd`; do
@$(MAKE) --no-print-directory BIN=$(BIN) $*
done
But I get this error; could anyone explain how I can make it work?
➜ make all-build
for BIN in `ls cmd`; do
/bin/sh: -c: line 1: syntax error: unexpected end of file
make: *** [all-build] Error 2
UPDATE
this is what the complete flow of my Makefile looks like:
all-%:
for BIN in `ls cmd`; do \
@$(MAKE) --no-print-directory BIN=$BIN $*; \
done
build-%:
@$(MAKE) --no-print-directory BIN=$* build
build:
docker build --no-cache --build-arg BIN=$(BIN) -t $(BIN) .
Each line of a make-recipe is executed in a distinct invocation of the shell.
Your recipe fails with a shell-syntax error because this line:
for BIN in `ls cmd`; do
is not a valid shell command. Nor is the third line:
done
To have all three lines executed in a single shell you must join them
into a single shell command with make's line-continuation character \:
# GENERIC TASKS
all-%:
for BIN in `ls cmd`; do \
@$(MAKE) --no-print-directory BIN=$$BIN $*; \
done
Note also BIN=$$BIN, not $(BIN). BIN is a shell variable here, not a make variable: $$ escapes $-expansion by make, to preserve the shell-expansion $BIN.
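A tiny illustration (the demo target name is made up): make turns $$ into a single $, so the shell sees $BIN and expands it itself, whereas $(BIN) would have been expanded by make before the shell ever ran:
demo:
	for BIN in one two; do \
		echo "the shell sees: $$BIN"; \
	done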
Using ls to drive the loop in Make is an antipattern even in shell script (you want for f in cmd/* if I'm guessing correctly) but doubly so in a Makefile. A proper design would be to let make know what the dependencies are, and take it from there.
all-%: %: $(patsubst cmd/%,%,$(wildcard cmd/*))
$(MAKE) --no-print-directory -$(MAKEFLAGS) BIN=$< $*
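A sketch of how that idea could look for the all-build case from the question; it assumes the cmd/ layout and reuses the build-% and build rules from the update above, so treat it as illustrative rather than a drop-in replacement:
BINS := $(notdir $(wildcard cmd/*))

.PHONY: all-build
all-build: $(addprefix build-,$(BINS))

build-%:
	@$(MAKE) --no-print-directory BIN=$* build
Here make all-build simply depends on one build-<name> target per entry under cmd/, and no shell loop is needed.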
TESTS=test/hello.c
try:
@for t in $(TESTS); do echo $(basename $$t); done
Running "make" I get
~ make
test/hello.c
This is strange because I expected to get the base name "hello.c". Any explanation? Thanks.
The trouble is that you're trying to use the Make function basename inside the shell for loop. Make will expand the $(basename ...) call first (so $(basename $$t) becomes $t), then pass the command
for t in test/hello.c; do echo $t; done
to the shell.
This will give test/hello:
@for t in $(basename $(TESTS)); do echo $$t; done
and this will give hello.c:
@for t in $(notdir $(TESTS)); do echo $$t; done
I have a few simple targets which create some files for me.
Example:
$(MAKE_INA):
@echo Building ASM compilation flags file $(notdir $(MAKE_INA))
@$(foreach i, $(sort $(ASMFLAGS) $(PFLAGS) $(ALL_INC_DIR) $(cppGetPreProcessorDefines)), $(shell echo $i >> $@ ))
The target works fine: the file is created and the echoed text is displayed, but in that order (first the file is built, then the echo appears on the cmd.exe console).
I guess this is somehow related to output buffering, but I was not able to find a way to flush the echoes immediately.
Any hint? Is it even possible?
I am using GNU Make 4.0.
You are mixing up contexts here.
The first @echo line is a recipe line and is run by the shell when the target runs.
The second $(foreach) line is within the rule but is a make context line and is evaluated by make before running the recipe lines. Within that line $(shell) is also a make command and is run during the make expansion of the recipe instead of being run by the shell at recipe execution time.
To do what you want you can just use:
$(MAKE_INA):
@echo Building ASM compilation flags file $(notdir $(MAKE_INA))
@printf "%s\\n" $(sort $(ASMFLAGS) $(PFLAGS) $(ALL_INC_DIR) $(cppGetPreProcessorDefines)) >> $@
Which does the echoing at recipe execution time (so has the right order) and uses a single call to the printf built-in to output to the file instead of running N calls to echo.
Edit: For Windows cmd.exe compatibility you need to use echo $i >> $@ & as the $(foreach) body so that cmd.exe runs multiple commands correctly.
If you did want to keep the N echo calls then you could use:
$(MAKE_INA):
@echo Building ASM compilation flags file $(notdir $(MAKE_INA))
@$(foreach i, $(sort $(ASMFLAGS) $(PFLAGS) $(ALL_INC_DIR) $(cppGetPreProcessorDefines)), echo $i >> $@; )
Which has $(foreach) expand to echo XXX >> $@; ...; echo ZZZ >> $@; as the recipe line, which is then executed at recipe execution time.
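Putting the cmd.exe note from the edit above into the full rule, a sketch (not tested on Windows here) would be:
$(MAKE_INA):
	@echo Building ASM compilation flags file $(notdir $(MAKE_INA))
	@$(foreach i, $(sort $(ASMFLAGS) $(PFLAGS) $(ALL_INC_DIR) $(cppGetPreProcessorDefines)), echo $i >> $@ & )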
I'm stuck trying to figure out how to run a program on a set of files using GNU Make:
I have a variable that loads some filenames alike this:
FILES=$(shell ls *.pdf)
Now I want to run a program 'p' on each of the files in 'FILES', but I can't seem to figure out how to do exactly that.
An example of the 'FILES' variable would be:
"a.pdf k.pdf omg.pdf"
I've tried $(foreach,,) without any luck, and #!/bin/bash-style loops seem to fail.
You can do a shell loop within the command:
all:
for x in $(FILES) ; do \
p $$x ; \
done
(Note that only the first line of the command must start with a tab, the others can have any old whitespace.)
Here's a more Make-style approach:
TARGETS = $(FILES:=_target)
all: $(TARGETS)
#echo done
.PHONY: $(TARGETS)
$(TARGETS): %_target : %
p $*
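For the example FILES above, that static pattern rule behaves roughly as if you had written one rule per file (expanded here purely for illustration):
a.pdf_target: a.pdf
	p a.pdf
k.pdf_target: k.pdf
	p k.pdf
omg.pdf_target: omg.pdf
	p omg.pdf
Because the targets are declared .PHONY they are rebuilt every run, and make -j can process the files in parallel.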
Is there a better way to source a script, which sets env vars, from within a makefile?
FLAG ?= 0
ifeq ($(FLAG),0)
export FLAG=1
/bin/myshell -c '<source scripts here> ; $(MAKE) $@'
else
...targets...
endif
Make's default shell is /bin/sh, which does not implement source.
Changing the shell to /bin/bash makes it possible:
# Makefile
SHELL := /bin/bash
rule:
source env.sh && YourCommand
To answer the question as asked: you can't.
The basic issue is that a child process can not alter the parent's environment. The shell gets around this by not forking a new process when source'ing, but just running those commands in the current incarnation of the shell. That works fine, but make is not /bin/sh (or whatever shell your script is for) and does not understand that language (aside from the bits they have in common).
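You can see both effects in a two-line recipe (the target name demo is made up): each line gets its own shell, so nothing exported on one line survives to the next, let alone back into make itself:
demo:
	export FOO=bar          # its own /bin/sh sets and exports FOO, then that shell exits
	echo "FOO is '$$FOO'"   # a fresh shell: prints an empty value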
Chris Dodd and Foo Bah have addressed one possible workaround, so I'll suggest another (assuming you are running GNU make): post-process the shell script into make compatible text and include the result:
shell-variable-setter.make: shell-variable-setter.sh
postprocess.py $^
# ...
else
include shell-variable-setter.make
endif
messy details left as an exercise.
If your goal is to merely set environment variables for Make, why not keep it in Makefile syntax and use the include command?
include other_makefile
If you have to invoke the shell script, capture the result in a shell command:
JUST_DO_IT=$(shell source_script)
The $(shell ...) command runs when make reads the makefile, before any targets are built. However, this won't set environment variables.
If you want to set environment variables in the build, write a separate shell script that sources your environment variables and calls make. Then, in the makefile, have the targets call the new shell script.
For example, if your original makefile has target a, then you want to do something like this:
# mysetenv.sh
#!/bin/bash
. <script to source>
export FLAG=1
make "$#"
# Makefile
ifeq ($(FLAG),0)
export FLAG=1
a:
./mysetenv.sh a
else
a:
.. do it
endif
Using GNU Make 3.81 I can source a shell script from make using:
rule:
<tab>source source_script.sh && build_files.sh
build_files.sh "gets" the environment variables exported by source_script.sh.
Note that using:
rule:
<tab>source source_script.sh
<tab>build_files.sh
will not work. Each line is run in its own subshell.
This works for me. Replace env.sh with the name of the file you want to source. It works by sourcing the file in bash and writing the modified environment, after formatting it, to a file called makeenv, which is then included by the makefile.
IGNORE := $(shell bash -c "source env.sh; env | sed 's/=/:=/' | sed 's/^/export /' > makeenv")
include makeenv
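For instance, if env.sh exported a (made-up) variable FOO=bar, the generated makeenv would contain lines of the form:
export FOO:=bar
Every variable reported by env is rewritten this way, so the file also picks up things like PATH and HOME.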
Some constructs are the same in the shell and in GNU Make.
var=1234
text="Some text"
You can alter your shell script to source the defines. They must all be simple name=value types.
I.e.:
[script.sh]
. ./vars.sh
[Makefile]
include vars.sh
Then the shell script and the Makefile can share the same 'source' of information. I found this question because I was looking for a manifest of common syntax that can be used in GNU Make and shell scripts (I don't care which shell).
Edit: Shells and make understand ${var}. This means you can concatenate, etc,
var="One string"
var=${var} "Second string"
I really like Foo Bah's answer where make calls the script, and the script calls back to make. To expand on that answer I did this:
# Makefile
.DEFAULT_GOAL := all
ifndef SOME_DIR
%:
<tab>. ./setenv.sh $(MAKE) $@
else
all:
<tab>...
clean:
<tab>...
endif
--
# setenv.sh
export SOME_DIR=$PWD/path/to/some/dir
if [ -n "$1" ]; then
# The first argument is set, call back into make.
$1 $2
fi
This has the added advantage of using $(MAKE) in case anyone is using a unique make program, and will also handle any rule specified on the command line, without having to duplicate the name of each rule in the case when SOME_DIR is not defined.
If you want to get the variables into the environment, so that they are passed to child processes, then you can use bash's set -a and set +a. The former means, "When I set a variable, set the corresponding environment variable too." So this works for me:
check:
bash -c "set -a && source .env.test && set +a && cargo test"
That will pass everything in .env.test on to cargo test as environment variables.
Note that this will let you pass an environment on to sub-commands, but it won't let you set Makefile variables (which are different things anyway). If you need the latter, you should try one of the other suggestions here.
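For example, .env.test can be a plain list of VAR=value lines (the values below are invented):
# .env.test
DATABASE_URL=postgres://localhost/my_app_test
RUST_LOG=debug
With set -a in effect, each assignment is exported as it is sourced, so cargo test sees them in its environment.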
My solution to this (assuming you have bash; the syntax for $@ is different for tcsh, for instance):
Have a script sourceThenExec.sh, as such:
#!/bin/bash
source whatever.sh
$@
Then, in your makefile, prefix your commands with bash sourceThenExec.sh, for instance:
ExampleTarget:
bash sourceThenExec.sh gcc ExampleTarget.C
You can of course put something like STE=bash sourceThenExec.sh at the top of your makefile and shorten this:
ExampleTarget:
$(STE) gcc ExampleTarget.C
All of this works because sourceThenExec.sh opens a subshell, but then the commands are run in the same subshell.
The downside of this method is that the file gets sourced for each target, which may be undesirable.
Depending on your version of Make and enclosing shell, you can implement a nice solution via eval, cat, and chaining calls with &&:
ENVFILE=envfile
source-via-eval:
#echo "FOO: $${FOO}"
#echo "FOO=AMAZING!" > $(ENVFILE)
#eval `cat $(ENVFILE)` && echo "FOO: $${FOO}"
And a quick test:
> make source-via-eval
FOO:
FOO: AMAZING!
An elegant solution found here:
ifneq (,$(wildcard ./.env))   # only if a .env file exists
include .env                  # pull in KEY=value lines as make variables
export                        # a bare 'export' marks every make variable for export to child processes
endif
If you need only a few known variables, exporting them in the makefile can be an option. Here is an example of what I am using.
$ grep ID /etc/os-release
ID=ubuntu
ID_LIKE=debian
$ cat Makefile
default: help rule/setup/lsb
source?=.
help:
-${MAKE} --version | head -n1
rule/setup/%:
echo ID=${@F}
rule/setup/lsb: /etc/os-release
${source} $< && export ID && ${MAKE} rule/setup/$${ID}
$ make
make --version | head -n1
GNU Make 3.81
. /etc/os-release && export ID && make rule/setup/${ID}
make[1]: Entering directory `/tmp'
echo ID=ubuntu
ID=ubuntu
--
http://rzr.online.fr/q/gnumake
Assuming GNU make, this can be done using a submake. Assuming that the shell script that exports the variables is include.sh in the current directory, move your Makefile to realmake.mk and create a new Makefile:
all:
@. ./include.sh; \
$(MAKE) -f realmake.mk $(MAKECMDGOALS)
$(MAKECMDGOALS):
+@. ./include.sh; \
$(MAKE) -f realmake.mk $(MAKECMDGOALS)
Pay attention to the ./ preceding include.sh.
Another possible way would be to create a shell script, for example run.sh, that sources the required scripts and calls make inside the script.
#!/bin/sh
source script1
source script2   # ...and so on
make
target: output_source
bash ShellScript_name.sh
Try this; it will work. The script is in the current directory.