In my Makefile's deploy target I create environment variables and I want to reuse them in the following lines:
SHELL=/bin/sh
deploy:
    export $(shell sh this-script-generate-key-values.sh | xargs)
    echo ${VAR1} # there is no variable here
    echo ${VAR2} # there is no variable here
Where:
this-script-generate-key-values.sh generates this output:
VAR1="somevalue"
VAR2="somevalue"
Why are the variables not set on the subsequent lines? How can I make this work?
Notes:
This line works: sh this-script-generate-key-values.sh | xargs
The shell must be /bin/sh (no bash)
Each line of a Makefile recipe runs in a separate shell, so you need to run all of the lines in a single shell. You also need to escape the dollar sign ($) so that variable substitution is done by the shell rather than by make.
SHELL=/bin/sh
deploy:
    export $$(this-script-generate-key-values.sh | xargs) ;\
    echo $${VAR1} ;\
    echo $${VAR2}
Just to expand on my comment -- you could output to a file, and use the file to generate your output like so:
vars.txt:
    this-script-generate-key-values.sh > $@
deploy : vars.txt
    echo VAR1=$$(sed -n 's|VAR1=\(.*\)|\1|p' vars.txt)
    echo VAR2=$$(sed -n 's|VAR2=\(.*\)|\1|p' vars.txt)
Note: you may have to generate dependencies for vars.txt or declare it .PHONY; otherwise this will not run on every invocation of make (see the sketch below).
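One way to force regeneration on every run is the conventional FORCE pseudo-target; a minimal sketch (FORCE is just a customary name, not built-in syntax):

vars.txt: FORCE
    this-script-generate-key-values.sh > $@
FORCE:

Because FORCE never exists as a file, vars.txt is rebuilt on every invocation.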
If the .ONESHELL special target appears anywhere in the makefile then all recipe lines for each target will be provided to a single invocation of the shell. Newlines between recipe lines will be preserved.
.ONESHELL:
deploy:
    export $$(this-script-generate-key-values.sh)
    echo $${VAR1}
    echo $${VAR2}
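Note that .ONESHELL is only available in GNU make 3.82 and later; older versions simply ignore the special target and keep running each recipe line in its own shell.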
Suppose I write:
SHELL:=/usr/bin/env bash
commands:
source sourceme.sh && alias
But I wanted to push that source sourceme.sh back into the SHELL declaration:
SHELL:=/usr/bin/env bash -c "source sourceme.sh" # or something along these lines
Can this be done, and if so, how?
No, you can't do that. Make takes your recipe lines and sends them to, effectively, $(SHELL) -c <recipeline>. Unfortunately the shell doesn't accept multiple -c options, so to do what you want you'd need a way for make to insert that string at the beginning of every recipe line, and there's no way to do that.
You can do it yourself, from outside your makefile, by writing your own wrapper:
$ cat wrapper.sh
#!/usr/bin/env bash
# make invokes this as: wrapper.sh -c '<recipe line>'
shift                 # drop the -c passed by make
source sourceme.sh    # set up the environment for this recipe line
eval "$@"             # run the actual recipe line
$ cat Makefile
SHELL := wrapper.sh
commands:
    alias
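Note that wrapper.sh needs to be executable (chmod +x wrapper.sh) and findable by make, either on your PATH or via an explicit path such as SHELL := ./wrapper.sh.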
I know I can run an "original" command (not alias) using either \ or "":
\ls
"ls"
This doesn't work for functions though. Also it requires me to use that syntax every time.
Is it possible in a sourced script to disable all functions/aliases from the parent process (the one which runs my script)? I.e. if a user has aliases or functions defined in their terminal, I want them disabled in my script (but of course I still want to be able to define and use aliases/functions of my own).
Types of Commands in Bash
Bash knows different types of commands which can shadow each other. The precedence of these types is:
aliases
can be defined by the user using alias cmd=...
functions
can be defined by the user using cmd() { ... }
built-ins
are implemented directly in bash and cannot be altered. help and enable list all built-ins.
Executable files in $PATH
Meaning: if you type cmd arg1 arg2 ..., bash uses the alias cmd if it is defined; otherwise the function cmd if it is defined; otherwise the built-in cmd if it is a built-in; otherwise the first executable cmd found in the directories of $PATH, if there is one; otherwise you end up with the error -bash: cmd: command not found.
Which of these cases applies for cmd can be checked using type -a cmd.
Manual Precedence Control
Bash allows you to influence which type to pick using quoting and the built-ins command and builtin.
\cmd
suppresses aliases
uses functions, built-ins, executables
command cmd
suppresses aliases and functions
uses built-ins and executables
builtin cmd
suppresses aliases, functions, and executables
uses only built-ins
enable -n cmd
disables the built-in cmd completely, such that afterwards only
aliases, functions, and executables are used
env cmd
not a bash built-in, therefore it doesn't really suppress anything but
uses only executables
Examples
Shadowing is perfectly normal. For instance, bash has its own built-in echo, but your system also has /bin/echo, and the two implementations may differ: my echo from bash 5 supports \uXXXX, while my echo from GNU coreutils 8.3 does not. The possibility of such differences becomes even clearer once you add your own implementations using aliases and functions. Here's an example in an interactive bash session ($ is the prompt):
$ echo() { printf "function echo: %s\n" "$*"; }
$ alias echo='printf "alias echo: %s %s %s\n"'
$ type -a echo
echo is aliased to `printf "alias echo: %s %s %s\n"'
echo is a function
echo ()
{
printf "function echo: %s\n" "$*"
}
echo is a shell builtin
echo is /bin/echo
$ echo -e '\u2261'
alias echo: -e \u2261
$ \echo -e '\u2261'
function echo: -e \u2261
# use the built-in (or executable file if there was no such built-in)
$ command echo -e '\u2261'
≡
$ builtin echo -e '\u2261'
≡
# use the executable /bin/echo
$ env echo -e '\u2261'
\u2261
$ enable -n echo
# use the executable /bin/echo (`command` is needed to skip the alias and function)
$ command echo -e '\u2261'
\u2261
Answering your Question
Unfortunately, I'm not aware of anything like enable that permanently disables alias and function lookup. You could try hacks like backing up all aliases and functions, removing them with unset -f and unalias, and restoring them at the end; however, unset may fail for readonly functions. The better way is to use bash -c '... functions and aliases have no effect here ...' for the parts where you don't really need the benefits of source, and to prefix everything else with command. A sketch of both follows.
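A minimal sketch of the two approaches (ls and /tmp are just stand-ins for whatever you actually need to run):

# run the block in a fresh bash: the caller's aliases (and any functions
# not exported with export -f) are simply not visible there
bash -c 'ls -l /tmp'

# or stay in the sourced script and bypass aliases/functions per command
command ls -l /tmp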
Please note: the caller who sources your script may even disable or shadow command, builtin, and so on -- therefore you can never be sure that you are actually using the commands you expect. Even writing /usr/bin/env executable or /path/to/the/executable does not help, as a function can carry that very name, and $PATH or the file system can be altered.
However, that shouldn't be your concern. The one who sources your script should be responsible for providing the correct environment.
Edit: this answer might no longer be relevant since you edited the question to clarify that the script is being sourced, not being executed in a subshell.
This happens by default. Proof:
$ function x() { echo 'hi'; }
$ x
hi
$ bash
# We are now in a subshell.
$ x
bash: x: command not found
Functions are often defined in one of the shell's startup files: .bashrc, .profile or .bash_profile. Which of these are sourced depends on whether the shell is a login shell and/or an interactive shell. A shell that is invoked to execute a shell script is neither a login shell nor an interactive shell, and in that case none of those files are sourced.
EDIT: I should have read more carefully: you don't want to source a script, you want to be sourced. The following is for the other way around:
Functions
If you source your parent script at the beginning, you can just loop through the defined functions and unset them.
declare -F will list all defined functions, but in the format declare -f functionname, so you have to extract only the name:
IFS=$'\n'
for f in $(declare -F | cut -d ' ' -f 3); do
    unset -f "$f"
done
Aliases
Aliases should not be sourced in, as I remember, but if they are there you can run
unalias -a
to unset them all.
This is printing the current dir, not the parent:
run:
    @cd ..; \
    echo $(shell pwd)
I need the parent dir in a command like:
run:
    @cd ..; \
    docker run -it --rm -p 8080:8080 -v $(shell pwd):/go/src/hello golang bash
Remember that make works by invoking a shell and sending the recipe to the shell for execution. Make doesn't have a shell "built in", so it's not running recipes directly.
The problem is that $(shell ..) is a make function. All make variables and functions are expanded before the shell is invoked (consider: the shell doesn't know how to handle make functions).
That means that a make function like $(shell ...) is first expanded and pwd is run, which gives you the current directory that the make process is running in, then the resulting string is passed to the shell for execution. So the shell sees this:
cd ..; echo /path/to/make/dir
You never need to use the $(shell ...) function inside a recipe; the recipe is already running in a shell! Instead you want to use shell syntax inside a recipe. The one caveat to this is that you have to escape dollar signs (replacing shell $ with $$) so that make doesn't interpret them as make variables. So if you write:
run:
    @cd ..; echo $$(pwd)
then make expands that string and sends this command to the shell:
cd ..; echo $(pwd)
which works as you want.
Why not use the POSIXly mandated variable PWD?
run:
    @cd ..; echo $$PWD
Save a process today!
Is there a better way to source a script, which sets env vars, from within a makefile?
FLAG ?= 0
ifeq ($(FLAG),0)
export FLAG=1
/bin/myshell -c '<source scripts here> ; $(MAKE) $@'
else
...targets...
endif
The default Makefile shell is /bin/sh, which does not implement source.
Changing shell to /bin/bash makes it possible:
# Makefile
SHELL := /bin/bash
rule:
    source env.sh && YourCommand
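If you have to keep the default /bin/sh, the POSIX equivalent of source is the dot command, which works as long as env.sh itself is POSIX-compatible:

rule:
    . ./env.sh && YourCommand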
To answer the question as asked: you can't.
The basic issue is that a child process can not alter the parent's environment. The shell gets around this by not forking a new process when source'ing, but just running those commands in the current incarnation of the shell. That works fine, but make is not /bin/sh (or whatever shell your script is for) and does not understand that language (aside from the bits they have in common).
Chris Dodd and Foo Bah have addressed one possible workaround, so I'll suggest another (assuming you are running GNU make): post-process the shell script into make compatible text and include the result:
shell-variable-setter.make: shell-variable-setter.sh
    postprocess.py $^
# ...
else
include shell-variable-setter.make
endif
Messy details left as an exercise; a rough sketch follows.
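A rough sketch of such a post-processing step, using sed instead of a separate script and assuming shell-variable-setter.sh contains only simple NAME=value lines:

shell-variable-setter.make: shell-variable-setter.sh
    sed -e 's/^/export /' -e 's/=/ := /' $^ > $@

include shell-variable-setter.make

GNU make remakes included makefiles that are out of date and then re-executes itself, so the include always reads a freshly generated file.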
If your goal is to merely set environment variables for Make, why not keep it in Makefile syntax and use the include command?
include other_makefile
If you have to invoke the shell script, capture the result in a shell command:
JUST_DO_IT=$(shell source_script)
The shell command runs before the targets; however, this won't set the environment variables.
If you want to set environment variables in the build, write a separate shell script that sources your environment variables and calls make. Then, in the makefile, have the targets call the new shell script.
For example, if your original makefile has target a, then you want to do something like this:
# mysetenv.sh
#!/bin/bash
. <script to source>
export FLAG=1
make "$#"
# Makefile
ifeq ($(FLAG),0)
export FLAG=1
a:
    ./mysetenv.sh a
else
a:
    .. do it
endif
Using GNU Make 3.81 I can source a shell script from make using:
rule:
<tab>source source_script.sh && build_files.sh
build_files.sh "gets" the environment variables exported by source_script.sh.
Note that using:
rule:
<tab>source source_script.sh
<tab>build_files.sh
will not work. Each line is run in its own shell.
This works for me. Substitute env.sh with the name of the file you want to source. It works by sourcing the file in bash and outputting the modified environment, after formatting it, to a file called makeenv, which is then included by the makefile.
IGNORE := $(shell bash -c "source env.sh; env | sed 's/=/:=/' | sed 's/^/export /' > makeenv")
include makeenv
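For illustration, if env.sh exports FOO=bar (placeholder names), the generated makeenv will contain a line like the following, which make then picks up through the include:

export FOO:=bar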
Some constructs are the same in the shell and in GNU Make.
var=1234
text="Some text"
You can alter your shell script to source the defines. They must all be simple name=value types.
Ie,
[script.sh]
. ./vars.sh
[Makefile]
include vars.sh
Then the shell script and the Makefile can share the same 'source' of information. I found this question because I was looking for a manifest of common syntax that can be used in GNU Make and shell scripts (I don't care which shell).
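For example, a vars.sh restricted to such simple assignments (the names here are only illustrative) can be consumed by both . ./vars.sh and include vars.sh:

# vars.sh -- plain name=value lines only, valid in both sh and make
version=1.2.3
builddir=build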
Edit: Shells and make both understand ${var}. This means you can build one value from another, e.g.
var=somevalue
var2=${var}.txt
I really like Foo Bah's answer where make calls the script, and the script calls back to make. To expand on that answer I did this:
# Makefile
.DEFAULT_GOAL := all
ifndef SOME_DIR
%:
<tab>. ./setenv.sh $(MAKE) $@
else
all:
<tab>...
clean:
<tab>...
endif
--
# setenv.sh
export SOME_DIR=$PWD/path/to/some/dir
if [ -n "$1" ]; then
# The first argument is set, call back into make.
$1 $2
fi
This has the added advantage of using $(MAKE), in case anyone is using a non-standard make program, and it also handles any rule specified on the command line without having to duplicate the name of each rule for the case when SOME_DIR is not defined.
If you want to get the variables into the environment, so that they are passed to child processes, then you can use bash's set -a and set +a. The former means, "When I set a variable, set the corresponding environment variable too." So this works for me:
check:
bash -c "set -a && source .env.test && set +a && cargo test"
That will pass everything in .env.test on to cargo test as environment variables.
Note that this will let you pass an environment on to sub-commands, but it won't let you set Makefile variables (which are different things anyway). If you need the latter, you should try one of the other suggestions here.
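Since set -a (allexport) and the dot command are both POSIX, the same trick also works without bash, provided .env.test itself sticks to plain POSIX shell syntax (a sketch reusing the cargo test example):

check:
    sh -c "set -a && . ./.env.test && set +a && cargo test"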
My solution to this (assuming you have bash; the syntax for $@ is different in tcsh, for instance):
Have a script sourceThenExec.sh, as such:
#!/bin/bash
source whatever.sh
"$@"
Then, in your makefile, preface your targets with bash sourceThenExec.sh, for instance:
ExampleTarget:
    bash sourceThenExec.sh gcc ExampleTarget.C
You can of course put something like STE=bash sourceThenExec.sh at the top of your makefile and shorten this:
ExampleTarget:
    $(STE) gcc ExampleTarget.C
All of this works because sourceThenExec.sh opens a subshell, but then the commands are run in the same subshell.
The downside of this method is that the file gets sourced for each target, which may be undesirable.
Depending on your version of Make and enclosing shell, you can implement a nice solution via eval, cat, and chaining calls with &&:
ENVFILE=envfile
source-via-eval:
#echo "FOO: $${FOO}"
#echo "FOO=AMAZING!" > $(ENVFILE)
#eval `cat $(ENVFILE)` && echo "FOO: $${FOO}"
And a quick test:
> make source-via-eval
FOO:
FOO: AMAZING!
An elegant solution found here:
ifneq (,$(wildcard ./.env))
include .env
export
endif
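This reads .env with make's own parser, and the bare export then marks the variables for export to recipes, so it only works if the file sticks to plain NAME=value lines that both make and the shell accept, for example (names and values are illustrative):

# .env
DATABASE_URL=postgres://localhost/mydb
PORT=8080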
If you need only a few known variables, exporting them in the makefile can be an option; here is an example of what I am using.
$ grep ID /etc/os-release
ID=ubuntu
ID_LIKE=debian
$ cat Makefile
default: help rule/setup/lsb
source?=.
help:
    -${MAKE} --version | head -n1
rule/setup/%:
    echo ID=${@F}
rule/setup/lsb: /etc/os-release
    ${source} $< && export ID && ${MAKE} rule/setup/$${ID}
$ make
make --version | head -n1
GNU Make 3.81
. /etc/os-release && export ID && make rule/setup/${ID}
make[1]: Entering directory `/tmp'
echo ID=ubuntu
ID=ubuntu
--
http://rzr.online.fr/q/gnumake
Assuming GNU make, this can be done using a sub-make. Assuming that the shell script that exports the variables is include.sh in the current directory, move your Makefile to realmake.mk and create a new Makefile:
all:
    @. ./include.sh; \
    $(MAKE) -f realmake.mk $(MAKECMDGOALS)

$(MAKECMDGOALS):
    +@. ./include.sh; \
    $(MAKE) -f realmake.mk $(MAKECMDGOALS)
Pay attention to the ./ preceding include.sh: the POSIX dot command looks the file up along $PATH, so the explicit ./ is needed to source a script from the current directory.
Another possible way would be to create an sh script, for example run.sh, source the required scripts there, and call make inside the script:
#!/bin/sh
. ./script1
. ./script2    # and so on
make
target: output_source
    bash ShellScript_name.sh
Try this; it will work. The script is inside the current directory.