Environment variable not set in makefile

I want to trigger unit test and integration test in a Makefile, my current implementation is like this:
all: unittest integration

unittest:
	echo 'Running unittest'
	unset TYPE
	nosetests

integration:
	echo 'Running integration test'
	export TYPE=integration
	nosetests
but I'm having problems with setting environment variables. When I run make integration, the TYPE environment variable is not set; and if I set it manually with export TYPE=integration and then run make unittest, it is not unset. How can I solve this?

Each command in a recipe is run in a separate shell. The shell which runs export TYPE immediately exits; then the next command is run in a new, fresh instance, which of course does not have this setting.
The shell has specific syntax for setting a variable for the duration of one command; use that.
all: unittest integration

unittest:
	echo 'Running unittest'
	TYPE= nosetests

integration:
	echo 'Running integration test'
	TYPE=integration nosetests
Incidentally, you should not use upper case for your own variables; by convention, upper-case names are reserved for system and environment use.
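To see the per-command scoping in action outside of make, here is a minimal plain-sh sketch (TYPE is the variable from the question; the command is just echo instead of nosetests):

```shell
#!/bin/sh
# VAR=value cmd puts VAR into the environment of that one command only;
# the shell's own environment is unchanged afterwards.
unset TYPE
TYPE=integration sh -c 'echo "during: TYPE=$TYPE"'
echo "after: TYPE=${TYPE:-<unset>}"
# prints:
#   during: TYPE=integration
#   after: TYPE=<unset>
```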

How do I check that a single variable has multiple parameters populated

In a Dockerfile I have
ENV JAVA_OPTS="-DmyKeyStore=${blah1} -DmyApi=${blah2} -Dsalt=${blah3}"
The ${blah} variables are populated during our CI/CD run but I want the Docker build to fail if one of the parameters fails to get populated.
I can use the code below to check whether JAVA_OPTS as a whole isn't populated.
RUN if [ -z "$JAVA_OPTS" ]; then echo 'Environment variable JAVA_OPTS must be specified. Exiting.'; exit 1; fi
but I want to do a deeper check within that variable and fail if -DmyKeyStore=blank/null for example.
Docker's RUN uses /bin/sh -c by default rather than bash, but the ${param:?message} parameter expansion is POSIX, so it works there too: it aborts with an error when the parameter is unset or null.
Something like this:
JAVA_OPTS="-DmyKeyStore=${blah1:?} -DmyApi=${blah2:?} -Dsalt=${blah3:?}"
You can test it in an interactive shell like this:
$ : ${t:?}
bash: t: parameter null or not set
$ t=1
$ : ${t:?}
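A minimal sketch of how the guard behaves, using plain sh (blah1 is the variable name from the question; in the Dockerfile the : "${blah1:?...}" line would go inside a RUN step so the build fails):

```shell
#!/bin/sh
# ${param:?message} aborts the current shell with an error when param is
# unset or empty, which is what makes a RUN step fail the build.
unset blah1
if ( : "${blah1:?blah1 must be set}" ) 2>/dev/null; then
  echo "unset check: passed (unexpected)"
else
  echo "unset check: RUN step would fail the build"
fi
blah1=keystore.jks
( : "${blah1:?}" ) && echo "set check: build continues"
```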

Environment variables apparently not being passed to a systemd service invocation

Here is the case:
I am writing a go program.
At some point, the program calls terragrunt cli, via os.Exec().
The program is run on a machine having systemd version 232.
Up till now, I have been invoking terragrunt with some env vars exposed (required by terragrunt, as we will see below).
These env vars are passed to the login process by /etc/profile.d/terragruntvars as in
export TF_VAR_remote_state_bucket=my-bucket-name
So when I run in my terminal say terragrunt plan and by the appropriate interpolation in my tf / hcl files, I get something like (this is a debug level output, showing the actual terraform invocation terragrunt ends up performing)
terraform init -backend-config=my-bucket-name ...(more flags following)
My go program (invoking terragrunt cli via os.Exec()) runs perfectly via go run main.go
I decide to make this a systemd service as in
[Service]
ExecStart=/bin/sh -c myprogram
EnvironmentFile=/etc/myprogram/config
User=someuser
Group=somegroup
[Install]
WantedBy=multi-user.target
The program started failing miserably. Searching for the root cause, I found out that the TF_VAR_* variables were never passed to the service when running, so the terraform command ended up like
terraform init -backend-config=(this is empty, nothing here)
I thought that explicitly invoking the program via a shell, i.e. making ExecStart=/bin/sh -c myprogram, would address the problem.
Here come the weird(est) parts.
Adding these vars to EnvironmentFile=/etc/myprogram/config did not have any effect on the terragrunt execution. When I say no effect, I mean the variables did become available to the service, but the command is still broken, i.e.
terraform init -backend-config=(this is empty, nothing here)
However, the TF_VAR_* variables ARE there. I added an os.Exec("env") in my program and it did print them.
This has been driving me nuts so any hint about what might be causing this would be highly appreciated.
Just like a shell will not pass its unexported variables on to child processes:
$ X=abc
$ bash -c 'echo $X' # prints nothing
unless you export the environment variable:
$ export X
$ bash -c 'echo $X' # abc
similarly with systemd when using EnvironmentFile: to export environment variables to the executed processes, use PassEnvironment, e.g.
PassEnvironment=VAR1 VAR2 VAR3
From the docs:
PassEnvironment=
Pass environment variables set for the system service manager to executed processes.
Takes a space-separated list of variable names...
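Putting it together, a sketch of how the unit could look (the TF_VAR_* name is the one from the question; whether PassEnvironment alone suffices depends on where the variable is set for the service manager):

```ini
[Service]
ExecStart=/bin/sh -c myprogram
EnvironmentFile=/etc/myprogram/config
PassEnvironment=TF_VAR_remote_state_bucket
User=someuser
Group=somegroup

[Install]
WantedBy=multi-user.target
```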

How to set application credential details in CI?

How can I set application credential details in Buildkite so that they can be used as part of tests?
The easiest way is to store them in an agent environment hook, which is a script file you need to put on the host running the agent, and is invoked just before every job the agent runs:
# /etc/buildkite-agent/hooks/environment
set -eu
echo "--- :house_with_garden: Setting up the environment"
export APPLICATION_PASSWORD="xxx"
and then use them in your pipeline commands from the environment:
# .buildkite/pipeline.yml
steps:
  - label: Run tests
    command: ./run-tests --password="$$APPLICATION_PASSWORD"
The double-dollar escapes the variable for pipeline upload, making sure that the password is not interpolated into the YAML and then submitted to buildkite.com. It will then be interpolated once the agent runs the command.
You could also access $APPLICATION_PASSWORD within your script to avoid mentioning it in the yaml at all.
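A rough stand-in for what the agent effectively does (the hook path here is a temp file and the password a placeholder, purely for illustration): it sources the environment hook, then runs the step's command in the resulting environment.

```shell
#!/bin/sh
# Source the environment hook, then run the step's command in that
# environment; the command sees the exported variable.
hook=$(mktemp)
cat > "$hook" <<'EOF'
export APPLICATION_PASSWORD="xxx"
EOF
. "$hook"
sh -c 'echo "command sees a password of length ${#APPLICATION_PASSWORD}"'
rm -f "$hook"
# prints: command sees a password of length 3
```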
The agent environment hook works best if you're running long lived agents, or use something like the elastic-ci-stack-for-aws which has a shared environment hook for this sort of thing:
https://github.com/buildkite/elastic-ci-stack-for-aws#build-secrets
but there are a few other options, too:
https://buildkite.com/docs/pipelines/secrets

How to set env variables which contain a command using EnvInject plugin of Jenkins

I have installed the EnvInject plugin of Jenkins.
I add it in the properties content (setting it in the script content doesn't work either: it echoes nothing).
I am able to set an environment variable, e.g.:
TEST="hello world"
In shell:
echo ${TEST}
Output: hello world
But when I try to put the output of a command in my variable it doesn't work:
HOSTNAME=`hostname`
In Shell
echo ${HOSTNAME}
Output: `hostname`
Whereas when I set the environment variable in my shell (without the plugin), it works:
In Shell
HOSTNAME=`hostname`
echo ${HOSTNAME}
Output: localhost
From the job configuration you should use Inject environment variables to the build process / Evaluated Groovy script.
Depending on the configuration, you could execute a command and save its output in the map containing the environment variables (the .trim() strips the trailing newline from the command output):
return [HOSTNAME: 'hostname'.execute().text.trim()]
or run Groovy equivalent:
return [HOSTNAME: java.net.InetAddress.getLocalHost().getHostName()]

How to run regression using makefile

I have a tcsh shell. I want to compile once (VCS) and then run multiple test cases (SIMV). Earlier, for a single test,
VCS = vcs -sverilog -timescale=1ns/1ps \
	+acc +vpi ..
and
SIMV = ./simv +UVM_VERBOSITY=$(UVM_VERBOSITY) +UVM_TESTNAME=$(TESTNAME) ${vcs_waves_cmd} -l $(TESTNAME).log
were defined as constants.
I have to replace $(TESTNAME) by looping over an array. I tried the below by switching to bash, but it is causing other failures, such as make clean not working.
TESTS = ext_reg_write_read reg_write_read
regress: $(TESTS)
	$(VCS)\
	for t in $(TESTS); do \
		./simv +UVM_VERBOSITY=$(UVM_VERBOSITY) +UVM_TESTNAME=$$t ${vcs_waves_cmd} -l $$t.log; \
	done
Also, I would like to add the export shell command: export SHELL = /bin/csh -f.
My question is similar to the following: Implementing `make check` or `make test`.
I have used @J. C. Salomon's answer to make this code.
The problem is with export SHELL = /bin/csh -f, which I was changing to export SHELL = /bin/bash -f.
But finally SHELL := /bin/bash works, as answered in How can I use Bash syntax in Makefile targets? by @derobert.
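Since a Make recipe body is just shell, the per-test loop can be checked standalone; a minimal sketch (echoing the commands instead of actually invoking vcs/simv, with the test names from the question):

```shell
#!/bin/sh
# Stand-in for the regress recipe: compile once, then run simv per test.
TESTS="ext_reg_write_read reg_write_read"
echo "compile once: vcs -sverilog ..."
for t in $TESTS; do
  echo "run: ./simv +UVM_TESTNAME=$t -l $t.log"
done
```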
