Makefile: Save a variable during execution time - makefile

I'm using Makefiles with "make" for a lot of things like starting / stopping / configuring services I've written. Sometimes I'd like to read input from the user. The only ways I know are either to have the user pass input as NAME=VALUE when invoking make, or to put a command like read -p "setting X: " var ; echo $$var into the Makefile.
NAME=VALUE has the disadvantage that the user must set it manually and I can't prompt him for a value. read has the disadvantage that the value read cannot (or I don't know how it can) be saved in a make variable, so it can't be used multiple times.
Is there a way to read user input into a variable while executing a specific makefile target? (I don't want to put FILE ?= 'read -p "value: " var ; echo $$var' in the header, because the value is only needed for one target, and when I put that line in the target itself, I get the error "/bin/bash: FILE: Command not found.")

I use intermediate files for this purpose.
INPUT = dialog --inputbox 80 10 10

all: case1 case2

case1: read-input
	echo $(shell cat read-input) in case 1

case2: read-input
	echo $(shell cat read-input) in case 2

.INTERMEDIATE: read-input
read-input:
	$(INPUT) 2>$@
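The same intermediate-file trick works with plain read instead of dialog, which is what the question asked about. A minimal sketch (the target name use-it is made up for illustration):

```make
.INTERMEDIATE: read-input
read-input:
	@read -p "value: " var ; echo "$$var" > $@

use-it: read-input
	@echo "you entered: `cat read-input`"
```

Because read-input is declared .INTERMEDIATE, make deletes the file after the build, so the user is prompted again on the next run, and every target that depends on read-input within one run sees the same value.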

Related

Looping over a string list and making values lowercase

I have a Makefile, trying to loop over a series of strings in a recipe and make them lower case.
My goal: I have a series of commands I would like to run on different files with the appropriate suffix.
# files in directory: test_file1.txt test_file2.txt test_file3.txt
MYLIST = file1 file2 file3

recipe:
	for name in $(MYLIST) ; do \
	$(eval FILENAME=`echo $($name) | tr A-Z a-z`) \
	echo "Final name : test_${FILENAME}.txt" ; \
	done
My problem: FILENAME always resolves to empty:
File name: test_.txt
I hope to see:
File name: test_file1.txt
File name: test_file2.txt
File name: test_file3.txt
You cannot mix make functions and shell commands like this. Make works like this: when it decides that your target is out of date and wants to run the recipe, it first expands the entire recipe string, then sends that expanded string to the shell to run.
So in your case, the $(eval ...) (which is a make operation) is expanded exactly once, then the resulting string is passed to the shell to run. The shell runs the for loop and all the rest.
You have to use shell variables here to store values obtained by running your shell for loop. You cannot use make variables or make functions.
In general if you ever think about using $(shell ...) or $(eval ...) inside a recipe, you are probably going down the wrong road.
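In shell terms, the fix is to hold the lowercased name in a shell variable (here `lower`, a name chosen for illustration). After make expands $(MYLIST), the shell runs a loop like this (the mixed-case list is made up to show the effect of tr):

```shell
# Runs entirely in the shell: no make functions involved.
# The list would be expanded by make before the shell ever sees the loop.
for name in FILE1 File2 file3; do
  lower=$(echo "$name" | tr 'A-Z' 'a-z')
  echo "Final name : test_${lower}.txt"
done
```

Inside a Makefile recipe, every $ meant for the shell must be doubled ($$name, $${lower}) so make passes it through literally.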

Makefile: use shell command with generated content as argument

I have a make target that first calls a CAE tool which generates reports. When this is done, the target calls a Python script that should take the content of the CAE reports (or, more specifically, some grep'ed lines of the reports) as an argument.
A minimum example is
target1:
	date > ./bar.txt
	echo $(shell cat ./bar.txt)
The problem is that make expands $(shell cat ./bar.txt) before the first command has run and bar.txt has been updated. So in this minimum example, the echo prints the content of bar.txt from before the update (the date from the previous target run).
(
I know that I could simply write this example another way, without variables and the shell function call; this is just for the sake of showing the problem, where I call a tool that takes an argument from a shell call. So actually I want to do something like this:
target1:
	cae_tool_call
	report_eval.py -text "$(shell cat $(generated_report) | grep 'foo')"
where cae_tool_call generates the generated_report, and the -text "argument" cannot be filled in without an explicit call of the shell function.
)
I already tried with actual shell variables (instead of make variables), double escapes, and immediate vs. deferred variables, but I have no working solution yet. Any ideas?
#######################################
Edit to show some unexpected behavior:
I have this python script argument_example.py
import argparse

def main():
    parser = argparse.ArgumentParser()
    parser.add_argument("-r", "--reporttext", help="text string", required=True)
    args = parser.parse_args()
    if args.reporttext:
        print(args.reporttext)

main()
It just prints the text given with argument -r.
And I have these two make targets:
####################################
# this does not work
REPORTNAME := ./bar.txt

variable_report:
	date > $REPORTNAME
	python3 ./argument_example.py --reporttext "`(cat $REPORTNAME)`"

####################################
# this works
static_report:
	date > ./bar.txt
	python3 ./argument_example.py --reporttext "`(cat ./bar.txt)`"
When calling variable_report, the python scripts prints the outdated bar.txt content. When calling static_report, the python script prints the updated content.
Make recipes are already shell scripts. Never use the shell make function inside a recipe. In your first simple example, use:
target1:
	date > bar.txt
	cat bar.txt
In your other example use:
generated_report := name-of-generated-report

target1:
	cae_tool_call
	report_eval.py -text "`cat $(generated_report) | grep 'foo'`"
Or even better:
generated_report := name-of-generated-report

target1:
	cae_tool_call
	report_eval.py -text "`grep 'foo' $(generated_report)`"
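The difference is purely one of timing: make expands $(shell ...) once, before any recipe line runs, while backquotes are expanded by the shell at the moment the line executes. A minimal shell illustration (using a throwaway file bar.txt, as in the question):

```shell
# Command substitution runs when each line executes,
# so it always sees the file's current contents.
echo "first"  > bar.txt
echo "saw: `cat bar.txt`"    # prints: saw: first
echo "second" > bar.txt
echo "saw: `cat bar.txt`"    # prints: saw: second
rm -f bar.txt
```

Had `cat bar.txt` been written as $(shell cat bar.txt) in a recipe, both lines would have shown whatever bar.txt contained before the recipe started.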

How to source a variable list pulled using sqlplus in bash without creating a file

I'm trying to source a variable list which is populated into one single variable in bash.
I then want to source this single variable so that its contents (which are other variables) are available to the script.
I want to achieve this without having to spool the sqlplus output to a file and then source that file (that approach already works, as I tried it).
Please find below what I'm trying:
#!/bin/bash
var_list=$(sqlplus -S /#mydatabase << EOF
set pagesize 0
set trimspool on
set headsep off
set echo off
set feedback off
set linesize 1000
set verify off
set termout off
select varlist from table;
EOF
)
#This already works when I echo any variable from the list
#echo "$var_list" > var_list.dat
#. var_list.dat
#echo "$var1, $var2, $var3"
#Im trying to achieve the following
. $(echo "var_list")
echo "$any_variable_from_var_list"
The contents of var_list from the database are as follows:
var1="Test1"
var2="Test2"
var3="Test3"
I also tried sourcing it in other ways, such as:
. <<< $(echo "$var_list")
. $(cat "$var_list")
I'm not sure whether I now need to read in each line using a while loop.
Any advice is appreciated.
You can:
. /dev/stdin <<< "$var_list"
<<< is a here string: it redirects the data after <<< to standard input.
/dev/stdin represents standard input, so reading from file descriptor 0 is like opening /dev/stdin and calling read() on the resulting file descriptor.
Because the source command needs a filename, we pass it /dev/stdin and redirect the data to be read to standard input. That way source reads the commands from standard input, thinking it's reading from a file, while we feed it the data we want on that input.
Using /dev/stdin for tools that expect a file is quite common. I have no idea what references to give; I'll link: the bash manual on here strings, POSIX.1 base definitions 2.1.1p4 (last bullet point), the Linux kernel documentation on /dev/ directory entries, the bash manual on shell builtins, and maybe C99 7.19.3p7.
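Applied to the data in the question, a minimal sketch (bash, since here strings are a bash feature; the literal var_list below stands in for the value fetched from sqlplus):

```shell
#!/bin/bash
# Stand-in for the value the question pulls from sqlplus.
var_list='var1="Test1"
var2="Test2"
var3="Test3"'

# Source the assignments without writing a temporary file.
. /dev/stdin <<< "$var_list"

echo "$var1, $var2, $var3"   # prints: Test1, Test2, Test3
```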
I needed a way to store dotenv values in files locally and in vars for DevOps pipelines, so I could then source them into the runtime environment on demand (from a file when available, and from vars when not). More than that, I needed to store different dotenv sets in different vars and use them based on the source branch (which I load into $ENV in .gitlab-ci.yml via export ENV=${CI_COMMIT_BRANCH:=develop}). With this I'll have developEnv, qaEnv, and productionEnv, each being a var containing its appropriate dotenv contents (being redundant to be clear).
unset FOO; # Clear so we can confirm loading afterwards
ENV=develop; # Normally derived from the branch name, as described above
developEnv="VERSION=1.2.3
FOO=bar"; # A simple dotenv in a var, with a line break (as the files passed through will have)
envVarName=${ENV}Env # Our dynamic var-name
source <(cat <<< "${!envVarName}") # Using the dynamic name,
echo $FOO;
# bar

Source configuration file avoiding any execution

I'm working on a program that processes, in bash, requests made by users in a web interface. To give users flexibility, they can specify several parameters for each job; in the end the request is saved in a file with a specific name, so the bash script can perform the requested task.
This file at the end is filled like this:
ENVIRONMENT="PRO"
INTEGRATION="J050_provisioning"
FILE="*"
DIRECTORY="out"
So the bash script will source this file to perform the tasks the user requested. It works great so far, but I see a security issue with this: if a user enters malicious data, something like:
SOMEVAR="GONNAHACK $(rm -f some_important_file)"
OTHERVAR="DANGEROUZZZZZZ `while :; do sleep 10 & done`"
This will cause undesirable effects when sourcing the file :). Is there a way to prevent a sourced file from executing any code other than variable initializations? Or is the only way to grep the source file before sourcing it, to check that it is not dangerous?
Just do not source it. Make it a configuration file composed of name=value lines (without the double quotes), read each name/value pair, and assign value to name. In order not to overwrite critical variables like PATH, prefix each name with CONF_, for example.
Crude code:
while IFS='=' read -r conf_name conf_value; do
    printf -v "CONF_$conf_name" '%s' "$conf_value" \
        || echo "Invalid configuration name '$conf_name'" >&2
done < your_configuration_file.conf
Test it works:
$ echo "${!CONF_*}"
CONF_DIRECTORY CONF_ENVIRONMENT CONF_FILE CONF_INTEGRATION CONF_OTHERVAR CONF_SOMEVAR
$ printf '%s\n' "$CONF_SOMEVAR"
GONNAHACK $(rm -f some_important_file)

Create a file from a large Makefile variable

I have a list of objects in a Makefile variable called OBJECTS which is too big for the command buffer. Therefore I'm using the following method to create a file listing the objects (to pass to ar):
objects.lst:
	$(foreach OBJ,$(OBJECTS),$(shell echo "$(OBJ)">>$@))
While this works it is extremely slow (on Cygwin at least) and I don't like relying on shell commands and redirection.
Additionally, foreach is not intended for this purpose: it is evaluated before any commands are run, which means I can't, for example, rm -f objects.lst before appending.
Is there a better way? I don't want to use incremental archiving as that causes problems with multiple jobs.
The only thing I can think of is parsing the Makefile with a separate script to read the object list or storing the object list in a separate file. Both solutions have their own problems though.
Try something like:
OBJECTS := a b c d

objects.lst:
	echo > $@ <<EOF $(OBJECTS)
i.e. make use of the <<EOF functionality that is built into the shell. It does not have any max-length limitations.
In the following example I also replaced echo with a simple Perl script to split the arguments onto new lines, but this is the gist of it:
objects.lst:
	echo $(wordlist 1,99,$(OBJECTS))>$@
	echo $(wordlist 100,199,$(OBJECTS))>>$@
	echo $(wordlist 200,299,$(OBJECTS))>>$@
	echo $(wordlist 300,399,$(OBJECTS))>>$@
...
How about something like this:
OBJECTS_AM=$(filter a% b% c% d% e% f% g% h% i% j% k% l% m%,$(OBJECTS))
OBJECTS_NZ=$(filter-out a% b% c% d% e% f% g% h% i% j% k% l% m%,$(OBJECTS))
objects.lst:
	$(shell echo "$(OBJECTS_AM)">$@)
	$(shell echo "$(OBJECTS_NZ)">>$@)
You might need to split it one or two more times, but it's not that bad, especially as the distribution of file names doesn't change all that often.
Here's a patch to GNU make that lets you write a variable directly into a file.
It creates a new 'writefile' function, similar to the existing 'info' function, except that it takes a filename argument and writes to the file:
https://savannah.gnu.org/bugs/?35384
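That feature request eventually made it into mainline: GNU Make 4.0 and later ship a built-in $(file ...) function that writes (or, with >>, appends) text to a file without involving the shell, so no command-line length limit applies. A sketch, assuming GNU Make >= 4.0 (the short OBJECTS list and the archive target are illustrative only; GNU ar reads @objects.lst as a response file):

```make
OBJECTS := a.o b.o c.o d.o    # stands in for the real, very long object list

objects.lst:
	$(file >$@,$(OBJECTS))

archive.a: objects.lst
	ar cr $@ @objects.lst
```

$(file >$@,...) truncates and rewrites the file in one step, so the rm -f concern from the question disappears as well.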
