How to add shell expansions inside a makefile? - makefile

For example:
SHELL=/bin/bash
ex:
echo $RANDOM
When you invoke it:
$ make ex
echo ANDOM
ANDOM
What is happening there? Is there a way to fix it?

Make interprets the $ sign as the start of one of its own variable references; here it reads $R as the (empty) make variable R, which is why only ANDOM is left. You need to escape it:
SHELL=/bin/bash
ex:
echo $$RANDOM

$(RANDOM), if I remember correctly, is the correct makefile syntax.
Edit: $(RANDOM) and ${RANDOM} are both make's own variable-reference syntax, so make expands them itself (the value may come from a make variable or from the environment make was started with).
To get the shell's RANDOM inside a recipe you still have to escape the dollar sign and write $$RANDOM, as in the answer above.
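To make the difference concrete, here is a minimal sketch (the show target name is made up): $(RANDOM) and ${RANDOM} are expanded by make before the shell ever runs, while $$RANDOM reaches bash as $RANDOM:
SHELL=/bin/bash

show:
	@echo "make expanded:  '$(RANDOM)'"    # empty unless make knows a RANDOM variable
	@echo "shell expanded: '$$RANDOM'"     # bash substitutes a random number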

Related

bash variables within variables

I wrote a little bash script to make a "shell" that prints the date before and after a command is executed:
while true
do
    date
    printf "Prompt: "
    read x
    date
    $x    # Solution: this should be eval "$x"
done
The problem is that this "shell" doesn't recognize variables in $x, so for instance echo $PWD outputs $PWD instead of the current directory. A related consequence is that cd does not work. How to fix/work around this? Thanks.
EDIT: Solved. Thanks everyone :-)
LATE EDIT: Actually the above isn't quite the solution; I needed to add the -r option to the read command, to prevent backslash sequences from being prematurely evaluated. Furthermore, the double quotes appear to be superfluous after all in this case (since the prepass would expand $x and "$x" to the same thing unless $x is empty, and eval does the same thing with no argument as it does with an empty string), but I suppose it never hurts to always include the quotes as a matter of security and good style.
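Putting the edits above together, the corrected loop would look like this:
while true
do
    date
    printf "Prompt: "
    read -r x
    date
    eval "$x"
done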
If you're just trying to print out the date before and after, you can do that with PROMPT_COMMAND=date in your .bashrc. This will print the date before your command prompt (meaning that it will also functionally be after your command, since you'll get a prompt after the command finishes). It will be on a separate line from your PS1, but you could add it to your PS1 instead if you'd rather have it there.
If you're trying to type in a literal echo $PWD and have the command stored in x print the value of that, you can do eval "$x", but most people will recommend against using eval, because anyone can type anything, like sudo rm -rf / --no-preserve-root.
If you're trying to set x to be the name of a variable, i.e., x=PWD, and you want to get the value of PWD, you can do ${!x}. This is called "indirection".
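A tiny illustration of the indirection case (the variable name PWD is just an example):
x=PWD
echo "$x"      # prints the string: PWD
echo "${!x}"   # prints the value of $PWD, i.e. the current directory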

Function and difference in make between $ and $$

In Makefiles there is sometimes the notation $ and sometimes $$. When should each be used?
$$ is used in Makefiles to escape the $, e.g. when you want to reference a variable from the environment of the shell rather than a make variable:
echo "Building in $$PWD"
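For instance, a small made-up rule that uses both forms side by side; make expands $(BUILD_DIR) itself, while $$PWD is handed to the shell untouched:
BUILD_DIR = out

build:
	@echo "make variable:  $(BUILD_DIR)"
	@echo "shell variable: $$PWD"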

Evaluate Unix Variable with multiple parts

How can I do the following in Unix:
1) Variable xxx_yyy=12345
2) Variable aaa=yyy
How can I evaluate xxx_$aaa to give me 12345?
Please advise.
In bash use indirect parameter expansion:
varname=xxx_$aaa
echo ${!varname}
However, dynamic variable names are usually tricky to handle; it's easier to use an associative array:
declare -A xxx
xxx[yyy]=12345
aaa=yyy
echo ${xxx[$aaa]}
In bash (and probably others: specify if you want another), use eval. Don't forget to escape the first $ as shown:
eval echo \$xxx_$aaa
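Putting the question's values together with the indirect expansion shown above, a minimal sketch:
xxx_yyy=12345
aaa=yyy

varname=xxx_$aaa      # varname now holds the string "xxx_yyy"
echo "${!varname}"    # prints: 12345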

Setting an environment variable before a command in Bash is not working for the second command in a pipe

In a given shell, normally I'd set a variable or variables and then run a command. Recently I learned about the concept of prepending a variable definition to a command:
FOO=bar somecommand someargs
This works... kind of. It doesn't work when you're changing a LC_* variable (which seems to affect the command, but not its arguments, for example, [a-z] char ranges) or when piping output to another command thusly:
FOO=bar somecommand someargs | somecommand2 # somecommand2 is unaware of FOO
I can prepend somecommand2 with FOO=bar as well, which works, but which adds unwanted duplication, and it doesn't help with arguments that are interpreted depending on the variable (for example, [a-z]).
So, what's a good way to do this on a single line?
I'm thinking something on the order of:
FOO=bar (somecommand someargs | somecommand2) # Doesn't actually work
I got lots of good answers! The goal is to keep this a one-liner, preferably without using export. The method using a call to Bash was best overall, though the parenthetical version with export in it was a little more compact. The method of using redirection rather than a pipe is interesting as well.
FOO=bar bash -c 'somecommand someargs | somecommand2'
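As a quick sanity check (echo and the read group below just stand in for somecommand and somecommand2), the whole pipeline runs inside the child bash, so both sides of the pipe can see FOO:
FOO=bar bash -c 'echo hello | { read -r line; echo "$line, FOO=$FOO"; }'
# prints: hello, FOO=bar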
How about exporting the variable, but only inside the subshell?:
(export FOO=bar && somecommand someargs | somecommand2)
Keith has a point; to unconditionally execute the commands, do this:
(export FOO=bar; somecommand someargs | somecommand2)
Use env.
For example, env FOO=BAR command. Note that the variable is only placed in the environment of command; your current shell's environment is unchanged once command finishes executing.
Just be careful about shell substitution happening, i.e. if you want to reference $FOO explicitly on the same command line, you may need to escape it so that your shell interpreter doesn't perform the substitution before it runs env.
$ export FOO=BAR
$ env FOO=FUBAR bash -c 'echo $FOO'
FUBAR
$ echo $FOO
BAR
You can also use eval:
FOO=bar eval 'somecommand someargs | somecommand2'
Since this answer with eval doesn't seem to please everyone, let me clarify something: when used as written, with the single quotes, it is perfectly safe. It is good as it will not launch an external process (like the accepted answer) nor will it execute the commands in an extra subshell (like the other answer).
As we get a few regular views, it's probably good to give an alternative to eval that will please everyone, and has all the benefits (and perhaps even more!) of this quick eval “trick”. Just use a function! Define a function with all your commands:
mypipe() {
somecommand someargs | somecommand2
}
and execute it with your environment variables like this:
FOO=bar mypipe
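For instance, a made-up function following the same pattern:
upcase_foo() {
    echo "$FOO" | tr '[:lower:]' '[:upper:]'
}

FOO=bar upcase_foo    # prints: BAR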
A simple approach is to make use of ;
For example:
ENV=prod; ansible-playbook -i inventories/$ENV --extra-vars "env=$ENV" deauthorize_users.yml --check
command1; command2 executes command2 after executing command1, sequentially, whether or not command1 was successful. Note that with this approach ENV is set as an ordinary variable in the current shell (and persists afterwards); it works here because the shell expands $ENV on the command line, but the variable is not exported into the command's environment.
Use a shell script:
#!/bin/bash
# myscript
export FOO=bar
somecommand someargs | somecommand2
and run it:
$ ./myscript

Bash "-e" puzzler

I’m trying to build a command string that passes a “-e” flag and another variable into another base script that is called as a subroutine, and I’ve run into a strange problem: I’m losing the “-e” portion of the string when I pass it into the subroutine. I created a couple of examples which illustrate the issue; any help?
This works as you would expect:
$echo "-e $HOSTNAME"
-e ops-wfm
This does NOT work; we lose the “-e” because it is interpreted as an option to echo:
$myFlag="-e $HOSTNAME"; echo $myFlag
ops-wfm
Adding the “\” escape character doesn’t work either; I just get the string back with the “\” in front:
$myFlag="\-e $HOSTNAME"; echo $myFlag
\-e ops-wfm
How can I prevent -e being swallowed?
Use double-quotes:
$ myFlag="-e $HOSTNAME"; echo "${myFlag}"
-e myhost.local
I use ${var} rather than $var out of habit as it means that I can add characters after the variable without the shell interpreting them as part of the variable name.
echo may not be the best example here. Most Unix commands will accept -- to mark no more switches.
$ var='-e .bashrc' ; ls -l -- "${var}"
ls: -e .bashrc: No such file or directory
Well, you could put your variable in quotes:
echo "$myFlag"
...making it equivalent to your first example, which, as you say, works just fine.
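For completeness, a small sketch of what's happening: without quotes the expansion of $myFlag is word-split, so echo receives -e as its own first argument and treats it as an option; with quotes it stays part of a single argument, and printf sidesteps the ambiguity entirely:
myFlag="-e $HOSTNAME"
echo $myFlag             # unquoted: word-split, echo consumes -e as an option
echo "$myFlag"           # quoted: -e stays inside the single argument and is printed
printf '%s\n' "$myFlag"  # printf treats the whole string as data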
