Variable substitution in a for-loop using {$var} - bash

I'm very new to bash scripting and I'm trying to practice by making this little script that simply asks for a range of numbers. I would enter e.g. 5..20 and it should print the range; however, it just echoes back whatever I enter ("5..20" in this example) and does not expand the variable. Can someone tell me what I'm doing wrong?
Script:
echo -n "Enter range of number to display using 0..10 format: "
read range
function func_printrange
{
for n in {$range}; do
echo $n
done
}
func_printrange

Brace expansion in bash does not expand parameters (unlike zsh)
You can get around this through the use of eval and command substitution $()
eval is risky: you need to sanitize the input, otherwise someone can enter a "range" like rm -rf /; and eval will run it
Don't use the function keyword; it is not POSIX (the portable form is name() { ... })
Use read's -p flag instead of a separate echo
However, for learning purposes, this is how you would do it:
read -p "Enter range of number to display using 0..10 format: " range
func_printrange()
{
for n in $(eval echo {$range}); do
echo $n
done
}
func_printrange
Note: In this case the use of eval is OK because you are only echoing the range
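If the range comes from read as above, here is a minimal sketch of the sanitizing mentioned earlier (the regex is only an assumption about what a valid start..end input looks like):
if [[ $range =~ ^[0-9]+\.\.[0-9]+$ ]]; then
    for n in $(eval echo "{$range}"); do echo "$n"; done
else
    echo "Invalid range: $range" >&2
fi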

One way is to use eval. A crude example:
for i in $(eval echo {0..$range}); do echo $i; done
the other way is to use bash's C style for loop
for((i=1;i<=20;i++))
do
...
done
The latter is also faster than the first (for example, when $range > 1,000,000).
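Tying the two together: brace expansion cannot see the variable, but the arithmetic for loop can, so assuming $range holds a single upper bound (as in the eval example above) you could write:
for (( i = 0; i <= range; i++ )); do
    echo "$i"
done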

One way to get around the lack of expansion, and skip the issues with eval is to use command substitution and seq.
Reworked function (also avoids globals):
function func_print_range
{
for n in $(seq $1 $2); do
echo $n
done
}
func_print_range $start $end
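$start and $end are assumed to exist already; one way to derive them from the original start..end input (assuming that format) is plain parameter expansion:
read -p "Enter range of number to display using 0..10 format: " range
start=${range%%..*}    # text before the ".."
end=${range##*..}      # text after the ".."
func_print_range "$start" "$end"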

Use ${} for variable expansion. In your case, it would be ${range}. You left off the $ in ${}, which is used for variable expansion and substitution.

Related

executing variable substitution/String manipulation N times

Repeating the same command n times has been asked before, but the methods in that question do not work for variable assignments.
Eg.
var='abc,xyz,mnx,duid'
for f in `seq 3`; do var=${var%,*}; done
The above works, but using it in a function as described in the other question doesn't work:
Eg.
repeat() { num="$"; shift; for f in $(seq $num); do $1; done; }
repeat 3 'var=${var%,*}'
repeat 3 "var=${var%,*}"
Doesn't work
Based on your sample data
var='abc,xyz,mnx,duid'
you can also get the same effect by concatenating the search pattern multiple times:
var=${var%,*,*,*}
That said, you could also do things like
var=${var%$(printf ',*%.0s' {1..3})}
or
n=3; var=${var%$(printf ',*%.0s' $(seq $n))}
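As a quick check of what that printf builds (the %.0s format consumes each argument without printing it, so one ',*' is emitted per argument):
printf ',*%.0s' {1..3}; echo    # ,*,*,*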
Variable expansions and assignments in the argument aren't processed again after the shell has expanded it once; you need to use eval to re-execute the string as a shell command.
You also have a typo, "$" should be "$1".
repeat() { num="$1"; shift; for f in $(seq $num); do eval "$1"; done; }
repeat 3 'var=${var%,*}'
echo "$var" # output = abc
Note that the argument needs to be in single quotes, otherwise the variable expansion will be processed once before calling the function.
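A quick demonstration of why the single quotes matter, using the corrected repeat above:
var='abc,xyz,mnx,duid'
repeat 3 'var=${var%,*}'     # expansion deferred to eval, applied on each iteration
echo "$var"                  # abc
var='abc,xyz,mnx,duid'
repeat 3 "var=${var%,*}"     # expanded once before the call, so the same value is assigned three times
echo "$var"                  # abc,xyz,mnx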

Comparing SHA-512 Salted Hashes, displays filename during execution

Long story short: during execution, I noticed that the "*" in LIST somehow displays the name of the file I was executing. Is there any way to make the code display "*" instead of the filename?
I am still a scripting newbie and could not think of a way. Please help ...
#!/bin/bash
LIST='W 2 v * %'
encr="Cd9AjUI4nGglIcP3MByrZUnu.hHBJc7.eR0o/v0A1gu0/6ztFfBxeJgKTzpgoCLptJS2NnliZLZjO40LUseED/"
salt="8899Uidd"
for i in $LIST
do
for j in $LIST
do
for k in $LIST
do
for l in $LIST
do
for a in $LIST
do
echo -n "$i$j$k$l$a "
test=`mkpasswd -m SHA-512 $i$j$k$l$a -s $salt`
if [ $test == $encr ] ; then
echo " Password is: $i$j$k$l$a"
exit
fi
done
done
done
done
done
#error displaying *
Just as echo * expands the glob * to the filenames in the current directory, a=*; for i in $a; do echo $i; done will print the paths in the current directory. You can read more about quoting and escaping in various places on the net.
You can use bash arrays to correctly quote the elements:
LIST=(W 2 v "*" %)
for i in "${LIST[#]}"; do
Notes:
Don't use backticks `, they are discouraged. Use $(..) instead.
Quote your variables expansions [ "$test" = "$encr" ] to protect from common bugs like this.
The == is a bash extension, use = to be portable.
The method for generating permutations of elements looks strange and does not scale. Consider writing a function or using another method. Even (ab)using the bash brace expansion extension echo {W,2,v,"*",%}{W,2,v,"*",%}{W,2,v,"*",%}{W,2,v,"*",%}{W,2,v,"*",%} is shorter.
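Putting those notes together, here is a sketch of just the two outer levels of the loop with an array and quoted expansions (the remaining levels follow the same pattern, and the mkpasswd call is kept exactly as in the original):
LIST=(W 2 v "*" %)
for i in "${LIST[@]}"; do
  for j in "${LIST[@]}"; do
    echo -n "$i$j "
    test=$(mkpasswd -m SHA-512 "$i$j" -s "$salt")
    if [ "$test" = "$encr" ]; then
      echo " Password is: $i$j"
      exit
    fi
  done
done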

Set a parent shell's variable from a subshell

How do I set a variable in the parent shell, from a subshell?
a=3
(a=4)
echo $a
The whole point of a subshell is that it doesn't affect the calling session. In bash a subshell is a child process, other shells differ but even then a variable setting in a subshell does not affect the caller. By definition.
Do you need a subshell? If you just need a group then use braces:
a=3
{ a=4;}
echo $a
gives 4 (be careful of the spaces in that one). Alternatively, write the variable value to stdout and capture it in the caller:
a=3
a=$(a=4;echo $a)
echo $a
Avoid using backticks ``; they are deprecated and can be difficult to read.
There is the gdb-bash-variable hack:
gdb --batch-silent -ex "attach $$" -ex 'set bind_variable("a", "4", 0)';
although that always sets a variable in the global scope, not just the parent scope
You don't. The subshell doesn't have access to its parent's environment. (At least within the abstraction that Bash provides. You could potentially try to use gdb, or smash the stack, or whatnot, to gain such access clandestinely. I wouldn't recommend that, though.)
One alternative is for the subshell to write assignment statements to a temporary file for its parent to read:
a=3
(echo 'a=4' > tmp)
. tmp
rm tmp
echo "$a"
If the problem is related to a while loop, one way to fix this is by using Process Substitution:
var=0
while read i;
do
# perform computations on $i
((var++))
done < <(find . -type f -name "*.bin" -maxdepth 1)
as shown here: https://stackoverflow.com/a/13727116/2547445
To change variables in a script called from a parent script, you can call the script preceded with a "."
(EDIT - for explanation)
In most shells "." is an alias for "source". the source command just inserts the text of another file at that position in the executing script. In the context of this question this answer avoids a sub-shell
a=3
echo $a
. ./calledScript.sh
echo $a
in calledScript.sh
a=4
Expected output
3
4
By reading the answer from #ruakh (thank you) with a temporary file approach and the comments asking for a file descriptors solution, I got the following idea:
a=3
. <(echo a=4; echo b=5)
echo $a
echo $b
It allows returning different variables at once (which could be an issue in the subshell variant of the accepted answer).
No iteration is needed,
No temporary file to take care of.
Close to the syntax proposed by the OP.
Result:
4
5
With xtrace enabled, it is visible that we are sourcing from the file descriptor created for the output of the subshell:
+ a=3
+ . /dev/fd/63 # <-- the file descriptor ;)
++ echo a=4
++ echo b=5
++ a=4
++ b=5
+ echo 4
4
+ echo 5
5
You can output the value in the subshell and assign the subshell output to a variable in the caller script:
# subshell.sh
echo Value
# caller
myvar=$(subshell.sh)
If the subshell has more to output you can separate the variable value and other messages by redirecting them into different output streams:
# subshell.sh
echo "Writing value" 1>&2
echo Value
# caller
myvar=$(subshell.sh 2>/dev/null) # or to somewhere else
echo $myvar
Alternatively, you can output variable assignments in the subshell, evaluate them in the caller script and avoid using files to exchange information:
# subshell.sh
echo "a=4"
# caller
# export $(subshell.sh) would be more secure, since export accepts name=value only.
eval $(subshell.sh)
echo $a
The last way I can think of is to use exit codes, but this only covers exchanging integer values (and only in a limited range), and it breaks the convention for interpreting exit codes (0 for success, non-zero for everything else).
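For completeness, a minimal sketch of that exit-code idea (only small non-negative integers survive the round trip, and values above 255 wrap around):
a=3
(exit 42)
a=$?
echo "$a"   # prints 42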
Instead of accessing the variable from the parent shell, change the order of the commands and use process substitution:
a=3
echo 5 | (read a)
echo $a
prints 3
a=3
read a < <(echo 5)
echo $a
prints 5
Another example:
let i=0
seq $RANDOM | while read r
do
let i=r
done
echo $i
vs
let i=0
while read r
do
let i=r
done < <(seq $RANDOM)
echo $i
Alternatively, when job control is inactive (e.g. in scripts) you can use the lastpipe shell option to achieve the same result without changing the order of the commands:
#!/bin/bash
shopt -s lastpipe
let i=0
seq $RANDOM | while read r
do
let i=r
done
echo $i
Unless you can route all I/O through pipes and file handles, basic variable updating is impossible within $(command) and any other sub-process.
Regular files, however, are bash's global variables for normal sequential processing. Note: Due to race conditions, this simple approach is not good for parallel processing.
Create an set/get/default function like this:
globalVariable() { # NEW-VALUE
# set/get/default globalVariable
if [ 0 = "$#" ]; then
# new value not given -- echo the value
[ -e "$aRam/globalVariable" ] \
&& cat "$aRam/globalVariable" \
|| printf "default-value-here"
else
# new value given -- set the value
printf "%s" "$1" > "$aRam/globalVariable"
fi
}
"$aRam" is the directory where values are stored. I like it to be a ram disk for speed and volatility:
aRam="$(mktemp -td $(basename "$0").XXX)" # temporary directory
mount -t tmpfs ramdisk "$aRam" # mount the ram disk there
trap "umount "$aRam" && rm -rf "$aRam"" EXIT # auto-eject
To read the value:
v="$(globalVariable)" # or part of any command
To set the value:
globalVariable newValue # newValue will be written to file
To unset the value:
rm -f "$aRam/globalVariable"
The only real reason for the access function is to apply a default value because cat will error given a non-existent file. It is also useful to apply other get/set logic. Otherwise, it would not be needed at all.
An ugly read method avoiding cat's non-existent file error:
v="$(cat "$aRam/globalVariable" 2>/dev/null)"
A cool feature of this mess is that you can open another terminal and examine the contents of the files while the program is running.
While it's harder to get multiple variables out of a subshell, you can set multiple variables inside a function without using globals.
You can pass the name of a variable into a function that uses local -n to turn it into a special variable called a nameref:
myfunc() {
local -n OUT=$1
local -n SIDEEFFECT=$2
OUT='foo'
SIDEEFFECT='bar'
}
myfunc A B
echo $A
> foo
echo $B
> bar
This is the technique I ended up using instead of trying to get subshell FOO=$(myfunc) to set multiple variables.
A very simple and practical method that allows multiple variables is as follows; you can optionally add parameters to the call:
function ComplexReturn(){
# do your processing...
a=123
b=456
echo -n "AAA=${a}; BBB=${b};"
}
# ... this can be internal function or any subshell command
eval $(ComplexReturn)
echo $AAA $BBB

What is the problem with my code for multiplying two numbers in a different way using Bash?

This is my first time using Bash, so bear with me. I'm using Git Bash in Windows for a college project, trying to rewrite some C code that provides an alternate way of multiplying two numbers "a" and "b" to produce "c". This is what I came up with:
#!/bin/bash
declare -i a
declare -i b
declare -i c=0
declare -i i=0 # not sure if i have to initialize this as 0?
echo "Please enter a number: "
read a
echo "Please enter a number: "
read b
for i in {1..b}
do
let "c += a"
done
echo "$a x $b = $c"
I think part of the problem is in the for loop, since it only executes once. If anyone could find the fault in my knowledge, that would be all I need.
There are problems with your loop:
You can't use {1..b}. Even if you had {1..$b} it wouldn't work because you would need an eval. It's easiest to use the seq command instead.
Your let syntax is incorrect.
Try this:
for i in $(seq 1 $b)
do
let c+=$a
done
Also, it's not necessary to declare or initialise i.
for i in {1..b}
won't work, because 'b' isn't interpreted as a variable but as a literal character to iterate to.
For instance {a..c} expands to a b c.
To make the brace expansion work:
for i in $(eval echo "{1..$b}")
The let "c += a" won't work either.
let c+=$a might work, but I like ((c+=a)) better.
Another way is this:
for ((i = 1; i <= b; i++))
do
((c += a))
done
(might need to put #!/bin/bash at the top of your script, because sh does less than bash.)
Of course, bash already has multiplication, but I guess you knew that ...
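For reference, the built-in arithmetic that last remark alludes to (no loop needed):
c=$(( a * b ))
echo "$a x $b = $c"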
If the absence of "seq" is your issue, you can replace it with something more portable, like
c=0
# Print an endless sequence of lines
yes |
# Only take the first $b lines
head -n "$b" |
# Add line number as prefix for each line
nl |
# Read the numbers into i, and the rest of the line into a dummy variable
while read i dummy; do
# Update the value of c: add $a once per line
c=`expr "$c" + "$a"`
echo "$c"
done |
# Read the last number from the while loop
tail -n 1
This should be portable to any Bourne-compatible shell. The while ... echo ... done | tail -n 1 trick is necessary only if the value of c is not preserved outside the while loop, as is the case in some, but not all, Bourne shells.
You can implement a seq replacement with a Perl one-liner, but then you might as well write all of this in Perl (or awk, or Python, or what have you).

lambda functions in bash

Is there a way to implement/use lambda functions in bash? I'm thinking of something like:
$ someCommand | xargs -L1 (lambda function)
I don't know of a way to do this, however you may be able to accomplish what you're trying to do using:
somecommand | while read -r; do echo "Something with $REPLY"; done
This will also be faster, as you won't be creating a new process for each line of text.
[EDIT 2009-07-09]
I've made two changes:
Incorporated litb's suggestion of using -r to disable backslash processing -- this means that backslashes in the input will be passed through unchanged.
Instead of supplying a variable name (such as X) as a parameter to read, we let read assign to its default variable, REPLY. This has the pleasant side-effect of preserving leading and trailing spaces, which are stripped otherwise (even though internal spaces are preserved).
From my observations, together these changes preserve everything except literal NUL (ASCII 0) characters on each input line.
[EDIT 2016-07-26]
According to commenter Evi1M4chine, setting $IFS to the empty string before running read X (e.g., with the command IFS='' read X) should also preserve spaces at the beginning and end when storing the result into $X, meaning you aren't forced to use $REPLY.
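A small demonstration of that whitespace-preserving behaviour (the brackets just make the preserved spaces visible):
printf '  hello world  \n' | while IFS='' read -r X; do echo "[$X]"; done
# prints: [  hello world  ]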
If you want true functions, and not just pipes or while loops (e.g. if you want to pass them around as if they were data), I'd just not do lambdas, and instead define throwaway functions with a recurring dummy name, to use right away and discard afterwards. Like so:
# An example map function, to use in the example below.
map() { local f="$1"; shift; for i in "$@"; do "$f" "$i"; done; }
# Lambda function [λ], passed to the map function.
λ(){ echo "Lambda sees $1"; }; map λ *
Like in proper functional languages, there’s no need to pass parameters, as you can wrap them in a closure:
# Let’s say you have a function with three parameters
# that you want to use as a lambda:
# (As in: Partial function application.)
trio(){ echo "$1 Lambda sees $3 $2"; }
# And there are two values that you want to use to parametrize a
# function that shall be your lambda.
pre="<<<"
post=">>>"
# Then you’d just wrap them in a closure, and be done with it:
λ(){ trio "$pre" "$post" "$@"; }; map λ *
I’d argue that it’s even shorter than all other solutions presented here.
What about this?
somecommand | xargs -d"\n" -I{} echo "the argument is: {}"
(assumes each argument is a line, otherwise change delimiter)
#!/bin/bash
function customFunction() {
eval $1
}
command='echo Hello World; echo Welcome;'
customFunction "$command"
GL
If you want to use xargs specifically (for its parallel -P N option, for example), with plain bash as the function body, then bash -c can be used as the command for xargs.
seq 1 10 | tr '\n' '\0' | xargs -0 -n 1 bash -c 'echo any bash code $0'
tr and the -0 option are used here to keep xargs from doing its own quote and delimiter processing on the arguments.
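Since -P was part of the motivation, here is the same snippet run in parallel (the -P 4 value is arbitrary):
seq 1 10 | tr '\n' '\0' | xargs -0 -n 1 -P 4 bash -c 'echo any bash code $0'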
Yes. One can pass around a string variable representing a command call, and then execute the command with eval.
Example:
command='echo howdy'
eval "$command"
The eval trick has been already mentioned but here's my extended example of bash closures:
#!/usr/bin/env bash
set -e
function multiplyBy() {
X="$1"
cat <<-EOF
Y="\$1"
echo "$X * \$Y = \$(( $X * \$Y ))"
EOF
}
function callFunc() {
CODE="$1"
shift
eval "$CODE"
}
MULT_BY_2=`multiplyBy 2`
MULT_BY_4=`multiplyBy 4`
callFunc "$MULT_BY_2" 10
callFunc "$MULT_BY_4" 10
PS: I just came up with this for a completely different purpose and was searching Google to see if somebody else is using it. I actually needed to evaluate a reusable function in the context (shell) of the main script.
