Why does the presence/absence of xargs affect the output from a subsequent echo command? - bash

Bash noob here, curious about the behavior of the following shell function (designed to quickly pack ASCII-fied hex sequences):
pack()
{
    sedF=$(Extended_SED)  # choose "-r"/"-E" for BSD/Linux flavors
    IN=$(sed 's/%//g' <<< "$1")  # allow "%41%41..."
    if [ -z $(grep '\\x' <<< "$IN") ]  # allow "4142..."
    then
        IN=$(sed -$sedF 's/(..)/\\\\x\1/g' <<< "$IN")
    else  # allow "\x41\x42..."
        IN=$(sed -$sedF 's/\\/\\\\/g' <<< "$IN")
    fi
    echo "$IN" | xargs echo -en  # pack through -e, suppress CR
    # testing
    echo "$IN"
    echo -en "$IN"  # doesn't pack
}
Both pack "\x41\x41" and pack "4141" produce the following output:
AA\\x41\\x41
\x41\x41
But when I run echo -e "\\x41\\x41" or echo -e "\x41\x41" from the command line, outside the script, I always get AA, which is puzzlingly different from the behavior of these commands when run inside the script.
So even though it's not a problem for the script, I am curious - (a) why, inside the script, is only the use of xargs leading to the desired output by echo -en? ; and (b) why does the behavior of non-xargs echo inside the script differ from that outside the script, i.e. when run from the command line?
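One way to see the mechanism at work (a diagnostic sketch added here, not part of the original script) is to print the string at each stage. xargs strips one level of backslashes while parsing its input, so the echo it runs receives \x41\x41, whereas the builtin echo -en "$IN" receives the doubled form \\x41\\x41:
# diagnostic sketch; assumes IN holds the doubled-backslash form produced by the sed above
IN='\\x41\\x41'
printf '%s\n' "$IN"                         # the builtin echo -en sees: \\x41\\x41
printf '%s\n' "$IN" | xargs printf '%s\n'   # xargs hands its command:   \x41\x41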

Related

Printing bash command line args with $@ inline

I want to add verbosity to my bash function by printing the command that it will run. What is the best way to print all arguments $@ inline?
ggtest ()
{
    echo 'git grep -n $@ -- "src/tests/*"'
    git grep -n "$@" -- "src/tests/*";
}
So that I can see an output such as:
$ ggtest "models and views"
git grep -n "models and views" -- "src/tests/*"
...
An overcomplicated version that you can cut down to support only the specific shell releases you need:
ggtest ()
{
    # note the following explicitly exits if run in a shell w/o array support
    local -a cmd || return                     # declare a function-local array
    cmd=( git grep -n "$@" -- "src/tests/*" )  # store intended command in array
    # below here, we take a different approach based on running bash version
    case $BASH_VERSION in
    '')       # no BASH_VERSION == we're running with a shell that's not bash at all
        set -x                    # enable trace logging
        : "${cmd[@]}"             # run : with our array as arguments
        { set +x; } 2>/dev/null   # silently disable tracing
        ;;
    [1-4].*)  # pre-5.0 bash does not support ${var@Q}; these logs are uglier
        { printf '%q ' "${cmd[@]}"; printf '\n'; } >&2 ;;
    *)        # modern bash; shortest syntax, prettiest output
        printf '%s\n' "${cmd[*]@Q}" >&2;;
    esac
    "${cmd[@]}"                   # execute our array
}
Note that in current shell releases printf %q will use backslashes rather than quotes for escaping, so it would change ggtest "some string" to have some\ string in the logs; not the worst thing in the world, but it's less pretty than ${array[*]@Q}'s representation.
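For a concrete feel of the difference, a minimal sketch (bash 4.4+ assumed for ${var@Q}; the sample string is made up):
s='models and views'
printf '%q\n' "$s"        # prints: models\ and\ views
printf '%s\n' "${s@Q}"    # prints: 'models and views'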

Giving relative address as an input to read in bash scripts [duplicate]

I have a variable in my bash script whose value is something like this:
~/a/b/c
Note that it is unexpanded tilde. When I do ls -lt on this variable (call it $VAR), I get no such directory. I want to let bash interpret/expand this variable without executing it. In other words, I want bash to run eval but not run the evaluated command. Is this possible in bash?
How did I manage to pass this into my script without expansion? I passed the argument in surrounding it with double quotes.
Try this command to see what I mean:
ls -lt "~"
This is exactly the situation I am in. I want the tilde to be expanded. In other words, what should I replace magic with to make these two commands identical:
ls -lt ~/abc/def/ghi
and
ls -lt $(magic "~/abc/def/ghi")
Note that ~/abc/def/ghi may or may not exist.
If the variable var is input by the user, eval should not be used to expand the tilde using
eval var=$var # Do not use this!
The reason is that the user could, by accident (or on purpose), type for example var="$(rm -rf $HOME/)", with possibly disastrous consequences.
A better (and safer) way is to use Bash parameter expansion:
var="${var/#\~/$HOME}"
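For example (the expanded home directory shown is just an assumed value):
var='~/a/b/c'
var="${var/#\~/$HOME}"
printf '%s\n' "$var"      # e.g. /home/user/a/b/c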
Due to the nature of StackOverflow, I can't just make this answer unaccepted, but in the intervening 5 years since I posted this there have been far better answers than my admittedly rudimentary and pretty bad answer (I was young, don't kill me).
The other solutions in this thread are safer and better solutions. Preferably, I'd go with either of these two:
Charles Duffy's solution
Håkon Hægland's solution
Original answer for historic purposes (but please don't use this)
If I'm not mistaken, "~" will not be expanded by a bash script in that manner because it is treated as a literal string "~". You can force expansion via eval like this.
#!/bin/bash
homedir=~
eval homedir=$homedir
echo $homedir # prints home path
Alternatively, just use ${HOME} if you want the user's home directory.
Plagiarizing myself from a prior answer, to do this robustly without the security risks associated with eval:
expandPath() {
    local path
    local -a pathElements resultPathElements
    IFS=':' read -r -a pathElements <<<"$1"
    : "${pathElements[@]}"
    for path in "${pathElements[@]}"; do
        : "$path"
        case $path in
        "~+"/*)
            path=$PWD/${path#"~+/"}
            ;;
        "~-"/*)
            path=$OLDPWD/${path#"~-/"}
            ;;
        "~"/*)
            path=$HOME/${path#"~/"}
            ;;
        "~"*)
            username=${path%%/*}
            username=${username#"~"}
            IFS=: read -r _ _ _ _ _ homedir _ < <(getent passwd "$username")
            if [[ $path = */* ]]; then
                path=${homedir}/${path#*/}
            else
                path=$homedir
            fi
            ;;
        esac
        resultPathElements+=( "$path" )
    done
    local result
    printf -v result '%s:' "${resultPathElements[@]}"
    printf '%s\n' "${result%:}"
}
...used as...
path=$(expandPath '~/hello')
Alternately, a simpler approach that uses eval carefully:
expandPath() {
    case $1 in
    ~[+-]*)
        local content content_q
        printf -v content_q '%q' "${1:2}"
        eval "content=${1:0:2}${content_q}"
        printf '%s\n' "$content"
        ;;
    ~*)
        local content content_q
        printf -v content_q '%q' "${1:1}"
        eval "content=~${content_q}"
        printf '%s\n' "$content"
        ;;
    *)
        printf '%s\n' "$1"
        ;;
    esac
}
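A few usage examples for the eval-based variant above (a sketch; the expansions shown assume typical accounts and a home directory of /home/user):
expandPath '~/hello'        # -> /home/user/hello
expandPath '~root/.bashrc'  # -> /root/.bashrc on most systems
expandPath '~+/build'       # -> $PWD/build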
How about this:
path=`realpath "$1"`
Or:
path=`readlink -f "$1"`
A safe way to use eval is "$(printf "~/%q" "$dangerous_path")". Note that this is bash-specific.
#!/bin/bash
relativepath=a/b/c
eval homedir="$(printf "~/%q" "$relativepath")"
echo $homedir # prints home path
See this question for details
Also, note that under zsh this would be as simple as echo ${~dangerous_path}
Here is a ridiculous solution:
$ echo "echo $var" | bash
An explanation of what this command does:
create a new instance of bash, by... calling bash;
take the string "echo $var" and substitute $var with the value of the variable (thus after the substitution the string will contain the tilde);
take the string produced by step 2 and send it to the instance of bash created in step one, which we do here by calling echo and piping its output with the | character.
Basically the current bash instance we're running takes our place as the user of another bash instance and types in the command "echo ~..." for us.
Expanding (no pun intended) on birryree's and halloleo's answers: The general approach is to use eval, but it comes with some important caveats, namely spaces and output redirection (>) in the variable. The following seems to work for me:
mypath="$1"
if [ -e "`eval echo ${mypath//>}`" ]; then
echo "FOUND $mypath"
else
echo "$mypath NOT FOUND"
fi
Try it with each of the following arguments:
'~'
'~/existing_file'
'~/existing file with spaces'
'~/nonexistant_file'
'~/nonexistant file with spaces'
'~/string containing > redirection'
'~/string containing > redirection > again and >> again'
Explanation
The ${mypath//>} strips out > characters which could clobber a file during the eval.
The eval echo ... is what does the actual tilde expansion
The double-quotes around the -e argument are for support of filenames with spaces.
Perhaps there's a more elegant solution, but this is what I was able to come up with.
Why not delve straight into getting the user's home directory with getent?
$ getent passwd mike | cut -d: -f6
/users/mike
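The same idea can be wrapped for whichever user you care about — a small sketch, assuming getent is available (defaulting to the current user is my addition):
user=${1:-$(id -un)}                          # default to the current user
home=$(getent passwd "$user" | cut -d: -f6)   # 6th field of the passwd entry is the home dir
printf '%s\n' "$home"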
I believe this is what you're looking for
magic() { # returns unexpanded tilde expression on invalid user
    local _safe_path; printf -v _safe_path "%q" "$1"
    eval "ln -sf ${_safe_path#\\} /tmp/realpath.$$"
    readlink /tmp/realpath.$$
    rm -f /tmp/realpath.$$
}
Example usage:
$ magic ~nobody/would/look/here
/var/empty/would/look/here
$ magic ~invalid/this/will/not/expand
~invalid/this/will/not/expand
Here is the POSIX function equivalent of Håkon Hægland's Bash answer
expand_tilde() {
    tilde_less="${1#\~/}"
    [ "$1" != "$tilde_less" ] && tilde_less="$HOME/$tilde_less"
    printf '%s' "$tilde_less"
}
2017-12-10 edit: add '%s' per @CharlesDuffy in the comments.
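Usage sketch (assuming HOME=/home/user):
expand_tilde '~/project'    # prints /home/user/project
expand_tilde '/tmp/x'       # prints /tmp/x unchanged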
Here's my solution:
#!/bin/bash
expandTilde()
{
    local tilde_re='^(~[A-Za-z0-9_.-]*)(.*)'
    local path="$*"
    local pathSuffix=
    if [[ $path =~ $tilde_re ]]
    then
        # only use eval on the ~username portion !
        path=$(eval echo ${BASH_REMATCH[1]})
        pathSuffix=${BASH_REMATCH[2]}
    fi
    echo "${path}${pathSuffix}"
}
result=$(expandTilde "$1")
echo "Result = $result"
Simplest: replace 'magic' with 'eval echo'.
$ eval echo "~"
/whatever/the/f/the/home/directory/is
Problem: You're going to run into issues with other variables because eval is evil. For instance:
$ # home is /Users/Hacker$(s)
$ s="echo SCARY COMMAND"
$ eval echo $(eval echo "~")
/Users/HackerSCARY COMMAND
Note that the issue of the injection doesn't happen on the first expansion. So if you were to simply replace magic with eval echo, you should be okay. But if you do echo $(eval echo ~), that would be susceptible to injection.
Similarly, if you do eval echo ~ instead of eval echo "~", that would count as twice expanded and therefore injection would be possible right away.
For anyone's reference, a function to mimic python's os.path.expanduser() behavior (no eval usage):
# _expand_homedir_tilde ~/.vim
/root/.vim
# _expand_homedir_tilde ~myuser/.vim
/home/myuser/.vim
# _expand_homedir_tilde ~nonexistent/.vim
~nonexistent/.vim
# _expand_homedir_tilde /full/path
/full/path
And the function:
function _expand_homedir_tilde {
    (
        set -e
        set -u
        p="$1"
        if [[ "$p" =~ ^~ ]]; then
            u=`echo "$p" | sed 's|^~\([a-z0-9_-]*\)/.*|\1|'`
            if [ -z "$u" ]; then
                u=`whoami`
            fi
            h=$(set -o pipefail; getent passwd "$u" | cut -d: -f6) || exit 1
            p=`echo "$p" | sed "s|^~[a-z0-9_-]*/|${h}/|"`
        fi
        echo $p
    ) || echo $1
}
Just to extend birryree's answer for paths with spaces: you cannot use the eval command as-is because it separates the evaluation by spaces. One solution is to replace the spaces temporarily for the eval command:
mypath="~/a/b/c/Something With Spaces"
expandedpath=${mypath// /_spc_}        # replace spaces temporarily
eval expandedpath=${expandedpath}      # expand the tilde
expandedpath=${expandedpath//_spc_/ }  # put the spaces back
echo "$expandedpath"     # prints e.g. /Users/fred/a/b/c/Something With Spaces
ls -lt "$expandedpath" # outputs dir content
This example relies of course on the assumption that mypath never contains the char sequence "_spc_".
You might find this easier to do in python.
(1) From the unix command line:
python -c 'import os; import sys; print os.path.expanduser(sys.argv[1])' ~/fred
Results in:
/Users/someone/fred
(2) Within a bash script as a one-off - save this as test.sh:
#!/usr/bin/env bash
thepath=$(python -c 'import os; import sys; print os.path.expanduser(sys.argv[1])' $1)
echo $thepath
Running bash ./test.sh results in:
/Users/someone/fred
(3) As a utility - save this as expanduser somewhere on your path, with execute permissions:
#!/usr/bin/env python
import sys
import os
print os.path.expanduser(sys.argv[1])
This could then be used on the command line:
expanduser ~/fred
Or in a script:
#!/usr/bin/env bash
thepath=$(expanduser $1)
echo $thepath
Just use eval correctly: with validation.
case $1${1%%/*} in
([!~]*|"$1"?*[!-+_.[:alnum:]]*|"") ! :;;
(*/*) set "${1%%/*}" "${1#*/}" ;;
(*) set "$1"
esac && eval "printf '%s\n' $1${2+/\"\$2\"}"
I have done this with variable parameter substitution after reading in the path using read -e (among others). So the user can tab-complete the path, and if the user enters a ~ path it gets sorted.
read -rep "Enter a path: " -i "${testpath}" testpath
testpath="${testpath/#~/${HOME}}"
ls -al "${testpath}"
The added benefit is that if there is no tilde nothing happens to the variable, and if there is a tilde but not in the first position it is also ignored.
(I include the -i for read since I use this in a loop so the user can fix the path if there is a problem.)
For some reason, when the string is already quoted, only Perl saves the day:
#val="${val/#\~/$HOME}" # for some reason does not work !!
val=$(echo $val|perl -ne 's|~|'$HOME'|g;print')
I think that
thepath=( ~/abc/def/ghi )
is easier than all the other solutions... or am I missing something? It works even if the path does not really exist.
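A quick check of that suggestion (paths assumed); it works because the tilde is written literally and unquoted inside the array assignment:
thepath=( ~/abc/def/ghi )
printf '%s\n' "${thepath[0]}"   # e.g. /home/user/abc/def/ghi
Note that it only helps when the ~ appears literally in the assignment; a tilde already stored inside a quoted variable will not expand this way.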

Can I make a shell function in a pipeline conditionally "disappear", without using cat?

I have a bash script that produces some text from a pipe of commands. Based on a command line option I want to do some validation on the output. For a contrived example...
CHECK_OUTPUT=$1
...
check_output()
{
    if [[ "$CHECK_OUTPUT" != "--check" ]]; then
        # Don't check the output. Passthrough and return.
        cat
        return 0
    fi
    # Check each line exists in the fs root
    while read line; do
        if [[ ! -e "/$line" ]]; then
            echo "Error: /$line does not exist"
            return 1
        fi
        echo "$line"
    done
    return 0
}
ls /usr | grep '^b' | check_output
[EDIT] better example: https://stackoverflow.com/a/52539364/1888983
This is really useful, particularly if I have multiple functions that can become passthroughs. Yes, I could move the CHECK_OUTPUT conditional and create a pipe with or without check_output, but I'd need to write a line for each combination as more functions are added. If there are better ways to dynamically build a pipe I'd like to know.
The problem is the "useless use of cat". Can this be avoided and make check_output like it wasn't in the pipe at all?
Yes, you can do this -- by making your function a wrapper that conditionally injects a pipeline element, instead of being an unconditional pipeline element itself. For example:
maybe_checked() {
    if [[ $CHECK_OUTPUT != "--check" ]]; then
        "$@"    # just run our arguments as a command, as if we weren't here
    else
        # run our arguments in a process substitution, reading from stdout of same.
        # ...some changes from the original code:
        # IFS= stops leading or trailing whitespace from being stripped
        # read -r prevents backslashes from being processed
        local line    # avoid modifying $line outside our function
        while IFS= read -r line; do
            [[ -e "/$line" ]] || { echo "Error: /$line does not exist" >&2; return 1; }
            printf '%s\n' "$line"    # see https://unix.stackexchange.com/questions/65803
        done < <("$@")
    fi
}
ls /usr | maybe_checked grep '^b'
Caveat of the above code: if the pipefail option is set, you'll want to check the exit status of the process substitution to have complete parity with the behavior that would otherwise be the case. In bash version 4.3 or later (IIRC), $! is modified by process substitutions to have the relevant PID, which can be waited for to retrieve exit status.
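A minimal sketch of that caveat (assuming bash 4.4+, where the process substitution's PID lands in $! and can be waited on):
maybe_checked() {
    if [[ $CHECK_OUTPUT != "--check" ]]; then
        "$@"
    else
        local line
        while IFS= read -r line; do
            [[ -e "/$line" ]] || { echo "Error: /$line does not exist" >&2; return 1; }
            printf '%s\n' "$line"
        done < <("$@")
        wait "$!"    # propagate the wrapped command's exit status
    fi
}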
That said, this is also a use case wherein using cat is acceptable, and I'm saying this as a card-carrying member of the UUOC crowd. :)
Adopting the examples from John Kugelman's answers on the linked question:
maybe_sort() {
    if (( sort )); then
        "$@" | sort
    else
        "$@"
    fi
}
maybe_limit() {
    if [[ -n $limit ]]; then
        "$@" | head -n "$limit"
    else
        "$@"
    fi
}
printf '%s\n' "${haikus[@]}" | maybe_limit maybe_sort sed -e 's/^[ \t]*//'

Can't add a new element to an array in bash [duplicate]

In the following program, if I set the variable $foo to the value 1 inside the first if statement, it works in the sense that its value is remembered after the if statement. However, when I set the same variable to the value 2 inside an if which is inside a while statement, it's forgotten after the while loop. It's behaving like I'm using some sort of copy of the variable $foo inside the while loop and I am modifying only that particular copy. Here's a complete test program:
#!/bin/bash
set -e
set -u
foo=0
bar="hello"
if [[ "$bar" == "hello" ]]
then
    foo=1
    echo "Setting \$foo to 1: $foo"
fi
echo "Variable \$foo after if statement: $foo"
lines="first line\nsecond line\nthird line"
echo -e $lines | while read line
do
    if [[ "$line" == "second line" ]]
    then
        foo=2
        echo "Variable \$foo updated to $foo inside if inside while loop"
    fi
    echo "Value of \$foo in while loop body: $foo"
done
echo "Variable \$foo after while loop: $foo"
# Output:
# $ ./testbash.sh
# Setting $foo to 1: 1
# Variable $foo after if statement: 1
# Value of $foo in while loop body: 1
# Variable $foo updated to 2 inside if inside while loop
# Value of $foo in while loop body: 2
# Value of $foo in while loop body: 2
# Variable $foo after while loop: 1
# bash --version
# GNU bash, version 4.1.10(4)-release (i686-pc-cygwin)
echo -e $lines | while read line
...
done
The while loop is executed in a subshell, so any changes you make to the variable will not be available once the subshell exits.
Instead you can use a here string to re-write the while loop to be in the main shell process; only echo -e $lines will run in a subshell:
while read line
do
    if [[ "$line" == "second line" ]]
    then
        foo=2
        echo "Variable \$foo updated to $foo inside if inside while loop"
    fi
    echo "Value of \$foo in while loop body: $foo"
done <<< "$(echo -e "$lines")"
You can get rid of the rather ugly echo in the here-string above by expanding the backslash sequences immediately when assigning lines. The $'...' form of quoting can be used there:
lines=$'first line\nsecond line\nthird line'
while read line; do
...
done <<< "$lines"
UPDATED#2
Explanation is in Blue Moons's answer.
Alternative solutions:
Eliminate echo
while read line; do
...
done <<EOT
first line
second line
third line
EOT
Add the echo inside the here-is-the-document
while read line; do
...
done <<EOT
$(echo -e $lines)
EOT
Run echo in background:
coproc echo -e $lines
while read -u ${COPROC[0]} line; do
...
done
Redirect to a file handle explicitly (Mind the space in < <!):
exec 3< <(echo -e $lines)
while read -u 3 line; do
...
done
Or just redirect to the stdin:
while read line; do
...
done < <(echo -e $lines)
And one for chepner (eliminating echo):
arr=("first line" "second line" "third line");
for((i=0;i<${#arr[*]};++i)) { line=${arr[i]};
...
}
Variable $lines can be converted to an array without starting a new sub-shell. The characters \ and n have to be converted to some character (e.g. a real newline character) and the IFS (Internal Field Separator) variable used to split the string into array elements. This can be done like:
lines="first line\nsecond line\nthird line"
echo "$lines"
OIFS="$IFS"
IFS=$'\n' arr=(${lines//\\n/$'\n'}) # Conversion
IFS="$OIFS"
echo "${arr[@]}", Length: ${#arr[*]}
set|grep ^arr
Result is
first line\nsecond line\nthird line
first line second line third line, Length: 3
arr=([0]="first line" [1]="second line" [2]="third line")
You are asking this bash FAQ. The answer also describes the general case of variables set in subshells created by pipes:
E4) If I pipe the output of a command into read variable, why
doesn't the output show up in $variable when the read command finishes?
This has to do with the parent-child relationship between Unix
processes. It affects all commands run in pipelines, not just
simple calls to read. For example, piping a command's output
into a while loop that repeatedly calls read will result in
the same behavior.
Each element of a pipeline, even a builtin or shell function,
runs in a separate process, a child of the shell running the
pipeline. A subprocess cannot affect its parent's environment.
When the read command sets the variable to the input, that
variable is set only in the subshell, not the parent shell. When
the subshell exits, the value of the variable is lost.
Many pipelines that end with read variable can be converted
into command substitutions, which will capture the output of
a specified command. The output can then be assigned to a
variable:
grep ^gnu /usr/lib/news/active | wc -l | read ngroup
can be converted into
ngroup=$(grep ^gnu /usr/lib/news/active | wc -l)
This does not, unfortunately, work to split the text among
multiple variables, as read does when given multiple variable
arguments. If you need to do this, you can either use the
command substitution above to read the output into a variable
and chop up the variable using the bash pattern removal
expansion operators or use some variant of the following
approach.
Say /usr/local/bin/ipaddr is the following shell script:
#! /bin/sh
host `hostname` | awk '/address/ {print $NF}'
Instead of using
/usr/local/bin/ipaddr | read A B C D
to break the local machine's IP address into separate octets, use
OIFS="$IFS"
IFS=.
set -- $(/usr/local/bin/ipaddr)
IFS="$OIFS"
A="$1" B="$2" C="$3" D="$4"
Beware, however, that this will change the shell's positional
parameters. If you need them, you should save them before doing
this.
This is the general approach -- in most cases you will not need to
set $IFS to a different value.
Some other user-supplied alternatives include:
read A B C D << HERE
$(IFS=.; echo $(/usr/local/bin/ipaddr))
HERE
and, where process substitution is available,
read A B C D < <(IFS=.; echo $(/usr/local/bin/ipaddr))
Hmmm... I would almost swear that this worked for the original Bourne shell, but don't have access to a running copy just now to check.
There is, however, a very trivial workaround to the problem.
Change the first line of the script from:
#!/bin/bash
to
#!/bin/ksh
Et voila! A read at the end of a pipeline works just fine, assuming you have the Korn shell installed.
This is an interesting question and touches on a very basic concept in the Bourne shell and subshells. Here I provide a solution that is different from the previous solutions by doing some kind of filtering. I will give an example that may be useful in real life. This is a fragment for checking that downloaded files conform to a known checksum. The checksum file looks like the following (showing just 3 lines):
49174 36326 dna_align_feature.txt.gz
54757 1 dna.txt.gz
55409 9971 exon_transcript.txt.gz
The shell script:
#!/bin/sh
.....
failcnt=0 # this variable is only valid in the parent shell
# variable xx captures all the outputs from the while loop
xx=$(cat ${checkfile} | while read -r line; do
    num1=$(echo $line | awk '{print $1}')
    num2=$(echo $line | awk '{print $2}')
    fname=$(echo $line | awk '{print $3}')
    if [ -f "$fname" ]; then
        res=$(sum $fname)
        filegood=$(sum $fname | awk -v na=$num1 -v nb=$num2 -v fn=$fname '{ if (na == $1 && nb == $2) { print "TRUE"; } else { print "FALSE"; }}')
        if [ "$filegood" = "FALSE" ]; then
            failcnt=$(expr $failcnt + 1) # only in subshell
            echo "$fname BAD $failcnt"
        fi
    fi
done | tail -1) # I am only interested in the final result
# you can capture a whole bunch of texts and do further filtering
failcnt=${xx#* BAD } # I am only interested in the number
# this variable is in the parent shell
echo failcnt $failcnt
if [ $failcnt -gt 0 ]; then
    echo $failcnt files failed
else
    echo download successful
fi
The parent and subshell communicate through the echo command. You can pick some easy to parse text for the parent shell. This method does not break your normal way of thinking, just that you have to do some post processing. You can use grep, sed, awk, and more for doing so.
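A stripped-down sketch of the same pattern (the emptiness test and the EMPTY marker are made up for illustration): the subshell prints marker lines, and the parent parses them out of the command substitution.
count=$(printf '%s\n' *.txt | while read -r f; do
            [ -s "$f" ] || echo "EMPTY $f"
        done | grep -c '^EMPTY')
echo "$count empty .txt files"    # count is available in the parent shell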
I store the value in a scratch file from within the loop, and read it back outside. (The > 2 below redirects to a file named 2 in the current directory, not to stderr.)
Here var i is initially set to 1 and, after the loop, holds the last iterated value.
# reading lines of content from 2 files concatenated
# inside loop: write value of var i to the scratch file (before incrementing)
# outside: read var i from the scratch file; it has the last iterative value
f=/tmp/file1
g=/tmp/file2
i=1
cat $f $g | \
while read -r s;
do
    echo $s > /dev/null; # some work
    echo $i > 2
    let i++
done;
read -r i < 2
echo $i
Or use the heredoc method to reduce the amount of code in a subshell.
Note the iterative i value can be read outside the while loop.
i=1
while read -r s;
do
    echo $s > /dev/null
    let i++
done <<EOT
$(cat $f $g)
EOT
let i--
echo $i
How about a very simple method:
+ call your while loop in a function
  - set your value inside (nonsense, but shows the example)
  - return your value inside
+ capture your value outside
+ set outside
+ display outside
#!/bin/bash
# set -e
# set -u
# No idea why you need this, not using here
foo=0
bar="hello"
if [[ "$bar" == "hello" ]]
then
    foo=1
    echo "Setting \$foo to $foo"
fi
echo "Variable \$foo after if statement: $foo"
lines="first line\nsecond line\nthird line"
function my_while_loop
{
    echo -e $lines | while read line
    do
        if [[ "$line" == "second line" ]]
        then
            foo=2; return 2;
            # Code below won't be executed since we returned from the function in the 'if' statement;
            # we already report $foo being set to 2 via the return status anyway
            echo "Variable \$foo updated to $foo inside if inside while loop"
        fi
        echo "Value of \$foo in while loop body: $foo"
    done
}
my_while_loop; foo="$?"
echo "Variable \$foo after while loop: $foo"
Output:
Setting $foo to 1
Variable $foo after if statement: 1
Value of $foo in while loop body: 1
Variable $foo after while loop: 2
bash --version
GNU bash, version 3.2.51(1)-release (x86_64-apple-darwin13)
Copyright (C) 2007 Free Software Foundation, Inc.
Though this is an old question that has been asked several times, here's what I'm doing after hours of fiddling with here strings: the only option that worked for me was to store the value in a file inside the while loop's subshell and then retrieve it afterwards. Simple.
Use an echo statement to store the value and a cat statement to retrieve it. The user running the script must own (chown) the directory or have read-write (chmod) access to it.
# write to file
echo "1" > foo.txt
while condition; do
    if (condition); then
        # write again to file
        echo "2" > foo.txt
    fi
done
# read from file
echo "Value of \$foo in while loop body: $(cat foo.txt)"
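A slightly tidier variant of the same file-based idea, sketched with mktemp so the scratch file is private and cleaned up afterwards (the sample input is made up):
tmp=$(mktemp) || exit 1
echo "1" > "$tmp"
printf '%s\n' "first line" "second line" "third line" | while read -r line; do
    [ "$line" = "second line" ] && echo "2" > "$tmp"
done
echo "Value after the loop: $(cat "$tmp")"
rm -f -- "$tmp"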

Using shell script code passed as an argument to a function

I'm having an issue getting a command to execute through a function in a bash script.
The command [named -V|grep BIND|awk '{printf ($2);}'] works in a shell but will not set the output to a variable.
Desired output for $VER should be: 9.8.1-P1
I believe the issue is the |
However, I am receiving:
BIND 9.8.1-P1 built with '--prefix=/usr' '--mandir=/usr/share/man' '--infodir=/usr/share/info' '--sysconfdir=/etc/bind' '--localstatedir=/var' '--enable-threads' '--enable-largefile' '--with-libtool' '--enable-shared' '--enable-static' '--with-openssl=/usr' '--with-gssapi=/usr' '--with-gnu-ld' '--with-geoip=/usr' '--enable-ipv6' 'CFLAGS=-fno-strict-aliasing -DDIG_SIGCHASE -O2' 'LDFLAGS=-Wl,-Bsymbolic-functions -Wl,-z,relro' 'CPPFLAGS=-D_FORTIFY_SOURCE=2'
If you have any info, please let me know.
#!/bin/bash
PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games
function version {
    if [ `builtin type -p $3` ]; then
        VER=`$2`
        if [[ -n $VER ]]; then
            # echo "$VER" > $DIR/ver/$1
            echo "VER=$VER"
            PID=$(pidof $3)
            if [[ -n "$PID" ]]; then
                # echo "$PID" > $DIR/pid/$1
                echo "PID=$PID"
            fi
        fi
    else
        echo "$1 not installed"
    fi
}
version "bind" "named -V|grep BIND|awk '{printf ($2);}'" "named"
You want: VER=$(eval "$2") -- The quotes are very important to contain the eval'ed script as a single word.
You will also need to alter the 2nd argument:
"named -V|awk '/BIND/ {print \$2}'"
# ^^^
Without the backslash, the shell would see $2 inside double quotes and substitute it.
The grep is removed simply because it's not necessary: awk can search for patterns.
See BashFAQ #50 for a detailed discussion of why commands should not be stored in strings (and how and why this fails in practice), and BashFAQ #48 describing why eval in particular is error-prone.
A far safer approach is to store code in functions, and pass those functions by name:
get_named_version() { named -V | awk '/BIND/ {print $2}'; }
version bind get_named_version named
...will work correctly with your original function.
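A tiny self-contained illustration of the pass-a-function-name pattern (the names and the fake version string are mine, purely for demonstration):
get_fake_version() { printf '9.8.1-P1\n'; }   # stands in for: named -V | awk '/BIND/ {print $2}'
show_version() { local ver; ver=$("$2"); echo "VER for $1 = $ver"; }
show_version bind get_fake_version
# -> VER for bind = 9.8.1-P1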
