assigning output to a variable using echo command - shell

The code below does not give any output:
$echo `cat time`
19991213100942
$a=$(echo `cat time`) | echo $a | echo ${a:0:4}
Please tell me where I am making a mistake.

a=$(echo `cat time`)
assigns the output of the command inside $(...) to the variable a.
Later in the script, you can print the variable:
echo $a
That prints: 19991213100942
echo ${a:0:4}
That prints: 1999
You can then reference the variable as $a.

First, you don't need to echo the output of cat time: just cat time.
Second, as @Etan says (kind of), replace the pipes with semicolons or newlines:
a=$(< time) # a bash builtin, equivalent to but faster than: a=$(cat time)
echo $a
echo ${a:0:4}
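As a quick illustration of why the piped version prints nothing useful: each stage of a pipeline runs in its own subshell, so an assignment made in one stage is not visible in the next. A minimal demo (using the timestamp from the question as the value):
a=19991213100942 | echo "in a pipeline: $a"    # prints an empty value: the assignment ran in a separate subshell
a=19991213100942; echo "after a semicolon: $a" # prints 19991213100942: same shell, sequential commands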

Related

Can't add a new element to an array in bash [duplicate]

In the following program, if I set the variable $foo to the value 1 inside the first if statement, it works in the sense that its value is remembered after the if statement. However, when I set the same variable to the value 2 inside an if which is inside a while statement, it's forgotten after the while loop. It's behaving like I'm using some sort of copy of the variable $foo inside the while loop and I am modifying only that particular copy. Here's a complete test program:
#!/bin/bash
set -e
set -u
foo=0
bar="hello"
if [[ "$bar" == "hello" ]]
then
foo=1
echo "Setting \$foo to 1: $foo"
fi
echo "Variable \$foo after if statement: $foo"
lines="first line\nsecond line\nthird line"
echo -e $lines | while read line
do
if [[ "$line" == "second line" ]]
then
foo=2
echo "Variable \$foo updated to $foo inside if inside while loop"
fi
echo "Value of \$foo in while loop body: $foo"
done
echo "Variable \$foo after while loop: $foo"
# Output:
# $ ./testbash.sh
# Setting $foo to 1: 1
# Variable $foo after if statement: 1
# Value of $foo in while loop body: 1
# Variable $foo updated to 2 inside if inside while loop
# Value of $foo in while loop body: 2
# Value of $foo in while loop body: 2
# Variable $foo after while loop: 1
# bash --version
# GNU bash, version 4.1.10(4)-release (i686-pc-cygwin)
echo -e $lines | while read line
...
done
The while loop is executed in a subshell, so any changes you make to the variable will not be available once the subshell exits.
Instead you can use a here string to re-write the while loop to be in the main shell process; only echo -e $lines will run in a subshell:
while read line
do
if [[ "$line" == "second line" ]]
then
foo=2
echo "Variable \$foo updated to $foo inside if inside while loop"
fi
echo "Value of \$foo in while loop body: $foo"
done <<< "$(echo -e "$lines")"
You can get rid of the rather ugly echo in the here-string above by expanding the backslash sequences immediately when assigning lines. The $'...' form of quoting can be used there:
lines=$'first line\nsecond line\nthird line'
while read line; do
...
done <<< "$lines"
UPDATE #2
The explanation is in Blue Moons's answer.
Alternative solutions:
Eliminate echo
while read line; do
...
done <<EOT
first line
second line
third line
EOT
Add the echo inside the here document:
while read line; do
...
done <<EOT
$(echo -e $lines)
EOT
Run echo in background:
coproc echo -e $lines
while read -u ${COPROC[0]} line; do
...
done
Redirect to a file handle explicitly (Mind the space in < <!):
exec 3< <(echo -e $lines)
while read -u 3 line; do
...
done
Or just redirect to the stdin:
while read line; do
...
done < <(echo -e $lines)
And one for chepner (eliminating echo):
arr=("first line" "second line" "third line");
for((i=0;i<${#arr[*]};++i)) { line=${arr[i]};
...
}
The variable $lines can be converted to an array without starting a new sub-shell. The characters \ and n have to be converted to some other character (e.g. a real newline character), and the IFS (Internal Field Separator) variable is then used to split the string into array elements. This can be done like this:
lines="first line\nsecond line\nthird line"
echo "$lines"
OIFS="$IFS"
IFS=$'\n' arr=(${lines//\\n/$'\n'}) # Conversion
IFS="$OIFS"
echo "${arr[@]}", Length: ${#arr[*]}
set|grep ^arr
Result is
first line\nsecond line\nthird line
first line second line third line, Length: 3
arr=([0]="first line" [1]="second line" [2]="third line")
You are asking about this bash FAQ entry. The answer also describes the general case of variables set in subshells created by pipes:
E4) If I pipe the output of a command into read variable, why
doesn't the output show up in $variable when the read command finishes?
This has to do with the parent-child relationship between Unix
processes. It affects all commands run in pipelines, not just
simple calls to read. For example, piping a command's output
into a while loop that repeatedly calls read will result in
the same behavior.
Each element of a pipeline, even a builtin or shell function,
runs in a separate process, a child of the shell running the
pipeline. A subprocess cannot affect its parent's environment.
When the read command sets the variable to the input, that
variable is set only in the subshell, not the parent shell. When
the subshell exits, the value of the variable is lost.
Many pipelines that end with read variable can be converted
into command substitutions, which will capture the output of
a specified command. The output can then be assigned to a
variable:
grep ^gnu /usr/lib/news/active | wc -l | read ngroup
can be converted into
ngroup=$(grep ^gnu /usr/lib/news/active | wc -l)
This does not, unfortunately, work to split the text among
multiple variables, as read does when given multiple variable
arguments. If you need to do this, you can either use the
command substitution above to read the output into a variable
and chop up the variable using the bash pattern removal
expansion operators or use some variant of the following
approach.
Say /usr/local/bin/ipaddr is the following shell script:
#! /bin/sh
host `hostname` | awk '/address/ {print $NF}'
Instead of using
/usr/local/bin/ipaddr | read A B C D
to break the local machine's IP address into separate octets, use
OIFS="$IFS"
IFS=.
set -- $(/usr/local/bin/ipaddr)
IFS="$OIFS"
A="$1" B="$2" C="$3" D="$4"
Beware, however, that this will change the shell's positional
parameters. If you need them, you should save them before doing
this.
This is the general approach -- in most cases you will not need to
set $IFS to a different value.
Some other user-supplied alternatives include:
read A B C D << HERE
$(IFS=.; echo $(/usr/local/bin/ipaddr))
HERE
and, where process substitution is available,
read A B C D < <(IFS=.; echo $(/usr/local/bin/ipaddr))
Hmmm... I would almost swear that this worked for the original Bourne shell, but don't have access to a running copy just now to check.
There is, however, a very trivial workaround to the problem.
Change the first line of the script from:
#!/bin/bash
to
#!/bin/ksh
Et voila! A read at the end of a pipeline works just fine, assuming you have the Korn shell installed.
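A minimal sketch of the difference (this relies on ksh running the last element of a pipeline in the current shell; bash runs it in a subshell unless you enable shopt -s lastpipe with job control off):
#!/bin/ksh
foo=0
echo "second line" | while read line; do
foo=2   # survives the loop under ksh, because the while loop runs in the current shell
done
echo "foo after the loop: $foo"   # ksh prints 2; the same script under #!/bin/bash prints 0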
This is an interesting question and touches on a very basic concept in the Bourne shell and subshells. Here I provide a solution that is different from the previous solutions by doing some kind of filtering. I will give an example that may be useful in real life: a fragment for checking that downloaded files conform to a known checksum. The checksum file looks like the following (showing just 3 lines):
49174 36326 dna_align_feature.txt.gz
54757 1 dna.txt.gz
55409 9971 exon_transcript.txt.gz
The shell script:
#!/bin/sh
.....
failcnt=0 # this variable is only valid in the parent shell
#variable xx captures all the outputs from the while loop
xx=$(cat ${checkfile} | while read -r line; do
num1=$(echo $line | awk '{print $1}')
num2=$(echo $line | awk '{print $2}')
fname=$(echo $line | awk '{print $3}')
if [ -f "$fname" ]; then
res=$(sum $fname)
filegood=$(sum $fname | awk -v na=$num1 -v nb=$num2 -v fn=$fname '{ if (na == $1 && nb == $2) { print "TRUE"; } else { print "FALSE"; }}')
if [ "$filegood" = "FALSE" ]; then
failcnt=$(expr $failcnt + 1) # only in subshell
echo "$fname BAD $failcnt"
fi
fi
done | tail -1) # I am only interested in the final result
# you can capture a whole bunch of texts and do further filtering
failcnt=${xx#* BAD } # I am only interested in the number
# this variable is in the parent shell
echo failcnt $failcnt
if [ $failcnt -gt 0 ]; then
echo $failcnt files failed
else
echo download successful
fi
The parent and the subshell communicate through the echo command. You can pick some easy-to-parse text for the parent shell. This method does not break your normal way of thinking; you just have to do some post-processing. You can use grep, sed, awk, and more for that.
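A stripped-down sketch of the same capture-and-parse idea (the sample words and the BAD marker are placeholders, not taken from the original script):
out=$(printf '%s\n' one two three | while read -r w; do
n=$((${n:-0} + 1))        # this counter only exists inside the subshell
echo "seen $w BAD $n"
done | tail -1)           # keep only the last line, as in the checksum fragment
count=${out#* BAD }       # the parent parses the marker back out of the captured text
echo "count=$count"       # prints: count=3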
I store the value in a file within the loop, and read it back from that file outside. (Note that > 2 redirects to a file literally named 2, not to stderr; stderr would be >&2.)
Here var i is initially set to 1 and is written out on each iteration of the loop.
# reading lines of content from 2 files concatenated
# inside loop: write value of var i to the file named 2 (before incrementing)
# outside: read var i from that file; it holds the last iterated value
f=/tmp/file1
g=/tmp/file2
i=1
cat $f $g | \
while read -r s;
do
echo $s > /dev/null; # some work
echo $i > 2
let i++
done;
read -r i < 2
echo $i
Or use the heredoc method to reduce the amount of code in a subshell.
Note the iterative i value can be read outside the while loop.
i=1
while read -r s;
do
echo $s > /dev/null
let i++
done <<EOT
$(cat $f $g)
EOT
let i--
echo $i
How about a very simple method:
+ call your while loop in a function
 - set your value inside (nonsense, but shows the example)
 - return your value inside
+ capture your value outside
+ set it outside
+ display it outside
#!/bin/bash
# set -e
# set -u
# No idea why you need this, not using here
foo=0
bar="hello"
if [[ "$bar" == "hello" ]]
then
foo=1
echo "Setting \$foo to $foo"
fi
echo "Variable \$foo after if statement: $foo"
lines="first line\nsecond line\nthird line"
function my_while_loop
{
echo -e $lines | while read line
do
if [[ "$line" == "second line" ]]
then
foo=2; return 2;
echo "Variable \$foo updated to $foo inside if inside while loop"
fi
# Code below won't be executed once we return from the function in the 'if' statement
# The $foo value is reported through the function's return status anyway
echo "Value of \$foo in while loop body: $foo"
done
}
my_while_loop; foo="$?"
echo "Variable \$foo after while loop: $foo"
Output:
Setting $foo to 1
Variable $foo after if statement: 1
Value of $foo in while loop body: 1
Variable $foo after while loop: 2
bash --version
GNU bash, version 3.2.51(1)-release (x86_64-apple-darwin13)
Copyright (C) 2007 Free Software Foundation, Inc.
Though this is an old question that has been asked several times, here's what I'm doing after hours of fiddling with here strings; the only option that worked for me was to store the value in a file during the while loop's subshell and then retrieve it afterwards. Simple.
Use an echo statement to store the value and a cat statement (or command substitution) to retrieve it. The user running the script must have write access to the directory.
#write to file
echo "1" > foo.txt
while condition; do
if (condition); then
#write again to file
echo "2" > foo.txt
fi
done
#read from file
echo "Value of \$foo in while loop body: $(cat foo.txt)"
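A concrete, runnable variant of the same idea (the file name foo.txt and the sample lines are placeholders):
echo "1" > foo.txt        # initial value, written by the parent shell
printf '%s\n' "first line" "second line" "third line" | while read -r line; do
if [ "$line" = "second line" ]; then
echo "2" > foo.txt        # the update survives because it goes to disk, not to a shell variable
fi
done
echo "Value after the while loop: $(cat foo.txt)"   # prints: Value after the while loop: 2
rm -f foo.txt             # clean up the temporary file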

Using sed to get only the digits from all parameters to a bash script

I need some help, please.
I have this script:
#!/bin/bash
NUMBER=$(echo "$1" | sed 's/[^0-9]*//g')
echo $NUMBER
./test.sh tas1 tefst2 thgst3 ynft4 jhuf5 hjh6 jhd7
1
It returns only 1, but I need it to return 1234567.
The reason you only get "1" is because $1 is only the first parameter. Use "$*" to get a single string containing all the parameters, separated by spaces (by default).
You can also do this with bash variable substitution:
params="$*"
echo "${params//[^[:digit:]]/}"
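Putting that back into the original script, a sketch of test.sh fixed to use "$*" (same sed filter as in the question):
#!/bin/bash
NUMBER=$(echo "$*" | sed 's/[^0-9]*//g')
echo "$NUMBER"
# ./test.sh tas1 tefst2 thgst3 ynft4 jhuf5 hjh6 jhd7
# 1234567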

how to pass multi-line output from pipe to a variable in bash? without using temp file

I have a complex grep/awk/etc. command line which already uses some double quotes and some $var references, which makes it seem impossible to use
VAR="$( that command )"
to get all the output.
I don't want to create temp files, which would make it even uglier.
Is it possible to pass pipe output into a variable in bash, like
command | > $VAR
Just use the usual VAR=$( complex command ), writing the complex command exactly as you would if you were writing it on its own line.
Ex: if you have
foo=1
bar="2 3"
awk -v myfoo="$foo" -v mybar="$bar" '..... complex
awk
script here .....'
you could put that into VAR with:
foo=1
bar="2 3"
VAR="$( awk -v myfoo="$foo" -v mybar="$bar" '..... complex
awk
script here .....' )"
i.e., once inside $(...), bash reads things as if it were at the "first level". It works because $(...) is evaluated before the line containing it is evaluated.
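A tiny self-contained demonstration of that point (quotes and $var used inside $(...) do not clash with the outer quotes; the variable names are made up for the example):
foo="hello world"
VAR="$( printf '%s\n' "$foo" | awk -v myfoo="$foo" '{ print "awk saw: " myfoo }' )"
echo "$VAR"   # prints: awk saw: hello world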
You can use a for loop as follows (just a warning: the for loop splits the output on whitespace, so you'll get one word per iteration):
#!/usr/bin/ksh
for RC_OUTPUT in $(./testEcho.ksh)
do
echo $RC_OUTPUT
done
Here is a more complex solution (max line length = 80):
#!/usr/bin/ksh
LINE_LENGTH_MAX=80
LINE_LENGTH=0
for RC_OUTPUT in $(./testEcho.ksh)
do
CURRENT_LENGTH="$(expr length $RC_OUTPUT)"
(( LINE_LENGTH = LINE_LENGTH + CURRENT_LENGTH + 1 ))
if [[ $LINE_LENGTH -gt $LINE_LENGTH_MAX ]]
then
LINE_LENGTH=$CURRENT_LENGTH
echo $RC_OUTPUT
else
echo -n "$RC_OUTPUT "
fi
done

Bash script - variable content as a command to run

I have a Perl script that gives me a defined list of random numbers that correspond to the lines of a file. Next I want to extract those lines from the file using sed.
#!/bin/bash
count=$(cat last_queries.txt | wc -l)
var=$(perl test.pl test2 $count)
The variable var returns an output like: cat last_queries.txt | sed -n '12p;500p;700p'. The problem is that I can't run this last command. I tried with $var, but the output is not correct (if I run manually the command it works fine, so no problem there). What is the correct way to do this?
P.S: Sure I could do all the work in Perl, but I'm trying to learn this way, because it could help me in other situations.
You just need to do:
#!/bin/bash
count=$(cat last_queries.txt | wc -l)
$(perl test.pl test2 $count)
However, if you want to call your Perl command later, and that's why you want to assign it to a variable, then:
#!/bin/bash
count=$(cat last_queries.txt | wc -l)
var="perl test.pl test2 $count" # You need double quotes to get your $count value substituted.
...stuff...
eval $var
As per Bash's help:
~$ help eval
eval: eval [arg ...]
Execute arguments as a shell command.
Combine ARGs into a single string, use the result as input to the shell,
and execute the resulting commands.
Exit Status:
Returns exit status of command or success if command is null.
You are probably looking for eval $var.
line=$((${RANDOM} % $(wc -l < /etc/passwd)))
sed -n "${line}p" /etc/passwd
just with your file instead.
In this example I used the file /etc/passwd, the special variable ${RANDOM} (about which I learned here), and the sed expression you had; the only difference is that I am using double quotes instead of single quotes to allow variable expansion.
There are two basic ways of executing a command stored in a string in a shell script, whether the string is given as a parameter or not:
COMMAND="ls -lah"
$(echo $COMMAND)
or
COMMAND="ls -lah"
bash -c "$COMMAND" # quote the variable so the whole command string is passed to -c as one argument
In the case where you have multiple variables containing the arguments for a command you're running, and not just a single string, you should not use eval directly, as it will fail in the following case:
function echo_arguments() {
echo "Argument 1: $1"
echo "Argument 2: $2"
echo "Argument 3: $3"
echo "Argument 4: $4"
}
# Note we are passing 3 arguments to `echo_arguments`, not 4
eval echo_arguments arg1 arg2 "Some arg"
Result:
Argument 1: arg1
Argument 2: arg2
Argument 3: Some
Argument 4: arg
Note that even though "Some arg" was passed as a single argument, eval read it as two.
Instead, you can just use the string as the command itself:
# The regular bash eval works by jamming all its arguments into a string then
# evaluating the string. This function treats its arguments as individual
# arguments to be passed to the command being run.
function eval_command() {
"$@";
}
Note the difference between the output of eval and the new eval_command function:
eval_command echo_arguments arg1 arg2 "Some arg"
Result:
Argument 1: arg1
Argument 2: arg2
Argument 3: Some arg
Argument 4:
Better ways to do it
Using a function:
# define it
myls() {
ls -l "/tmp/test/my dir"
}
# run it
myls
Using an array:
# define the array
mycmd=(ls -l "/tmp/test/my dir")
# run the command
"${mycmd[@]}"
cmd="ls -atr ${HOME} | tail -1"
echo "$cmd"
VAR_FIRST_FILE=$( eval "${cmd}" )
or
cmd=("ls -atr ${HOME} | tail -1")
echo "$cmd"
VAR_FIRST_FILE=$( eval "${cmd[@]}" )

Indirect parameter substitution in shell script

I'm having a problem with a shell script (POSIX shell under HP-UX, FWIW). I have a function called print_arg into which I'm passing the name of a parameter as $1. Given the name of the parameter, I then want to print the name and the value of that parameter. However, I keep getting an error. Here's an example of what I'm trying to do:
#!/usr/bin/sh
function print_arg
{
# $1 holds the name of the argument to be shown
arg=$1
# The following line errors off with
# ./test_print.sh[9]: argval=${"$arg"}: The specified substitution is not valid for this command.
argval=${"$arg"}
if [[ $argval != '' ]] ; then
printf "ftp_func: $arg='$argval'\n"
fi
}
COMMAND="XYZ"
print_arg "COMMAND"
I've tried re-writing the offending line every way I can think of. I've consulted the local oracles. I've checked the online "BASH Scripting Guide". And I sharpened up the ol' wavy-bladed knife and scrubbed the altar until it gleamed, but then I discovered that our local supply of virgins has been cut down to, like, nothin'. Drat!
Any advice regarding how to get the value of a parameter whose name is passed into a function as a parameter will be received appreciatively.
You could use eval, though using direct indirection as suggested by SiegeX is probably nicer if you can use bash.
#!/bin/sh
foo=bar
print_arg () {
arg=$1
eval argval=\"\$$arg\"
echo "$argval"
}
print_arg foo
In bash (but not in other sh implementations), indirection is done by: ${!arg}
Input
foo=bar
bar=baz
echo $foo
echo ${!foo}
Output
bar
baz
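Applied to the question's print_arg function, a minimal bash-only sketch (${!arg} is not available in a plain POSIX sh, which is why the eval approach above still matters on HP-UX):
#!/bin/bash
function print_arg
{
arg=$1
argval=${!arg}   # indirect expansion: the value of the variable whose name is in $arg
if [[ $argval != '' ]] ; then
printf "print_arg: %s='%s'\n" "$arg" "$argval"
fi
}
COMMAND="XYZ"
print_arg "COMMAND"   # prints: print_arg: COMMAND='XYZ'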
This worked surprisingly well:
#!/bin/sh
foo=bar
print_arg () {
local line name value
set | \
while read line; do
name=${line%=*} value=${line#*=\'}
if [ "$name" = "$1" ]; then
echo ${value%\'}
fi
done
}
print_arg foo
It has all the POSIX clunkiness; in Bash it would be much shorter, but then again, you won't need it there because you have ${!}. This (in case it proves solid) would have the advantage of using only builtins and no eval. If I were to construct this function using an external command, it would have to be sed, which would obviate the need for the read loop and the substitutions. Mind that asking for indirection in POSIX without eval has to be paid for with clunkiness, so don't beat me!
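For illustration only, a rough sketch of the sed variant hinted at above (not part of the original answer; depending on the shell, set may wrap the value in single quotes, which this sketch does not strip):
#!/bin/sh
foo=bar
print_arg () {
# print the line that set emits for the named variable, minus the name= prefix
set | sed -n "s/^$1=//p"
}
print_arg foo   # prints: bar (or 'bar', depending on how the shell quotes set output)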
Even though the answer's already accepted, here's another method for those who need to preserve newlines and special characters like Escape ( \033 ): Storing the variable in base64.
You need: bc, wc, echo, tail, tr, uuencode, uudecode
Example
#!/bin/sh
#====== Definition =======#
varA="a
b
c"
# uuencode the variable
varB="`echo "$varA" | uuencode -m -`"
# Skip the first line of the uuencode output.
varB="`NUM=\`(echo "$varB"|wc -l|tr -d "\n"; echo -1)|bc \`; echo "$varB" | tail -n $NUM`"
#====== Access =======#
namevar1=varB
namevar2=varA
echo simple eval:
eval "echo \$$namevar2"
echo simple echo:
echo $varB
echo precise echo:
echo "$varB"
echo echo of base64
eval "echo \$$namevar1"
echo echo of base64 - with updated newlines
eval "echo \$$namevar1 | tr ' ' '\n'"
echo echo of un-based, using sh instead of eval (but could be made with eval, too)
export $namevar1
sh -c "(echo 'begin-base64 644 -'; echo \$$namevar1 | tr ' ' '\n' )|uudecode"
Result
simple eval:
a b c
simple echo:
YQpiCmMK ====
precise echo:
YQpiCmMK
====
echo of base64
YQpiCmMK ====
echo of base64 - with updated newlines
YQpiCmMK
====
echo of un-based, using sh instead of eval (but could be made with eval, too)
a
b
c
Alternative
You could also use the set command and parse its output; with that, you don't need to treat the variable in a special way before it is accessed.
A safer solution with eval:
v=1
valid_var_name='[[:alpha:]_][[:alnum:]_]*$'
print_arg() {
local arg=$1
if ! expr "$arg" : "$valid_var_name" >/dev/null; then
echo "$0: invalid variable name ($arg)" >&2
exit 1
fi
local argval
eval argval=\$$arg
echo "$argval"
}
print_arg v
print_arg 'v; echo test'
Inspired by the following answer.
