Failed to get sum result of multiple asynchronous commands in bash

I'm trying to get the result of multiple commands that run asynchronously; so far I have:
#!/usr/bin/env bash
sum=0
for i in `seq 1 10`;
do
sum+=$(calculationCommand) &
done
wait
echo $sum
But it outputs 0 every time. Can someone help me find the mistake and correct it? Thanks!

Here's ShellCheck:
Line 6:
sum+=$(calculationCommand) &
^-- SC2030: Modification of sum is local (to subshell caused by backgrounding &).
Line 10:
echo $sum
^-- SC2031: sum was modified in a subshell. That change might be lost.
You cannot update variables from other processes. Instead, write the results to files, wait for the jobs to complete, and then read the data back from the files.
Here's an example:
#!/bin/bash
# stand-in for the real workload
calculationCommand() {
    sleep 5
    echo 2
}

# run the ten calculations concurrently, one temporary file each
for i in {1..10}
do
    calculationCommand > tmp.$i &
done
wait

# all jobs are done; collect and add up the results
sum=0
for number in $(cat tmp.{1..10})
do
    (( sum += number ))
done
echo "$sum"
Alternatives include using a FIFO (named pipe) instead of 10 files, as sketched below.
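For instance, a minimal sketch of the FIFO variant (same stand-in calculationCommand as above; the read end is opened once and kept open so no writer's output is discarded):
#!/bin/bash
calculationCommand() {
    sleep 5
    echo 2
}

fifo=$(mktemp -u)        # reserve an unused pathname for the fifo
mkfifo "$fifo"

for i in {1..10}
do
    calculationCommand > "$fifo" &
done

exec 3< "$fifo"          # open the read end once, before collecting
sum=0
for i in {1..10}
do
    read -r number <&3   # one line per background job
    (( sum += number ))
done
exec 3<&-
wait
rm -f "$fifo"
echo "$sum"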

Related

Bash: failed to read from a pipe [duplicate]

Bash allows one to use: cat <(echo "$FILECONTENT")
Bash also allows one to use: while read i; do echo $i; done </etc/passwd
To combine the previous two, this can be used: echo $FILECONTENT | while read i; do echo $i; done
The problem with the last one is that it creates a subshell, and after the while loop ends the variable i can no longer be accessed.
My question is:
How can I achieve something like this: while read i; do echo $i; done <(echo "$FILECONTENT")? In other words: how can I be sure that i survives the while loop?
Please note that I am aware of enclosing the while statement in {}, but this does not solve the problem (imagine that you want to use the while loop in a function and return the i variable).
The correct notation for Process Substitution is:
while read i; do echo $i; done < <(echo "$FILECONTENT")
The last value of i assigned in the loop is then available when the loop terminates.
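A quick way to convince yourself that the loop body runs in the current shell (a minimal check; the sample FILECONTENT is made up):
FILECONTENT=$'one\ntwo\nthree'   # made-up sample data
lines=0
while read i; do last=$i; (( lines++ )); done < <(echo "$FILECONTENT")
echo "read $lines lines, last one was: $last"   # prints: read 3 lines, last one was: three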
An alternative is:
echo "$FILECONTENT" |
{
    while read i; do echo $i; done
    ...do other things using $i here...
}
The braces are an I/O grouping operation and do not themselves create a subshell. In this context they are part of a pipeline and are therefore run as a subshell, but that is because of the |, not the { ... }. You mention this in the question. AFAIK, you can do a return from within these braces inside a function.
Bash also provides the shopt builtin and one of its many options is:
lastpipe
If set, and job control is not active, the shell runs the last command of a pipeline not executed in the background in the current shell environment.
Thus, using something like this in a script makes the modified sum available after the loop:
FILECONTENT="12 Name
13 Number
14 Information"
shopt -s lastpipe # Comment this out to see the alternative behaviour
sum=0
echo "$FILECONTENT" |
while read number name; do (( sum += number )); done
echo "$sum"
Doing this at the command line usually falls afoul of the 'job control is not active' condition (that is, at the command line, job control is active). Testing this without using a script failed.
Also, as noted by Gareth Rees in his answer, you can sometimes use a here string:
while read i; do echo $i; done <<< "$FILECONTENT"
This doesn't require shopt; you may be able to save a process using it.
Jonathan Leffler explains how to do what you want using process substitution, but another possibility is to use a here string:
while read i; do echo "$i"; done <<<"$FILECONTENT"
This saves a process.
This function makes $NUM duplicates of the jpg files in the current directory (bash):
makeDups() {
    NUM=$1
    echo "Making $NUM duplicates for $(ls -1 *.jpg | wc -l) files"
    ls -1 *.jpg | sort | while read -r f
    do
        COUNT=0
        while [ "$COUNT" -lt "$NUM" ]   # -lt, so exactly $NUM copies are made
        do
            cp "$f" "${f//sm/${COUNT}sm}"
            (( COUNT++ ))
        done
    done
}
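Called as, say, makeDups 3 in a directory of files named like photo-sm.jpg, this would create photo-0sm.jpg, photo-1sm.jpg and photo-2sm.jpg alongside each original; the counter is spliced in just before the sm marker in each name.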

bash, RANDOM generator with while read [duplicate]

I have a Bash script where I want to count how many things were done when looping through a file. The count seems to work within the loop but after it the variable seems reset.
nKeys=0
cat afile | while read -r line
do
#...do stuff
let nKeys=nKeys+1
# this will print 1,2,..., etc as expected
echo Done entry $nKeys
done
# PROBLEM: this always prints "... 0 keys"
echo Finished writing $destFile, $nKeys keys
The output of the above is something along the lines of:
Done entry 1
Done entry 2
Finished writing /blah, 0 keys
The output I want is:
Done entry 1
Done entry 2
Finished writing /blah, 2 keys
I am not quite sure why nKeys is 0 after the loop :( I assume it's something basic but damned if I can spot it despite looking at http://tldp.org/HOWTO/Bash-Prog-Intro-HOWTO-7.html and other resources.
Fingers crossed someone else can look at it and go "well duh! You have to ..."!
In the just-released Bash 4.2, you can do this to prevent creating a subshell:
shopt -s lastpipe
Also, as you'll probably see at the link Ignacio provided, you have a Useless Use of cat.
while read -r line
do
...
done < afile
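Putting the two fixes together, a sketch of the corrected script ($destFile is carried over from the original):
nKeys=0
while read -r line
do
    #...do stuff
    let nKeys=nKeys+1
    echo Done entry $nKeys
done < afile
# no pipe, so no subshell: this now prints the real count
echo Finished writing $destFile, $nKeys keys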
As mentioned in the accepted answer, this happens because pipes spawn separate subprocesses. To avoid this, command grouping has been the best option for me. That is, doing everything after the pipe in a subshell.
nKeys=0
cat afile |
{
    while read -r line
    do
        #...do stuff
        let nKeys=nKeys+1
        # this will print 1,2,..., etc as expected
        echo Done entry $nKeys
    done
    # this now prints the real count, because the echo runs
    # in the same subshell as the loop
    echo Finished writing $destFile, $nKeys keys
}
Now it will report the value of $nKeys "correctly" (i.e. what you wish). Just remember that everything which needs $nKeys has to stay inside the braces, since the whole group still runs in a subshell.
I arrived at the desired result in the following way, without using pipes or here documents:
#!/bin/bash
# [[ ]] and ${string:0:1} are bashisms, so this needs bash, not plain sh
counter=0
string="apple orange mango egg indian"
str_len=${#string}
while [ $str_len -ne 0 ]
do
    c=${string:0:1}          # first character of the remaining string
    if [[ "$c" = [aeiou] ]]
    then
        echo -n "vowel : "
        echo "- $c"
        counter=$(( counter + 1 ))
    fi
    string=${string:1}       # drop that character and continue
    str_len=${#string}
done
printf "The number of vowels in the given string is: %s\n" "$counter"

Continuous piping in bash program

I'm looking for a way to pipe text continuously into a process like write. I do not want to buffer and pipe it all at once. This is my bash script so far:
#!/bin/bash
for i in `seq 1 10`; do
echo $i | write user
done
The problem is that write gets opened and closed once per iteration. Does anyone know how I can keep it alive while looping?
Sure, just move the pipe outside the loop:
#!/bin/bash
for i in `seq 1 10`; do
    echo "$i"
done | write user
As you've tagged with bash, I would suggest using a brace expansion, for i in {1..10}, rather than calling seq. If the bounds are variable, you can use something like for (( i = a; i < b; ++i )), as sketched below.
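A sketch of both suggested forms (keeping write user from the question; any command that reads stdin would do):
#!/bin/bash
# fixed bounds: brace expansion
for i in {1..10}; do
    echo "$i"
done | write user

# variable bounds: brace expansion cannot expand variables,
# so use the C-style for loop instead
a=1; b=11
for (( i = a; i < b; ++i )); do
    echo "$i"
done | write user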

Distributing lines of commands using bash script

I'm trying to implement trivial parallelization where lines of commands are distributed among separate processes. For that purpose I wrote this script that I named jobsel:
(only "#! /bin/bash" and help message is omitted)
slots=$1
sel=$2

[[ $slots -gt 0 ]] || die_usage
[[ $sel -lt $slots ]] || die_usage

i=0
while read -r line
do
    (( i % slots == sel )) && eval "$line"
    i=$(( i + 1 ))
done
# in case the last line does not end with EOL
if [[ $line != "" ]]; then
    (( i % slots == sel )) && eval "$line"
    i=$(( i + 1 ))
fi
I put eval because I couldn't use redirection or pipe in the commands without it.
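To see why eval is needed, take a hypothetical line from cmds: without eval, the >> and the file name are handed to echo as ordinary arguments:
line='echo 0 >> out'
$line          # prints: 0 >> out  (the redirection stays literal text)
eval "$line"   # appends 0 to the file out, as intended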
When I run this like $HOME/util/jobsel 22 0 < cmds in a console emulator, where cmds is a file that contains lines like echo 0 >> out with increasing numbers, it outputs, as expected, 0, 22, 44... on separate lines. Good so far.
So I put this to work. But when I ran it over a secure shell, I ran it through at with backgrounding (ending each line with &), and then there was a problem. When I entered 8 lines, 21 processes started! ps -AFH printed processes with identical commands and different PIDs. All worker processes were at the same level, directly under init, and my program does not create child processes anyway.
Puzzled, I tried the echo 0 >> out script through at; the output then contained duplicate lines. Still finding it hard to believe, and thinking simultaneous appending might have caused the anomaly, I used other methods to confirm that some lines were run multiple times.
Moreover, there was no such anomaly when everything was run in a terminal, or when I created a separate at job for each worker process.
But how can this happen? Is something wrong with my script? Does at/atd have some bug?
GNU parallel is what you are trying to implement.
Hint: it works great combined with xargs.
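For example (a minimal sketch; the -j 22 mirrors the 22 slots used above):
# run the lines of cmds as commands, at most 22 at a time
parallel -j 22 < cmds

# a rough GNU xargs equivalent, one line per shell invocation
xargs -a cmds -d '\n' -n 1 -P 22 sh -c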
