I have a Bash script where I want to count how many things were done when looping through a file. The count seems to work within the loop but after it the variable seems reset.
nKeys=0
cat afile | while read -r line
do
#...do stuff
let nKeys=nKeys+1
# this will print 1,2,..., etc as expected
echo Done entry $nKeys
done
# PROBLEM: this always prints "... 0 keys"
echo Finished writing $destFile, $nKeys keys
The output of the above is something along the lines of:
Done entry 1
Done entry 2
Finished writing /blah, 0 keys
The output I want is:
Done entry 1
Done entry 2
Finished writing /blah, 2 keys
I am not quite sure why nKeys is 0 after the loop :( I assume it's something basic but damned if I can spot it despite looking at http://tldp.org/HOWTO/Bash-Prog-Intro-HOWTO-7.html and other resources.
Fingers crossed someone else can look at it and go "well duh! You have to ..."!
In the just-released Bash 4.2, you can do this to prevent creating a subshell:
shopt -s lastpipe
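Applied to your loop, that would look something like this sketch (lastpipe only takes effect when job control is off, which is the default in a non-interactive script; the names are the ones from your snippet):
#!/bin/bash
shopt -s lastpipe
nKeys=0
cat afile | while read -r line
do
#...do stuff
let nKeys=nKeys+1
echo Done entry $nKeys
done
# nKeys keeps its value, because the loop ran in the current shell
echo Finished writing $destFile, $nKeys keys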
Also, as you'll probably see at the link Ignacio provided, you have a Useless Use of cat.
while read -r line
do
...
done < afile
As mentioned in the accepted answer, this happens because the pipe spawns a separate subprocess. To avoid this, command grouping has worked best for me: put everything after the pipe in one group, so the loop and the code that needs the variable run in the same subshell.
nKeys=0
cat afile |
{
while read -r line
do
#...do stuff
let nKeys=nKeys+1
# this will print 1,2,..., etc as expected
echo Done entry $nKeys
done
# this now prints the expected count (we are still inside the same subshell)
echo Finished writing $destFile, $nKeys keys
}
Now it will report the value of $nKeys "correctly" (i.e. what you wish).
I arrived at the desired result in the following way, without using pipes or here documents:
#!/bin/bash
counter=0
string="apple orange mango egg indian"
str_len=${#string}
while [ $str_len -ne 0 ]
do
c=${string:0:1}
if [[ "$c" = [aeiou] ]]
then
echo -n "vowel : "
echo "- $c"
counter=$(( $counter + 1 ))
fi
string=${string:1}
str_len=${#string}
done
printf "The number of vowels in the given string are : %s "$counter
echo
Related
How do I pass a list to for in bash?
I tried
echo "some
different
lines
" | for i ; do
echo do something with $i;
done
but that doesn't work. I also tried to find an explanation with man, but there is no man page for for.
EDIT:
I know I could use while instead, but I think I once saw a solution with for where they didn't give it a list but could still use the variable inside the loop.
for iterates over a list of words, like this:
for i in word1 word2 word3; do echo "$i"; done
use a while read loop to iterate over lines:
echo "some
different
lines" | while read -r line; do echo "$line"; done
Here is some useful reading on reading lines in bash.
This might work but I don't recommend it:
echo "some
different
lines
" | for i in $(cat) ; do
...
done
$(cat) will expand everything on stdin but if one of the lines of the echo contains spaces, for will think that's two words. So it might eventually break.
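For example, a quick way to see the splitting:
echo "two words
next" | for i in $(cat); do echo "[$i]"; done
prints [two], [words] and [next]: the first line was split into two iterations.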
If you want to process a list of words in a loop, this is better:
a=($(echo "some
different
lines
"))
for i in "${a[#]}"; do
...
done
Explanation: a=(...) declares an array. $(cmd...) expands to the output of the command. The result is still split on whitespace; if you need each line kept intact, see the readarray sketch below.
"${a[#]}" expands to a correctly quoted list of elements in the array.
Note: for is a built-in command, so there is no man page for it. Use help for (in bash) instead.
This seems to work:
for x in $(echo a b c); do
echo $x
done
This is not a pipe, but quite similar:
args="some
different
lines";
set -- $args;
for i ; do
echo $i;
done
because for defaults to the positional parameters ("$@") when no in list is given.
maybe you can shorten this a bit somehow?
There are 2 pieces of code here, and the value in $1 is the name of a file which contains 3 lines of text.
Now, I have a problem. In the first piece of the code, I can't get the "right" value out of the loop, but in the second piece of the code, I can get the right result. I don't know why.
How can I make the first piece of the code get the right result?
#!/bin/bash
count=0
cat "$1" | while read line
do
count=$[ $count + 1 ]
done
echo "$count line(s) in all."
#-----------------------------------------
count2=0
for var in a b c
do
count2=$[ $count2 + 1 ]
done
echo "$count2 line(s) in all."
This happens because of the pipe before the while loop. It creates a sub-shell, and thus the changes in the variables are not passed to the main script. To overcome this, use process substitution instead:
while read -r line
do
# do some stuff
done < <(some command)
In Bash 4.2 or later, you can also set the lastpipe option, and the last command in the pipeline will run in the current shell rather than a subshell (this only takes effect when job control is off, which is the default in scripts).
shopt -s lastpipe
some command | while read -r line; do
# do some stuff
done
In this case, since you are just using the contents of the file, you can use input redirection:
while read -r line
do
# do some stuff
done < "$file"
I have a number of bash scripts, each doing its own thing merrily. Do note that while I program in other languages, I only use Bash to automate things, and am not very good at it.
I'm now trying to combine a number of them to create "meta" scripts, if you will, which use other scripts as steps. The problem is that I need to parse the output of each step to be able to pass a part of it as params to the next step.
An example:
stepA.sh
[...does stuff here...]
echo "Task complete successfuly"
echo "Files available at: $d1/$1"
echo "Logs available at: $d2/$1"
Both of the above are paths, such as /var/www/thisisatest and /var/log/thisisatest (note that file paths always start with /var/www and log paths always start with /var/log). I'm only interested in the files path.
stepB.sh
[...does stuff here...]
echo "Creation of $d1 complete."
echo "Access with username $usr and password $pass"
All variables here are simple strings that may contain special characters (but no spaces).
What I'm trying to build is a script that runs stepA.sh, then stepB.sh and uses the output of each to do its own stuff. What I'm currently doing (both above scripts are symlinked to /usr/local/bin without the .sh part and made executable):
#!/bin/bash
stepA $1 | while read -r line; do
# Create the container, and grab the file location
# then pass it to the next pipe
if [[ "$line" == *:* ]]
then
POS=`expr index "$line" "/"`
PTH="/${line:$POS}"
if [[ "$PTH" == *www* ]]
then
#OK, have what I need here, now what?
echo $PTH;
fi
fi
done
# Somehow get $PTH here
stepB $1 | while read -r line; do
...
done
#somehow have the required strings here
I'm stuck on passing PTH to the next step. I understand this is because piping runs it in a subshell, however all examples I've seen refer to files and not commands, and I could not make this work. I tried piping the echo to a "next step" such as
stepA | while ...
echo $PTH
done | while ...
#Got my var here, but cannot run stuff
done
How can I run stepA and have the PTH variable available for later?
Is there a "better way" to extract the path I need from the output than nested ifs ?
Thanks in advance!
Since you're using bash explicitly (in the shebang line), you can use its process substitution feature instead of a pipe:
while read -r line; do
if [[ "$line" == *:* ]]
.....
fi
done < <(stepA $1)
Alternately, you could capture the command's output to a string variable, and then parse that:
output="$(stepA $1)"
tmp="${output#*$'\nFiles available at: '}" # output with everything before the filepath trimmed
filepath="${tmp%%$'\n'*}" # trim the first newline and everything after it from $tmp
tmp="${output#*$'\nLogs available at: '}"
logpath="${tmp%%$'\n'*}"
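You can pull the username and password out of stepB's output with the same trimming trick; a sketch, assuming the exact "username ... and password ..." wording your stepB.sh prints:
outB="$(stepB $1)"
tmp="${outB#*"username "}"     # drop everything up to and including "username "
usr="${tmp%% *}"               # keep the word before the next space
tmp="${outB#*"password "}"
pass="${tmp%%$'\n'*}"          # keep everything up to the next newline
echo "files: $filepath  user: $usr  pass: $pass"
Since no pipe is involved, $filepath, $usr and $pass all stay available for the rest of the script.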
I need to know if it is possible to mark a bash script line number and then restart that script at the saved line number.
Code:
#!/bin/bash
while read -r line; do #I'm reading from a big wordlist
command1 using $line
command2 using $line
done
Specifically, is there a way to write the current $line number of the script automatically into a separate text file in order for the script to start from the line number specified, so that I won't have to start everything from scratch in case I have to stop the script?
Does it make sense?
Thank you very much !
This may help:
#!/bin/bash
TMP_FILE="/tmp/currentLineNumber" # a constant
current_line_count=0 # track the current line number
processed_lines_count=0
# Verify if we have already processed some stuff.
if [ -r "${TMP_FILE}" ]; then
processed_lines_count=$(cat ${TMP_FILE})
fi
while read -r line; do # I'm reading from a big wordlist
# Skip processing till we reach the line that needs to be processed.
if [ $current_line_count -lt $processed_lines_count ]; then
# do nothing as this line has already been processed
current_line_count=$((current_line_count+1)) # increment the counter
continue
fi
current_line_count=$((current_line_count+1))
echo $current_line_count > ${TMP_FILE} # cache the line number
# perform your operations
command1 using $line
command2 using $line
done
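Assuming the wordlist is fed on standard input, as in your original loop, usage would look something like this (the script and wordlist names are just placeholders):
./process.sh < wordlist.txt    # interrupt it whenever you need to
./process.sh < wordlist.txt    # run it again: already-processed lines are skipped
rm /tmp/currentLineNumber      # remove the marker to start over from line 1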
This should work:
#!/bin/bash
I=$(cat lastline 2>/dev/null || echo 0);
A=0;
while read -r line; do
if [ "$A" -ge "$I" ]; then
command1 using $line
command2 using $line
(( I++ ))
echo "$I" > "lastline";
fi;
(( A++ ))
done
Remember you will have to delete lastline if you want to restart. :-)
The bash-only solutions are nice, but you may get better performance by using other tools to streamline your restart. Like the script in your question, the following takes the wordlist on stdin.
#!/bin/sh
# Get the current position, or 0 if we haven't run before
if [ -f /tmp/processed ]; then
read processed < /tmp/processed
else
processed=0
fi
# Skip up to the current position
awk -v processed="$processed" 'NR > processed' | while read -r line; do
# Run your commands
command1 using $line
command2 using $line
# Record our new position
processed=$((processed + 1))
echo $processed > /tmp/processed
done
Oh, and the way I wrote this, it's Bourne shell compatible, so it doesn't require bash.
I'm a C/C++ programmer and quite stupid in general (or at least the way bash does things makes me feel confused). I can't wrap my head around process substitution.
I need to define a global boolean, set it somewhere in a loop, and make use of it in global scope. Could someone please explain in the simplest way possible how to adapt the code below to achieve my use case, simple enough so that I don't have to contort my brain again tomorrow trying to grasp process substitution.
# DEFINE HERE
for i in `seq 0 ${DAEMON_COUNT}`;
do
if [ ! -d "data$i" ]; then
# SET HERE
echo "data$i does not exist. Creating...";
mkdir data$i
fi
done
# TEST AND USE HERE
To be honest, I don't think bash is up to the task... the next block looks like this.
echo "-------------------------------------------------------------------------------"
echo "checking the state of potentially running daemons"
for i in `seq 0 ${DAEMON_COUNT}`;
do
if [ ! -e "data$i/mongod.lock" ] ; then
echo "[no lock file] mongod process $i does not exist"
else
echo "[lock file exists] process $i lock file exists "
I_PID=`cat data$i/mongod.lock`
if [ ! ${I_PID} ]; then
echo " [GOOD] lock pid empty"
elif [ "`ps -p ${I_PID} | grep ${I_PID}`" ]; then
echo " [GOOD] data1 pid: ${I_PID} running"
else
echo "[PROBABLY FATAL] data1 pid: ${I_PID} not running."
fi
fi
done
echo "-------------------------------------------------------------------------------"
What I now need is a global array of structs so that I can loop over them and take conditional action to initialize my daemons correctly :/.
I might just use libc and do this stuff in Lua; the only reason I hold back is having to install rocks, since I don't like ad-hoc code repositories vomiting whatever they want onto my machine :D
The important thing to understand is this: a child process is born with its own environment and cannot affect the variables of its parent. If you set a variable in a child process, the value of the variable in the parent is not affected. These are actually two different variables which just happen to have the same name.
The second thing to understand is when bash runs a command as a child process. There are two cases relevant to the question:
Each command in a pipeline (connected by |) runs as a child of the current shell.
Running a single builtin command with a redirection (e.g. <) will not spawn a child process.
Here is a simple session which demonstrates these ideas:
$ somevar=initial
$ echo test1 | read somevar
$ echo $somevar
initial
$ read somevar < <(echo test2)
$ echo $somevar
test2
The first read is a child process and therefore somevar in the main shell does not change. The second read is executed by the main shell itself and hence somevar is updated.
This means that your code will work as you expect unless you add a pipe in front of or after the for loop, i.e. this works as you want it to:
# DEFINE HERE
SOMEVAR=0
DAEMON_COUNT=10
for i in `seq 0 ${DAEMON_COUNT}`;
do
if [ ! -d "data$i" ]; then
# SET HERE
SOMEVAR=10
echo "data$i does not exist. Creating...";
mkdir data$i
fi
done
# TEST AND USE HERE
echo ${SOMEVAR} # This displays 10
I might have misunderstood, but...
bool=false;
for i in `seq 0 ${DAEMON_COUNT}`;
do
if [ ! -d "data$i" ]; then
bool=true;
echo "data$i does not exist. Creating...";
mkdir data$i
fi
done
if [ $bool = true ]; then
...
fi
Is this what you want?