Shell script to read and write to a file

I have a file which contains file locations, and I am running a command (shnfix) on each of them to fix the files.
But I am not able to write the output to a file.
#!/bin/ksh
while read line
do
`shnfix "$line"` >> output.txt
done < filename.txt
The output generated after adding set -x at the beginning of the script:
+ < filename.txt
+ read line
+ shnfix /x01/naveen_wav/file1.wav
+ >> output.txt
Fixing [/x01/naveen_wav/file1.wav] (3:49.42) --> [file1-fixed.wav] : 100% OK
Padded last file with 1194 zero-bytes.
+ read line
+ shnfix /x01/naveen_wav/file2.wav
+ >> output.txt
Fixing [/x01/naveen_wav/file2.wav] (4:30.35) --> [file2-fixed.wav] : 100% OK
Padded last file with 644 zero-bytes.
+ read line

A more efficient version (I/O-wise) of #gile's code:
#!/bin/ksh
filename="/path/to/filename.txt"
while IFS= read -r line
do
shnfix "$line"
done < "$filename" > output.txt

The redirection should be inside the backticks:
`shnfix $line >> output.txt`
So the script could be like this:
#!/bin/ksh
filename="/path/to/filename.txt"
while IFS= read -r line
do
# display the line or do something with $line
echo "$line"
`shnfix "$line" >> output.txt`
done < "$filename"

Just remove the backticks; at the very least they are confusing.
Written as it is, the shell will execute the command in a subshell and then try to execute its output as a new command, applying the redirection to that.
I assume you don't want to execute the output; you want to redirect the output.
If the output happens to start with a word that is a valid Unix command which itself produces no output, you see neither an error nor any output.
I get test.ksh[4]: line1: not found [No such file or directory], where 'line1' is the first line in my test file.
Or keep the loop in a block and redirect all output from it. That makes the intention clearer and it is easier to add commands.
#!/bin/ksh
filename="/path/to/filename.txt"
{
while IFS= read -r line
do
shnfix "$line"
done < "$filename"
} > output.txt
http://www.shellcheck.net (a great troubleshooting aid) will give similar hints.
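To see what the backticks were actually doing in the original script, here is a minimal sketch using echo as a harmless stand-in for shnfix: the command substitution runs first, and the shell then tries to execute its output as a new command, applying the redirection to that second command.

```shell
out=$(mktemp)

# The substitution runs `echo "echo hello"` first; its output,
# "echo hello", is then executed as a command, and it is *that*
# command's stdout the redirection captures.
`echo "echo hello"` >> "$out"
content=$(cat "$out")          # content is now "hello"

# When the output is plain text (like shnfix's progress lines),
# the shell tries to run its first word as a command and fails:
if `echo "Fixing file1.wav"` 2>/dev/null; then status=ran; else status=failed; fi
printf '%s / %s\n' "$content" "$status"
rm -f "$out"
```

This is exactly why the trace shows shnfix's output on the terminal while output.txt stays empty: the redirection only ever applied to the (failing) second command.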

Related

simple 'printf' in bash gets screwed up output

I'm a newbie here. I've really tried to google it but failed.
I've got a simple script that reads a file line by line and then prints each line. "b.txt" has "02083192846" as its first line.
#!/bin/bash
while IFS= read -r line; do
printf 'downloading %s .html\n' $line
done < "$1"
however, the output is screwed up:
User#User-pk ~/test
$ ./test.sh b.txt
.htmlading 02083192846
Once this is fixed, my next question is how to use this line as part of the file name that some command will store into, i.e. the filename should be "02083192846.html". I've tried using it as ${line}.html, but it doesn't work: for example, grep "foo" -f ${line}.html fails. Yet curl http://foo -o $line.html does work!
What I would do :
#!/bin/bash
while IFS= read -r line; do
printf 'downloading %s .html\n' "${line//$'\r'/}"
#                               ^ remove \r with bash parameter expansion
done < "$1"
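The garbled output is the classic symptom of a file with Windows CRLF line endings: each line that read returns ends in a carriage return, which sends the cursor back to column 0 when printed. A small sketch (the phone number is taken from the question; no downloading involved) showing the stray \r and the expansion that strips it:

```shell
line=$'02083192846\r'            # what read -r yields from a CRLF-terminated line

# The \r is invisible in normal output; od makes it visible:
printf '%s' "$line" | od -c | head -n 1

clean="${line//$'\r'/}"          # delete every \r via parameter expansion
printf 'downloading %s.html\n' "$clean"
```

This also explains why grep "foo" -f ${line}.html failed while curl "worked": the \r ended up embedded in the filename.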

Reading a file line by line from variable

I'm working on a script and it isn't clear to me how read -r line knows which variable to get the data from. I want to read line by line from the FILE variable.
Here is the script I'm working on:
#!/bin/bash
cd "/"
FILE="$(< /home/FileSystemCorruptionTest/*.chk)"
while read -r line
do
echo "$line" > /home/FileSystemCorruptionTest/`date +%Y_%m_%d_%H_%M_%S`_1.log
done
echo "" > /home/FileSystemCorruptionTest/Done
Since it looks like you want to combine multiple files, I guess that I would regard this as a legitimate usage of cat:
cat /home/FileSystemCorruptionTest/*.chk | while read -r line
do
echo "$line"
done > /home/FileSystemCorruptionTest/`date +%Y_%m_%d_%H_%M_%S`_1.log
Note that I moved the redirect out of the loop, to prevent overwriting the file once per line.
Also note that your example could easily be written as:
cat /home/FileSystemCorruptionTest/*.chk > /home/FileSystemCorruptionTest/`date +%Y_%m_%d_%H_%M_%S`_1.log
If you only actually have one file (and want to store it inside a variable), then you can use <<< after the loop:
while read -r line
do
echo "$line"
done <<<"$FILE" > /home/FileSystemCorruptionTest/`date +%Y_%m_%d_%H_%M_%S`_1.log
<<< "$FILE" has the same effect as using echo "$FILE" | before the loop but it doesn't create any subshells.
What you are requesting:
echo "${FILE}" | while read -r line …
But I think Tom's solution is better.
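The subshell difference is easy to demonstrate with a counter (a minimal bash sketch, assuming the default behavior where each pipeline segment runs in a subshell): the piped loop cannot modify variables in the parent shell, while the <<< form can.

```shell
FILE=$'line1\nline2\nline3'

count=0
echo "$FILE" | while read -r line; do
  count=$((count + 1))           # increments a copy inside the pipe's subshell
done
after_pipe=$count                # still 0

count=0
while read -r line; do
  count=$((count + 1))           # increments in the current shell
done <<< "$FILE"
after_herestring=$count          # 3
```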

access to two or more file via read command on bash

Please look here to see how to read from a file line by line. I used the first answer and it works:
First answer:
The following (save as rr.sh) reads a file from the command line:
#!/bin/bash
while read line
do
name=$line
echo "Text read from file - $name"
done < $1
Run the script as follows:
chmod +x rr.sh
./rr.sh filename.txt
My question is: if I want to read another file line by line, what should I do? For example, in this code:
The following (save as rr.sh) reads a file from the command line:
#!/bin/bash
#this read refers to filename.txt
while read line
do
name=$line
echo "Text read from file - $name"
done < $1
.
.
.
#this read refers to file2.txt
while read line
do
name=$line
echo "Text read from file - $name"
done < $1
Run the script as follows:
How can I write this?
Your use of $1 implies the name of the file to read from is passed as the first argument to your script. To read from two different files, pass both names as arguments:
rr.sh file1.txt file2.txt
and use $2 for the input redirection to the second loop.
#!/bin/bash
while read line
do
echo "Text read from file - $line"
done < "$1"
while read line
do
echo "Text read from file - $line"
done < "$2"
You can refer to as many command line arguments as you like. After the ninth argument ($9), you must use braces to enclose the number (so for the tenth, use ${10}).
What you need to do is add a 'for' loop:
#!/bin/bash
for file; do
while read line; do
echo "Text read from $file - $line"
done < "$file"
done
then run it as: ./rr.sh file1 file2 file3 otherfile*
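If instead the goal were to read the two files in lockstep (line 1 of each, then line 2, and so on), each file can be opened on its own descriptor. A sketch with throwaway temp files (the names and contents here are made up for the demo):

```shell
f1=$(mktemp); f2=$(mktemp)
printf 'a1\na2\n' > "$f1"
printf 'b1\nb2\n' > "$f2"

# fd 3 reads the first file, fd 4 the second; the loop stops
# as soon as either file runs out of lines
paired=$(
  while IFS= read -r left <&3 && IFS= read -r right <&4; do
    printf '%s|%s\n' "$left" "$right"
  done 3< "$f1" 4< "$f2"
)
echo "$paired"
rm -f "$f1" "$f2"
```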

Bash: while terminate early when reading from file

I am trying to run a file that contains a sequence of commands/scripts to run with arguments, like:
ls /etc/
cat /etc/hosts/
script.sh some parameters
...
This seems to work fine, but in some cases the while loop ends prematurely. That happens only when the script being executed contains SSH/SCP at the end. The code to read the file:
while IFS= read -r line
do
# Cut command and parameters
IFS=', ' read -a parameters <<< "$line"
cmd="${parameters[0]}"
unset parameters[0]
runScriptAndCheckError "$cmd" "${parameters[@]}"
done < "$SCRIPT_FILENAME"
When using set -x:
+ checkError 0 'ERROR: script.sh failed'
+ '[' 0 -ne 0 ']'
+ IFS=
+ read -r line
It looks like there is no more input, although there are still lines in the file. If I comment out runScriptAndCheckError "$cmd" "${parameters[@]}", then it does print many more lines.
I am not sure what is wrong with this code. I'd be grateful if someone could help.
If runScriptAndCheckError also reads from standard input, it will read lines from $SCRIPT_FILENAME. Have the read command in the while loop use a different file descriptor:
while IFS= read -r line <&3; do
...
done 3< "$SCRIPT_FILENAME"
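The effect is easy to reproduce without ssh at all. In this sketch head -n 1 plays the role of the stdin-draining command (this assumes GNU head, which seeks back to just past the line it consumed on a regular file):

```shell
cmds=$(mktemp)
printf 'one\ntwo\nthree\n' > "$cmds"

# broken: head shares the loop's stdin and swallows the next line
broken=0
while IFS= read -r line; do
  head -n 1 > /dev/null
  broken=$((broken + 1))
done < "$cmds"                   # broken ends up 2, not 3

# fixed: read uses fd 3, so the inner command cannot touch the file
fixed=0
while IFS= read -r line <&3; do
  head -n 1 > /dev/null          # reads the loop's stdin (/dev/null here)
  fixed=$((fixed + 1))
done 3< "$cmds" < /dev/null      # fixed ends up 3
rm -f "$cmds"
```

The same reasoning applies to ssh and scp, which read stdin by default (ssh additionally has the -n option for exactly this reason).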
Stealing Input
The key, as #chepner states, is that the statements inside:
while read line; do
<command>
done
will interfere with the loop if they attempt to read from stdin.
If you don't want your script reading from stdin, prevent it from doing so as follows:
while read line; do
cmd < /dev/null
done
Now your read loop will not be missing any input.

Read file line by line and perform action for each in bash

I have a text file; it contains a single word on each line.
I need a loop in bash to read each line, then perform a command each time it reads a line, using the input from that line as part of the command.
I am just not sure of the proper syntax to do this in bash. If anyone can help, it would be great. I need to use the line obtained from the text file as a parameter to call another function. The loop should stop when there are no more lines in the text file.
Pseudocode:
Read testfile.txt.
For each in testfile.txt
{
some_function linefromtestfile
}
How about:
while read line
do
echo "$line"
# or: some_function "$line"
done < testfile.txt
As an alternative, using a file descriptor (#4 in this case):
file='testfile.txt'
exec 4<"$file"
while read -r -u4 t ; do
echo "$t"
done
Don't use cat! In a loop, cat is almost always wrong:
cat testfile.txt | while read -r line
do
# do something with "$line" here
done
and people might start to throw a UUoC (Useless Use of Cat) award at you.
while read line
do
nikto -Tuning x 1 6 -h $line -Format html -o NiktoSubdomainScans.html
done < testfile.txt
I tried this to automate a nikto scan of a list of domains after changing from the cat approach. It still just read the first line and ignored everything else. (nikto is reading from stdin here and consuming the rest of the file, the same stdin-stealing problem described above; giving it input from /dev/null fixes it.)