Shell Script writing output to a file [duplicate] - shell

This question already has answers here:
Output for loop to a file
(4 answers)
Loop and append/write to the same file without overwriting
(2 answers)
Closed 5 years ago.
I have a shell script that extracts certain values from a text file (given to it via the terminal). The script does the extraction as intended, except it doesn't print the output to the file correctly.
The script is:
#!/bin/bash
input_file=$1
while read -r LINE
do
    IFS="=" read -r -a params <<< "$LINE"
    if [ -n "${params[2]}" ]
    then
        IFS=" " read -r -a param_opcode <<< "${params[2]}"
        echo "${param_opcode}"
    fi
done < "$input_file"
The output on the terminal is as follows:
0xd2800140
0xd2800061
0x8b010000
0x8b000042
0xd1000821
0xd28001e5
0xd28000a6
0x9ac608a5
0xe7ff0010
0xe7ff0010
However, when I try to write this to a text file by doing:
echo "${param_opcode}" > log.txt
It prints only this to the file:
0xe7ff0010
I tried >>, but I don't want to append to the file. I want the file to be overwritten every time I run the script.

Your redirection to the log file isn't right because it's inside the while loop: > truncates the file on every iteration, so only the last line written survives. Put the redirection at the end of the while loop instead:
while read -r LINE; do
    ...
done < "$input_file" > "log.txt"


How to read environment variable from the text file? [duplicate]

This question already has answers here:
Forcing bash to expand variables in a string loaded from a file
(13 answers)
Closed 1 year ago.
Hi all, I'm facing an issue where I can't read an environment variable from a text file.
Here is the content of the text file:
#INCLUDE
$ward/ancd/qwe
.........
.........
And the bash script:
while IFS= read -r line
do
    echo "$line" # It shows $ward/ancd/qwe instead of tchung/folder/main/ancd/qwe
done < "$input"
It should show "tchung/folder/main/ancd/qwe" when echoed, but it outputs $ward/ancd/qwe.
$ward is an environment variable, and it shows the file path correctly when echoed directly in bash. But when the line comes from reading the text file, the environment variable isn't recognized (expanded).
The current solution I can think of is to replace each matched $ENVAR in $line with its value:
repo="\$ward"
if echo "$line" | grep -q "$repo"
then
# echo "match"
line="${line/$repo/$ward}"
#echo "Print the match line : $line"
fi
Is there a more flexible way to have the environment variables recognized while reading the file, without replacing the substrings one by one?
Perhaps you need to evaluate the content of $line within an echo:
while IFS= read -r line
do
    echo $(eval "echo $line")
done < "$input"
Use envsubst, which reads stdin and writes the text with environment variables substituted:
envsubst < "$input"

While loop only operating on first line of file in bash [duplicate]

This question already has answers here:
Shell script while read loop executes only once
(6 answers)
While loop stops reading after the first line in Bash
(5 answers)
Bash script stops execution of ffmpeg in while loop - why?
(3 answers)
Closed 5 years ago.
I have a while loop that should iterate over a text file but stops at the first line and I can't figure out why. My code is below.
while read hadoop_accounts; do
    if ! grep "no lock no remove"; then
        echo "${hadoop_accounts%%:*}"
        echo "${hadoop_accounts%%:*}" >> missing_no_lock_no_remove.txt
    fi
done < hadoop_accounts.txt
When grep is run with no explicit redirection or file to read, it reads stdin. All of stdin. The same stdin your while read loop is reading from.
Thus, with all your stdin consumed by grep, there's nothing left for the next read command to consume.
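You can watch this happen in isolation (a small demonstration, not your script):
# The first read takes "a"; grep then silently consumes "b" and "c",
# so the loop body runs exactly once.
printf 'a\nb\nc\n' | while read -r line; do
    grep -q "no match here"
    echo "processed: $line"
done
# prints only: processed: a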
The easy approach (and much better for performance) is to do the substring check internal to the shell, and not bother starting up a new copy of grep per line processed at all:
while IFS= read -r hadoop_accounts; do
    if ! [[ $hadoop_accounts = *"no lock no remove"* ]]; then
        echo "${hadoop_accounts%%:*}"
        echo "${hadoop_accounts%%:*}" >&3
    fi
done < hadoop_accounts.txt 3>> missing_no_lock_no_remove.txt
Note also that we're only opening the output file once: the 3>> at the end of the loop opens it on file descriptor 3, and each >&3 writes to that already-open descriptor, rather than re-opening the file every single time we want to write a single line.
If you really want to call grep over and over and over with only a single line of input each time, though, you can do that:
while IFS= read -r hadoop_accounts; do
    if ! grep "no lock no remove" <<<"$hadoop_accounts"; then
        echo "${hadoop_accounts%%:*}"
        echo "${hadoop_accounts%%:*}" >&3
    fi
done < hadoop_accounts.txt 3>> missing_no_lock_no_remove.txt
Or, even better than either of the above, you can just run grep a single time over the entire input file and read its output in the loop:
while IFS= read -r hadoop_accounts; do
    echo "${hadoop_accounts%%:*}"
    echo "${hadoop_accounts%%:*}" >&3
done < <(grep -v 'no lock no remove' <hadoop_accounts.txt) 3>> missing_no_lock_no_remove.txt
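And if the loop body really is nothing more than the prefix extraction shown here, the loop can disappear entirely (a sketch under that assumption; cut -d: -f1 keeps everything before the first colon, like ${hadoop_accounts%%:*}, and tee -a both prints and appends):
grep -v 'no lock no remove' hadoop_accounts.txt \
    | cut -d: -f1 \
    | tee -a missing_no_lock_no_remove.txt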

Script: echoing lines from one file to another doesn't print '\t' [duplicate]

This question already has answers here:
Preserving leading white space while reading>>writing a file line by line in bash
(5 answers)
Closed 6 years ago.
I need to create a file by modifying some lines of a source one.
I developed a 'while read line; do' loop. Inside it, the lines I read but don't modify are written with just:
echo -e "$line" >> "xxxx.c"
My issue is that some of those lines start with '\t', and the tabs don't make it into the output file.
Example:
while read line;
do
    if echo "$line" | grep -q 'timeval TIMEOUT = {25,0};'
    then
        echo "$line"
    fi
done
Any help? I've also tried the printf command, but without success.
In that case you could just remove the "-e" argument from the echo command.
From the echo man page:
-e     enable interpretation of backslash escapes
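Note that if the problem is actual leading tabs disappearing (rather than literal \t sequences not being interpreted), the culprit is read itself, which strips leading IFS whitespace, as the linked duplicate explains. A sketch of a loop that preserves the whitespace (the input file name here is made up):
# IFS= stops read from trimming leading/trailing whitespace;
# -r stops it from mangling backslashes.
while IFS= read -r line; do
    printf '%s\n' "$line" >> "xxxx.c"
done < source.c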

Execute script over SSH and get output? [duplicate]

This question already has answers here:
Running a Bash script over ssh
(5 answers)
Closed 6 years ago.
I am trying to execute a command on a remote machine and get the output.
I have tried implementing the shell script below, but I am unable to get the content.
#!/bin/bash
out=$(ssh huser@$source << EOF
while IFS= read -r line
do
    echo 'Data : ' $line
done < "data.txt"
EOF
)
echo $out
Output:
Data : Data : Data :
I can see that "Data : " is printed 3 times because the file "data.txt" has 3 lines of text.
I can't use the scp command to fetch the file directly because I might have to run some other command in place of reading a text file.
Can someone help me in finding the issue?
Thanks in advance.
The problem doesn't have anything to do with ssh at all:
echo $out
is mangling your data. Use quotes!
echo "$out"
Similarly, you need to quote your heredoc:
out=$(ssh huser@$source <<'EOF'
while IFS= read -r line; do
    printf 'Data : %s\n' "$line"
done < "data.txt"
EOF
)
Using <<'EOF' instead of <<EOF prevents $line from being expanded locally, before the code has been sent over SSH; this local expansion was replacing echo 'Data : ' $line with echo 'Data : ', because on your local system the line variable is unset.
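You can see the difference without involving ssh at all (a quick local demonstration):
unset line
# Unquoted delimiter: $line expands locally, to an empty string.
cat <<EOF
Data : $line
EOF
# Quoted delimiter: the text is passed through literally.
cat <<'EOF'
Data : $line
EOF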

Passing file to shell script multiple times [duplicate]

This question already has answers here:
rewinding stdin in a bash script
(4 answers)
Closed 7 years ago.
I have a shell script where I pass a txt file to the script as follows:
./run.sh < list.txt
Within the script, I am doing a "while read LIST; do ... done" loop.
It all works well, and the script executes using the list.
However, now I want to have a second "while read LIST; do ... done" loop in the same shell script. I want it to read again from the original list I pass it on execution, but it doesn't work: it reads the list.txt file for the first loop, but not the second.
What do I do to make the script read list.txt each time I ask for it?
You can't read stdin twice; once the first loop has consumed it, there is nothing left to rewind. Try passing list.txt on the command line rather than redirecting it.
./run.sh list.txt
Then in your script:
while read LINE; do
    ...
done < "$1"

while read LINE; do
    ...
done < "$1"
Alternatively, save the contents of stdin off the first time you read through it. For instance:
# First loop, save stdin in an array.
LINES=()
while read LINE; do
    LINES+=("$LINE")
    ...
done

# Second loop, iterate over the array.
for LINE in "${LINES[@]}"; do
    ...
done
