Truncating File in Bash - bash

I am writing the output of a command into a file from bash. The command gradually produces output, and I am using grep to retrieve specific lines and tee to write them to the file. Right now, the command is writing all the lines into the file. I want the file to be truncated every time the command has some output, so that there is always exactly one line in the file. How can I achieve such an effect?
The command I am using is:
2>&1 zypper -x -n in geany | grep -o --line-buffered "percent=\"[0-9]*\"" | tee /var/log/oneclick.log
This produces output like percent="10" and so on. Each time, only one line should exist in the file.

If you need to overwrite the file for each line:
2>&1 zypper -x -n in geany |
grep -o --line-buffered "percent=\"[0-9]*\"" |
while IFS= read -r line; do
    echo "$line" > /var/log/oneclick.log   # '>' truncates, so the file only ever holds this line
    echo "$line"                           # pass the line through to stdout as well
done
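If you want to sanity-check the overwrite behaviour without running zypper, here is a minimal self-contained sketch; the for loop faking progress output and the /tmp/oneclick.log path are stand-ins:
for pct in 10 20 50 100; do
    echo "percent=\"$pct\""
    sleep 1
done |
while IFS= read -r line; do
    echo "$line" > /tmp/oneclick.log   # truncate-and-write on every line
    echo "$line"
done
cat /tmp/oneclick.log                  # -> percent="100"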

Related

Cat command with pipe working in bash but not in script

Pretty simple script, but I am stuck.
It connects to a battery balancer and spits out the info into a JSON-formatted file. I then pipe the file's contents into jq to obtain the info I need.
It works in the bash shell, but not in the script:
Here is the script:
echo "Checking battery voltages"
jkbms -p 3C:A5:19:7B:28:09 -o json > /home/bms/batt.log
echo cat /home/bms/batt.log | jq -r '.highest_cell_voltage'
echo "done"
The cat line shows this in the script output:
Checking battery voltages
parse error: Invalid numeric literal at line 1, column 4
done
From the shell it works as expected:
cat /home/bms/batt.log | jq -r '.highest_cell_voltage'
4.152044773101807
I have tried enclosing the whole cat command in quotes etc, but I am at a loss.
This, however, works:
echo "Checking battery voltages"
jkbms -p 3C:A5:19:7B:28:09 -o json > /home/bms/batt.log
batt=$(cat /home/bms/batt.log)
echo $batt | jq -r '.highest_cell_voltage'
#echo /usr/bin/cat /home/bms/batt.log
echo "done"
jkbms -p 3C:A5:19:7B:28:09 -o json > /home/bms/batt.log
echo cat /home/bms/batt.log | jq -r '.highest_cell_voltage'
The echo here is wrong: it prints the literal words cat /home/bms/batt.log instead of running cat, so jq receives that text rather than the JSON (hence the "Invalid numeric literal" error). By the way, you can simplify the above to:
jkbms -p 3C:A5:19:7B:28:09 -o json|tee /home/bms/batt.log|jq -r '.highest_cell_voltage'
If I need to print the output of the command on the screen, how do I do it without using echo?
If you want the saved output in /home/bms/batt.log, you can cat /home/bms/batt.log anytime.
If you want to print the output of the command on the screen only at the time of execution, you can tee /dev/tty instead of tee /home/bms/batt.log.
If at the time of execution you want the output on screen as well as in the log file, you can tee /home/bms/batt.log /dev/tty at once.
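A minimal sketch of the three variants, using a hard-coded JSON string and /tmp/batt.log as stand-ins so it can be tried without the BMS attached:
json='{"highest_cell_voltage": 4.152}'
echo "$json" | tee /tmp/batt.log | jq -r '.highest_cell_voltage'            # file + pipeline
echo "$json" | tee /dev/tty | jq -r '.highest_cell_voltage'                 # terminal + pipeline
echo "$json" | tee /tmp/batt.log /dev/tty | jq -r '.highest_cell_voltage'   # file + terminal + pipeline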

Bash output from expect script to two different files

I am trying to output to two different files using tee. My first file will basically be tail -f /myfile and my second output will be a subset of the first file. I have seen online that you can use:
tee >(proc1) >(proc2)
I have tried the above but both my files are blank.
Here is what i have so far:
myscript.sh
ssh root@server 'tail -f /my/dir/text.log' | tee >(/mydir/my.log) >(grep 'string' /mydir/my.log > /mydir/mysecond.log)
myexpect.sh
#!/usr/bin/expect -f
set pass password
spawn /my/dir/myscript.sh
expect {
"key fingerprint" {send "yes/r"; exp_contiue}
"assword: " {send "$pass\r"}
}
interact
In your script, there are some problems in the usage of tee:
tee >(/mydir/my.log): process substitution expects a command, not a file name, so this can simply be tee /mydir/my.log; tee writes to stdout as well as to the named file.
grep 'string' /mydir/my.log > /mydir/mysecond.log: as mentioned, tee also writes to stdout, so there is no need to grep the string out of the file; grep from stdout directly, using a pipeline.
So the whole command can be modified as follows:
ssh root@server 'tail -f /my/dir/text.log | tee /mydir/my.log | grep --line-buffered "string" > /mydir/mysecond.log'
Edit:
For your further question:
The command hangs because tail -f keeps waiting for the file to grow. If you don't want the command to hang, remove the -f from tail.
Depending on whether tail runs with -f, grep needs different handling for its matches to reach the file (see the sketch below):
For the plain tail case: the input ends, grep exits, and its output is flushed to the file normally.
For the tail -f case: the input never ends, so --line-buffered makes grep flush each matching line to the file immediately.
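A minimal sketch of the buffering difference; /tmp/grow.log and /tmp/matches.log are stand-in paths:
: > /tmp/grow.log
tail -f /tmp/grow.log | grep --line-buffered "string" > /tmp/matches.log &
echo "a line with string in it" >> /tmp/grow.log
sleep 1
cat /tmp/matches.log   # the match is already in the file; without --line-buffered
                       # grep would still be holding it in a stdio block buffer
kill %1                # stop the background tail | grep pipeline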

Run commands in text file in bash

How can I run commands from lines inside a text file? One line of my text file looks like this:
echo "RouterPing;`ping -c4 -w4 -q DeviceIP| tail -2 |awk '{print}' ORS=' '`;$(date)" >> somefile.txt &
I have a file with thousands of lines that is being generated by an external program, and I want to execute every line in it. I need each line to be run exactly as if I were running it from the bash shell.
You can just run:
bash file.txt
You can use the script below, but I would strongly advise against executing thousands of commands from a file:
#!/usr/bin/bash
filename="$1"
while IFS= read -r line
do
    # eval re-parses the line, so quotes, pipes and redirections behave
    # exactly as they would when typed at the shell prompt
    eval "$line"
done < "$filename"
How to use
./this_file_name.sh file_with_commands
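A quick illustration of why the loop needs eval: a bare $line undergoes only word splitting, so pipes and redirections are passed to the command as literal arguments.
line='echo hi | tr a-z A-Z'
$line           # prints: hi | tr a-z A-Z
eval "$line"    # prints: HI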
$ bash somefile.txt
Just make sure your file is readable. (The execute bit, e.g. chmod 744 somefile.txt, is only needed if you want to run it directly as ./somefile.txt.)
With dot ('.'), i.e. by sourcing the file (I removed the & because it won't work if the command runs in the background):
echo "RouterPing;`ping -c4 -w4 -q DeviceIP| tail -2 |awk '{print}' ORS=' '`;$(date)" >> somefile.txt
. somefile.txt

Cannot redirect the output of hwclock -r command

I am implementing a shell script and I want to analyse the output of the hwclock -r (--show) command, which displays the RTC time and date.
To do that I tried things like: hwclock -r | grep -v "grep" | grep "error" > /dev/null
to see if an error happened while reading RTC registers.
The problem is that the output always goes to the console. I tried redirecting the output to a file and then analysing its content, and I also tried the tee -a command to direct the output both to the console and to a file, but with no success.
Is there a solution to that or an explanation to what is happening with hwclock -r command.
Thank you in advance.
I just solved it by redirecting the error messages to a file and then doing the analysis on that.
hwclock -r 2> file.txt; grep -v "grep" | grep "error" > /dev/null will do the job.
You omitted file.txt in the first grep.
If you just want to check for "error", and your bash is not too old, this shorter form will also do:
hwclock -r |& grep error >/dev/null
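A minimal sketch of a status-based check; hwclock reports problems on stderr, which is why the plain pipe appeared to ignore them (the "error" pattern is an assumption about the message text):
if hwclock -r 2>&1 | grep -qi "error"; then
    echo "problem reading the RTC" >&2   # grep -q sets the exit status without printing
fi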

Print STDOUT in the middle of 2 Pipes in Solaris (bash)

This command:
./somescript.sh > ../log/scriptlog.log
requires the output of a command to go to stdout, but inside the script I have:
command | mailx -s "Subject" recipient@somedomain.tld
What I would like to do is something like:
command | tee > /dev/stdout | mailx -s "Subject" recipient@somedomain.tld
where the output of the command goes to stdout (to be redirected into the ../log/scriptlog.log file) and also into stdin for the mailx command.
Any way to do that?
tee already sends to stdout.
... | tee -a log/scriptlog.log | ...
Another option is to duplicate the original stdout on a spare file descriptor and let tee write to it:
exec 3>&1
command | tee /dev/fd/3 | mailx ...
or, using process substitution:
command | tee >(mailx ...)
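A runnable sketch of the fd-3 trick with harmless stand-ins (seq for the real command, cat > /dev/null in place of mailx):
exec 3>&1                                 # fd 3 now points at the script's original stdout
seq 3 | tee /dev/fd/3 | cat > /dev/null   # 1 2 3 still appear on the original stdout
exec 3>&-                                 # close the spare descriptor again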
I'll try process substitution. To clarify, I have a cron'd shell script. The cron entry is similar to:
/usr/script/myscript.sh > /usr/log/myscript.log
inside the script is a line similar to:
command | mailx -s "Subject" recipient
Since stdout from 'command' is being piped into the mailx command, it does not appear in the log file 'myscript.log', but I want it to.
I tried capturing it into a variable but the line feeds appear to be lost that way. I could use a temporary file, but I was hoping for something more elegant.
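For what it's worth, the line feeds only get lost when the expansion is unquoted; quoting the variable preserves them. A small sketch (recipient@example.com is a placeholder):
out=$(printf 'line1\nline2\n')
echo $out        # -> line1 line2   (newlines collapsed by word splitting)
echo "$out"      # -> both lines intact
echo "$out" | mailx -s "Subject" recipient@example.com   # hypothetical recipient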
