Executing sensors command (lm-sensors) in Lua - bash

I originally had this Lua script:
function temp_watch()
    warn_value=60
    crit_value=80
    temperature=tonumber(conky_parse("${hwmon 1 temp 1}"))
    if temperature<warn_value then
        settings_table[1]['fg_colour']=normal
    elseif temperature<crit_value then
        settings_table[1]['fg_colour']=warn
    else
        settings_table[1]['fg_colour']=crit
    end
end
but for some reason, hwmon 1 temp 1 is stuck reporting 25C, so I switched to sensors. In conky I am executing it like this:
${exec sensors | grep 'Package id 0' | cut -d ' ' -f 5 | cut -c 2,3,4,5,6,7}
I tried using this solution: https://unix.stackexchange.com/questions/666250/how-to-use-conky-variable-with-external-command. Basically, I replaced the temperature=tonumber... line with
temperature=tonumber(conky_parse("${eval $${exec sensors | grep 'Package id 0' | cut -d ' ' -f 5 | cut -c 2,3,4,5,6,7}}"))
I also tried this: "Is it possible to pipe output from the command line to Lua?". I replaced the temperature=tonumber... line with
local cpu_tmp = io.popen("exec sensors | grep 'Package id 0' | cut -d ' ' -f 5 | cut -c 2,3,4,5,6,7")
temperature=tonumber(cpu_tmp)
Both attempts produced this error:
llua_do_call: function conky_main execution failed: /home/joe/conky/conky-grapes/rings-v2_gen.lua:530: attempt to compare nil with number
Am I missing some variable conversion, or is there some other syntax to execute bash from Lua?
Thanks in advance :-)

lua.org gave me the answer: the Complete I/O Model.
local file = io.popen("sensors -u | awk '/temp1_input:/ {print $2; exit}'")
local temperature = tonumber(file:read('*a'))  -- read everything the command printed and convert it
<SOME CODE HERE>
file:close()  -- close the handle so the script doesn't leak file descriptors
end
It is not elegant, but it seems to work. I will be running the Lua script for quite some time to make sure it doesn't hit the "too many open files" error.
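As a quick sanity check, the pipeline can also be run on its own in a shell before wiring it into Lua; it should print a bare number that tonumber() can parse (the temp1_input label is an assumption that depends on your sensors chip):
# run outside of conky/Lua first; the label to match may differ on other machines
sensors -u | awk '/temp1_input:/ {print $2; exit}'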
Many thanks to @Fravadona for pointing me in the right direction :-)

Related

Display grid of data in bash

I would like to get an opinion on how best to do this in bash, thank you.
For x number of servers, each has its own list of replication agreements and their status. It's easy to run a few commands and get this data, for example:
Get servers; output (a setting/variable in/from a local config file):
. ./ldap-config ; echo "$MASTER $REPLICAS"
dc1-server1 dc1-server2 dc2-server1 dc2-server2 dc3...
For dc1-server1, get agreements; output:
ipa-replica-manage -p $(cat ~/.dspw) list -v $SERVER.$DOMAIN | grep ': replica' | sed 's/: replica//'
dc2-server1
dc3-server1
dc4-server1
For dc1-server1, get agreement status codes; output:
ipa-replica-manage -p $(cat ~/.dspw) list -v $SERVER.$DOMAIN | grep 'status: Error (' | sed -e 's/.*status: Error (//' -e 's/).*//'
0
0
18
So the output would be several columns based on the 'get servers' list, with each 'replica: status' pair listed under its server.
I'm looking to achieve something like:
dc2-server1: 0 dc2-server2: 0 dc1-server1: 0 ...
dc3-server1: 0 dc3-server2: 18 dc3-server1: 13 ...
dc4-server1: 18 dc4-server2: 0 dc4-server1: 0 ...
Generally eval is considered evil. Nevertheless, I'm going to use it.
paste is handy for printing files side-by-side.
Bash process substitutions can be used where you'd use a filename.
So, I'm going to dynamically build up a paste command and then eval it.
I'm going to use get.sh as a placeholder for your mystery commands.
cmd="paste"
while read -ra servers; do
for server in "${servers[#]}"; do
cmd+=" <(./get.sh \"$server\" agreements | sed 's/\$/:/')"
cmd+=" <(./get.sh \"$server\" status)"
done
done < <(./get.sh servers)
eval "$cmd" | column -t

Replace value of a specific column from a file

I have a file whose size is approx. 1 GB, and it has data in the format below.
A|CD|44123|0|0
B|CD|44124|0|0
C|CD|44125|0|0
D|CD|44126|0|0
E|CD|44127|0|0
F|CD|44128|0|0
J|CD|44129|0|0
I|CD|44130|0|0
In this file I have to replace the third-column value with a value I get after a conversion, for which I have to open the file, read it, and replace the value. This process is taking around 5 hours. Below is the code I am using:
cat $FILE_NAME |\
while read REC
do
    DATE=`echo "$REC" | cut -d\| -f3`
    DATE_NEW=`$UTIL $DATE | head -1 | cut -d" " -f12`
    RECORD="$DATE_NEW,"
    echo "$RECORD" >> $New_File
done
Is there a way to make this better and faster?
The desired output will look like this, where the DATE_NEW value replaces each third column. DATE_NEW is the converted value I get from this:
DATE_NEW=`$UTIL $DATE | head -1 |cut -d" " -f12`
A|CD|10/20/2020|0|0
B|CD|10/25/2020|0|0
C|CD|10/25/2020|0|0
D|CD|10/25/2020|0|0
E|CD|11/15/2020|0|0
F|CD|11/14/2020|0|0
J|CD|11/16/2020|0|0
I|CD|11/17/2020|0|0
After the comment from @Sundeep ("Why is using a shell loop to process text considered bad practice?") I wrote the logic in Perl, and processing time dropped from 5-7 hours to 99 seconds.
Give this a try:
awk -v cmd="Cmd2GetNEWDATE" 'BEGIN{FS=OFS="|"}{cmd|getline v;close(cmd)}$3=v' file
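If the converter needs the old date passed as an argument (as $UTIL does in the question), a variant of the same getline idea would look roughly like this; dateconv.sh is just a stand-in for your conversion utility:
awk 'BEGIN{FS=OFS="|"} {
    cmd = "./dateconv.sh " $3   # build the per-line conversion command
    cmd | getline v             # read the first line it prints
    close(cmd)                  # close it each time to avoid running out of file handles
    $3 = v                      # overwrite the third column
    print
}' file > newfile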

Get directory name with grep and remove it

Please, is there any simple way to get only the NAME output from lines where DATE < 5 days ago, and then call another command, rm, on those lines with NAME as the argument?
I have the following output from the mega-ls path/ -l (mega.nz) command:
FLAGS VERS SIZE DATE NAME
d--- - - 06Feb2020 05:00:01 bk_20200206050000
d--- - - 07Feb2020 05:00:01 bk_20200207050000
d--- - - 08Feb2020 05:00:01 bk_20200208050000
d--- - - 09Feb2020 05:00:01 bk_20200209050000
d--- - - 10Feb2020 05:00:01 bk_20200210050000
d--- - - 11Feb2020 05:00:01 bk_20200211050000
I tried grep, sort and other approaches, e.g. mega-ls path/ -l | head -n 5, but I don't know how to filter these lines based on the date.
Thank you a lot.
I tried to find a simple way for your request ;)
mega-ls path/ -l | head -n 5 | tr -s ' ' | cut -d ' ' -f6 | grep -v -e '^$' | grep '^bk_20200206.*' | xargs rm -f
Part 1: This is your command (it returns the folder list plus extra data).
mega-ls path/ -l | head -n 5
Part 2: Squeeze the repeated spaces out of the Part 1 result.
tr -s ' '
Part 3: Use cut to split the Part 2 result on spaces and return the folder NAME column.
cut -d ' ' -f6
Part 4: Remove the empty lines left over from the header line in the Part 3 result.
grep -v -e '^$'
Part 5: This is your requested search for folder names by date in yyyymmdd format, e.g. 20200206 (replace 20200206 with the date you actually need).
grep '^bk_20200206.*'
Part 6: (Very important!!) Only add this part if you really do want to delete the matching folders.
xargs rm -f
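To make the date filter dynamic instead of hard-coding 20200206, one possible variant (a sketch, assuming GNU date and the bk_YYYYMMDDhhmmss naming shown above) is:
# build the cutoff dynamically: anything older than 5 days ago
cutoff=$(date -d '5 days ago' +%Y%m%d)
mega-ls path/ -l | tr -s ' ' | cut -d ' ' -f6 | grep '^bk_' |
while read -r name; do
    # characters 4-11 of "bk_20200206050000" are the YYYYMMDD part
    [ "${name:3:8}" -lt "$cutoff" ] && echo "$name"
done
Once the printed names look right, pipe the loop's output into the removal step from Part 6.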
Best Regards

How to properly use the grep command to grab and store integers?

I am currently building a bash script for class, and I am trying to use the grep command to grab the values from a simple calculator program and store them in the variables I assign, but I keep receiving a syntax error message when I try to run the script. Any advice on how to fix it? My script looks like this:
#!/bin/bash
addanwser=$(grep -o "num1 + num2" Lab9 -a 5 2)
echo "addanwser"
subanwser=$(grep -o "num1 - num2" Lab9 -s 10 15)
echo "subanwser"
multianwser=$(grep -o "num1 * num2" Lab9 -m 3 10)
echo "multianwser"
divanwser=$(grep -o "num1 / num2" Lab9 -d 100 4)
echo "divanwser"
modanwser=$(grep -o "num1 % num2" Lab9 -r 300 7)
echo "modawser"`
You want to grep the output of a command.
grep searches either a file or standard input. So you can use any of these equivalent forms:
grep X file # 1. from a file
... things ... | grep X # 2. from stdin
grep X <<< "content" # 3. using here-strings
For this case, you want to use the last one, so that you execute the program and its output feeds grep directly:
grep <something> <<< "$(Lab9 -s 10 15)"
Which is the same as saying:
Lab9 -s 10 15 | grep <something>
So grep will act on the output of your program. Since I don't know how Lab9 works, let's use a simple example with seq, which prints the numbers from 5 to 15:
$ grep 5 <<< "$(seq 5 15)"
5
15
grep is usually used for finding matching lines of a text file. To actually grab a part of the matched line other tools such as awk are used.
Assuming the output looks like "num1 + num2 = 54" (i.e. fields are separated by space), this should do your job:
addanwser=$(Lab9 -a 5 2 | awk '{print $NF}')
echo "$addanwser"
Make sure you don't miss the '$' sign before addanwser when echo'ing it.
$NF selects the last field. You may select the nth field using $n.
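Putting the second approach together for the whole script might look like this (a sketch; it assumes Lab9 prints each result as the last space-separated field of its output, as in the "num1 + num2 = 54" example above):
#!/bin/bash
# capture the last field of each Lab9 result with awk's $NF
addanwser=$(Lab9 -a 5 2 | awk '{print $NF}')
subanwser=$(Lab9 -s 10 15 | awk '{print $NF}')
multianwser=$(Lab9 -m 3 10 | awk '{print $NF}')
divanwser=$(Lab9 -d 100 4 | awk '{print $NF}')
modanwser=$(Lab9 -r 300 7 | awk '{print $NF}')
echo "$addanwser $subanwser $multianwser $divanwser $modanwser"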

How do I parse output from ntpq?

I am fairly new to bash scripting... I have an issue with a cronjob where I get too many emails when the "ntpq: read: Connection refused" error comes up. I want to create a conditional so that when this error shows up, the email is NOT sent.
However, I can't seem to parse the output from "ntpq -nc peers". I did try redirecting the output of the cronjob to a test.txt file and then creating another cronjob that parses that file, but I feel like there is a better solution.
Thanks for your help!
Here is my code for the cronjob
#!/bin/bash
limit=10101010101010101010000 # Set your limit in milliseconds here
offsets=$(/usr/sbin/ntpq -nc peers | /usr/bin/tail -n +3 | awk 'BEGIN { FS = " " } ; { print $9 }' | /usr/bin/tr -d '-')
for offset in ${offsets}; do
    if echo $offset $limit | awk '{exit $1>$2?0:1}'
    then
        echo "NTPD offset $offset > $limit. Please investigate"
        exit 1
    fi
done
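One way to keep cron quiet when the daemon is unreachable (a sketch, assuming the "Connection refused" text appears in ntpq's combined output as quoted above) is to capture that output once and bail out silently before parsing:
#!/bin/bash
limit=10101010101010101010000 # Set your limit in milliseconds here
out=$(/usr/sbin/ntpq -nc peers 2>&1)
# exit quietly when ntpd cannot be reached, so cron has nothing to mail
if grep -q 'Connection refused' <<< "$out"; then
    exit 0
fi
offsets=$(/usr/bin/tail -n +3 <<< "$out" | awk '{ print $9 }' | /usr/bin/tr -d '-')
for offset in ${offsets}; do
    if echo $offset $limit | awk '{exit $1>$2?0:1}'
    then
        echo "NTPD offset $offset > $limit. Please investigate"
        exit 1
    fi
done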
