Need to print current CPU usage and Memory usage in file continuously - bash

I have prepared the script below, but it isn't adding any data to the output file.
My intention is to get the current CPU and memory usage and print them to a log file.
What is wrong with the script? I will run it on a CentOS machine.
#!/usr/bin/bash
HOSTNAME=$(hostname)
mkdir -p /root/scripts
LOGFILE=/root/scripts/xcpuusagehistory.log
touch $LOGFILE
a=0;
b=1;
while [ "$a" -lt "$b" ]
do
CPULOAD=`top -d10 | grep "Cpu(s)"`
echo "$CPULOAD on Host $HOSTNAME" >> $LOGFILE
done

Without -b (batch mode), top expects a terminal and either fails or produces no parsable output when its stdout is a pipe, so nothing ever reaches the log. Run top in batch mode with a single iteration per loop pass instead:
while true
do
cpu_load="$(top -b -n1 | grep "Cpu(s)")"
echo "$cpu_load on Host $HOSTNAME" >> "$LOGFILE"
sleep 1
done
See top batch mode (-b) in the man page.
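The answer above only captures the Cpu(s) line; the question also asks for memory usage. A possible extension (a sketch, not part of the original answer; the use of free(1) and the 10-second interval are assumptions):
#!/usr/bin/bash
HOSTNAME=$(hostname)
LOGFILE=/root/scripts/xcpuusagehistory.log

while true
do
    # One batch iteration of top for the CPU line; free -m for memory totals.
    cpu_load="$(top -b -n1 | grep "Cpu(s)")"
    mem_used="$(free -m | awk '/^Mem:/ {printf "%s/%s MB used", $3, $2}')"
    echo "$(date '+%F %T') ${cpu_load} | ${mem_used} on Host $HOSTNAME" >> "$LOGFILE"
    sleep 10
done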

Related

Redirect stderr/-out of running application to my own bash script

I am running an application called "hd-idle". It is spinning down disks after a specific time of inactivity.
The output looks like this:
user@linux:~$ sudo /usr/sbin/hd-idle -i 10800
symlinkPolicy=0, defaultIdle=10800, defaultCommand=scsi, defaultPowerCondition=0, debug=false, logFile=, devices=
sda spindown
sdd spindown
sde spindown
sda spinup
sdd spinup
sdd spindown
[...]
I want to save this output to a logfile (while the application is running), add timestamps, and change sd[a-z] to the corresponding model/serial of the hard drive.
I wrote a small bash script that does what I want:
user@linux:~$ cat hd_idle_logger.sh
#!/bin/bash
DATUM=$(date '+%Y-%m-%d %H:%M:%S')
INPUT=$(cat)
REGEX='(sd[a-z])\s(spin(down|up))'
[[ $INPUT =~ $REGEX ]]
if [ -n ${BASH_REMATCH[1]} ]
then
MODEL=$(lsblk /dev/${BASH_REMATCH[1]} -n -o MODEL)
SERIAL=$(lsblk /dev/${BASH_REMATCH[1]} -n -o SERIAL)
fi
echo -e "$DATUM\t${MODEL}_$SERIAL (${BASH_REMATCH[1]})\t${BASH_REMATCH[2]}" >> /home/linux/hd_idle_logger.log
I can verify that it works:
user@linux:~$ echo "sdd spindown" |& ./hd_idle_logger.sh
user@linux:~$ cat hd_idle_logger.log
2023-02-12 12:14:54 WDC_WD120EMAZ-10BLFA6_1PAEL2ES (sdd) spindown
But running the application and passing the output to my script doesn't work, the logfile doesn't produce any content and I don't see the output on console anymore:
user@linux:~$ sudo /usr/sbin/hd-idle -i 10800 |& /home/user/hd_idle_logger.sh
So what am I doing wrong?
As long as hd-idle is running, your script will be stuck at INPUT=$(cat). Because $(cat) has to capture ALL output, it can only terminate once hd-idle itself has terminated.
You need a script/program that processes hd-idle's output on the fly; e.g. line by line, while hd-idle is still running. You could do this with a while read loop:
#! /bin/bash
regex='(sd[a-z])\s(spin(down|up))'
while IFS= read -r line; do
[[ $line =~ $regex ]] || continue
model=$(lsblk /dev/"${BASH_REMATCH[1]}" -n -o MODEL)
serial=$(lsblk /dev/"${BASH_REMATCH[1]}" -n -o SERIAL)
printf '%(%Y-%m-%d %H:%M:%S)T\t%s_%s (%s)\t%s\n' \
-1 "$model" "$serial" "${BASH_REMATCH[1]}" "${BASH_REMATCH[2]}"
done >> /home/linux/hd_idle_logger.log
However, it would be more efficient to switch to utilities like sed or awk and to pre-compute the list of serial numbers, or to look up the required information in the /sys/block file system, so that you don't have to execute lsblk for each line.
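As an illustration of that pre-computation idea, here is a minimal sketch (not part of the original answer) that still uses lsblk, but only once per disk at startup:
#!/bin/bash
# Sketch: cache MODEL_SERIAL per device up front so the loop never forks lsblk.
declare -A label
for dev in /sys/block/sd*; do
    [[ -e $dev ]] || continue
    d=${dev##*/}                                  # e.g. sda
    model=$(lsblk "/dev/$d" -d -n -o MODEL)
    serial=$(lsblk "/dev/$d" -d -n -o SERIAL)
    label[$d]="${model}_${serial}"
done

regex='(sd[a-z])\s(spin(down|up))'
while IFS= read -r line; do
    [[ $line =~ $regex ]] || continue
    printf '%(%Y-%m-%d %H:%M:%S)T\t%s (%s)\t%s\n' \
        -1 "${label[${BASH_REMATCH[1]}]}" "${BASH_REMATCH[1]}" "${BASH_REMATCH[2]}"
done >> /home/linux/hd_idle_logger.log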

Bash script? Manipulating & graphing collected data from a csv file

I'm trying to write a script that can detect my presence at home. So far I've written a script that outputs data from hcitool lescan into a csv file in the following format:
TIMESTAMP MAC_ADDRESS_1 MAC_ADDRESS_2 AD_INFINITUM
2018-09-22.11:48:34 FF:FF:FF:FF:FF:FF FF:FF:FF:FF:FF:FF FF:FF:FF:FF:FF:FF
I'm trying to figure out how to write a script to convert the data into a graphable format. Is gnuplot the right tool for this? I guess this would require a bash(?) script that imports the csv file keeping all timestamps, then adds a new column for each unique MAC address and populates the entries with a 1 or 0 depending on whether that MAC address is detected on each line. Are there any built-in commands that can do/help with this, or would I have to script it myself?
The code I used to generate the .csv is below. Sorry, it's probably not the prettiest, as I've only just started with bash scripting.
cd /home/pi/projects/bluetooth_control;
while true
do
echo 'reset hci0';
sudo hciconfig hci0 down;
sudo hciconfig hci0 up;
echo 'timestamp';
echo `date +%Y-%m-%d.%H:%M:%S` &> test1.csv;
echo 'running scan';
(sudo timeout 20 stdbuf -oL hcitool lescan | grep -Eo '(([A-Z]|[0-9]){2}:){5}([A-Z]|[0-9]){2}') &> test.csv;
echo 'removing duplicates to test.csv';
(sort test.csv | uniq) >> test1.csv;
(paste -s test1.csv) >> data.csv;
echo 'sleep for 60s';
sleep 60;
done
I've had time to play around, and in the interest of completing the answer, here is the solution I came up with. I'm not sure how efficient it is to run this in Bash vs. Python, but here goes:
#!/bin/bash
cd /home/pi/projects/bluetooth_control;
while true
do
echo 'reset hci0';
sudo hciconfig hci0 down;
sudo hciconfig hci0 up;
echo 'timestamp';
# Create necessary temp files
echo "temp" &> test1.csv;
echo `date +%Y-%m-%d.%H:%M:%S` &> test2.csv;
echo 'running scan';
# Filter out only MAC addresses
(sudo timeout 20 stdbuf -oL hcitool lescan | grep -Eo '(([A-Z]|[0-9]){2}:){5}([A-Z]|[0-9]){2}') &> /home/pi/projects/bluetooth_control/test.csv;
echo 'removing duplicates to test.csv';
# Append each unique value to test1.csv
(sort test.csv | uniq) >> test1.csv;
# For each line in test1.csv, add text to mac_database if it doesn't exist
while read line
do
grep -q -F $line mac_database || echo $line >> mac_database
done <test1.csv
# For each line in mac_database, run an if loop
while read line
do
# If $line from mac_database exists in test1.csv, then
if grep -Fxq "$line" test1.csv
then
echo '1' >> test2.csv
else
echo '0' >> test2.csv
fi
done <mac_database
# Convert file to csv format, and append to data.csv
(paste -s test2.csv) >> data.csv;
echo 'sleep for 60s';
sleep 60;
done
Hopefully this helps whoever might choose to do this in the future.
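As for the graphing half of the question, a minimal gnuplot sketch (an illustration, not part of the original answer) under the assumption that data.csv keeps the layout produced above, i.e. a timestamp in column 1 followed by one 0/1 column per tracked MAC address:
#!/bin/bash
# Sketch: render the presence data to a PNG. Column numbers and output name
# are assumptions based on the data.csv layout built above.
gnuplot <<'EOF'
set xdata time
set timefmt "%Y-%m-%d.%H:%M:%S"
set format x "%H:%M"
set yrange [-0.1:1.1]
set terminal png size 1200,400
set output "presence.png"
plot "data.csv" using 1:2 with steps title "MAC 1", \
     ""         using 1:3 with steps title "MAC 2"
EOF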

SCP loop stops executing after some time

So I have these two versions of the same script. Both attempt to copy my profile to all the servers on my infra (about 5k). The problem is that no matter which version I use, the process always gets stuck somewhere around 300 servers. It doesn't matter whether I do it sequentially or in parallel; both versions fail, and both at a random server. I don't get any error message (yes, I know I'm redirecting error messages to null for now); it simply stops executing after reaching a random point close to 300 servers and just lingers there doing nothing.
The best run I got made it through about 357 servers.
There is probably some detail I'm not aware of that is causing this. Could someone advise?
Sequential
#!/bin/bash
clear
echo "$(date) - Process started"
all_count="$( cat all_servers.txt | wc -l )"
while read server
do
scp -B -o "StrictHostKeyChecking no" ./.bash_profile rouser@${server}:/home/rosuer/ && echo "$server - Done!" >> ./log.log || echo "$server - Failed!" >> ./log.log
done <<< "$( cat all_servers.txt )"
echo "$(date) - Process completed!!"
Parallel
#!/bin/bash
clear
echo "$(date) - Process started"
all_count="$( cat all_servers.txt | wc -l )"
while read server
do
scp -B -o "StrictHostKeyChecking no" ./.bash_profile rouser@${server}:/home/rosuer/ && echo "$server - Done!" >> ./log.log || echo "$server - Failed!" >> ./log.log &
done <<< "$( cat all_servers.txt )"
wait
echo "$(date) - Process completed!!"
Let's start with better input parsing. Instead of feeding a bash here-string built from a POSIX command substitution into a while read loop, I've got the while read loop reading your server list directly via input redirection (this assumes one server per line in that file; I can fix this if that's not the case). If the contents of all_servers.txt were too long for a command line, you'd experience an error and/or premature termination.
I've also removed extraneous ./ items and I assume that rouser's home directory on each server is in fact /home/rouser (scp defaults to the home directory if given a relative path or no path at all).
Sequential
#!/bin/bash
clear
echo "$(date) - Process started"
all_count="$( cat all_servers.txt | wc -l )"
while read server
do
scp -B -o "StrictHostKeyChecking no" .bash_profile rouser@${server}: \
&& echo "$server - Done!" >> log.log \
|| echo "$server - Failed!" >> log.log
done < all_servers.txt
echo "$(date) - Process completed!!"
Parallel
For the Parallel solution, I've enclosed your conditional in parentheses just in case the trailing & was backgrounding the wrong part of the command.
#!/bin/bash
clear
echo "$(date) - Process started"
all_count="$( cat all_servers.txt | wc -l )"
while read server
do
(
scp -B -o "StrictHostKeyChecking no" .bash_profile rouser@${server}: \
&& echo "$server - Done!" >> log.log \
|| echo "$server - Failed!" >> log.log
) &
done < all_servers.txt
wait
echo "$(date) - Process completed!!"
SSH keys
I highly recommend learning more about SSH. The scp -B flag was unknown to me because I'm used to using SSH keys and ssh-agent, which will make such connectivity seamless (use passwordless keys if you're running this in a cron job).
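As a rough illustration of that key-based setup (the host name, key type, and ConnectTimeout value are assumptions, not from the original answer):
#!/bin/bash
# Sketch: one-time passwordless key setup, then a copy that fails fast on an
# unreachable host instead of hanging the loop.
ssh-keygen -t ed25519 -f ~/.ssh/id_ed25519 -N ''        # no passphrase, usable from cron
ssh-copy-id -i ~/.ssh/id_ed25519.pub rouser@server01    # repeat (or loop) per server

scp -B -o StrictHostKeyChecking=no -o ConnectTimeout=10 \
    .bash_profile rouser@server01: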

Convert bash script to Windows

I wrote the following script for Linux in order to detect drops in my network connection:
#!/bin/bash
echo "### RUNNING ###"
echo "### $(date) ###"
while true;do
now=$(date +"%T")
if [[ "$(ping -c 1 8.8.8.8 | grep '100.0% packet loss' )" != "" ]]; then
echo "!!! KO ($now)" >> "log_connectivity_$(date +"%F")"
else
echo "OK ($now)" >> "log_connectivity_$(date +"%F")"
fi
sleep 5s
done
What it does is, within a loop, ping 8.8.8.8 once; if the packet is lost it logs KO with the time, otherwise it logs OK with the time.
I would like to translate this bash script into a Windows script, but I have no idea. I would be very grateful if you could help me with this.
Thanks in advance ;)

Commands won't write to file when script is executed by cron

I used crontab -e to schedule the execution of a shell script that makes ssh calls to a list of servers, gathers information, and prints it to a file. The output of crontab -l is:
SHELL = /bin/sh
PATH = /usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin
* 1 * * 1,2,3,4,5 /bin/bash /Users/cjones/Documents/Development/Scripts/DailyStatus.sh
The script logs the output of echo "Beginning remote connections..." >> $logfile to the file as expected, but it does not log the output of the following loop:
for servers in $(cat hostnames.txt); do
echo "Starting connection to $servers" >> $logfile
(rsync -av /Users/cjones/Documents/Development/Scripts/checkup.sh cjones@$servers:~/checkup.sh > /dev/null
echo""
ssh -t $servers "sudo ./checkup.sh") >> $logfile
echo ""
done
Pastebin of the full script: http://pastebin.com/3vD7Bba0
Additional note: this script pushes the latest version of a management script, then ssh's into the remote server to execute it and capture the output. This works 100% of the time when run manually. Any assistance would be helpful, thanks!
You need to write SHELL=/bin/sh, and the same for PATH: the spaces around = are wrong.
Also, use full paths for the files your script reads when it is run from crontab:
From
for servers in $(cat hostnames.txt); do
echo "Starting connection to $servers" >> $logfile
(rsync -av /Users/cjones/Documents/Development/Scripts/checkup.sh cjones@$servers:~/checkup.sh > /dev/null
echo""
ssh -t $servers "sudo ./checkup.sh") >> $logfile
echo ""
done
to
while read servers
do
echo "Starting connection to $servers" >> $logfile
(rsync -av /Users/cjones/Documents/Development/Scripts/checkup.sh cjones@$servers:~/checkup.sh > /dev/null
echo""
ssh -t $servers "sudo ./checkup.sh") >> $logfile
echo ""
done < /path/to/hostnames.txt
^^^^^^^^^
Note the usage of while read; do ... done < file instead of the unnecessary for host in $(cat ...).
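Putting both fixes together, the crontab could look like this (a sketch; the trailing redirection to cron.log is an added suggestion for capturing cron-side errors, not part of the original answer):
SHELL=/bin/sh
PATH=/usr/local/sbin:/usr/local/bin:/sbin:/bin:/usr/sbin:/usr/bin
* 1 * * 1,2,3,4,5 /bin/bash /Users/cjones/Documents/Development/Scripts/DailyStatus.sh >> /Users/cjones/Documents/Development/Scripts/cron.log 2>&1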
