I want to use wget in an sh script, but I don't want to save the downloaded URL to a file. How can I do this? The script gets the load average with uptime | awk -F'[a-z]:' '{ print $2}' and I'll pass these values to a PHP script with wget.
If you want to pipe the document instead of downloading it to a file, use the -O option:
wget -O - URL | command
Using - as the filename tells wget to send the document to standard output instead of saving it to a file.
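For the setup in the question, a minimal sketch might look like this (the URL and the load query parameter are hypothetical; they assume the PHP script reads the value from the query string):
#!/bin/sh
# Grab the load averages and hand them to the PHP script via wget (hypothetical URL)
load=$(uptime | awk -F'[a-z]:' '{ print $2 }' | tr -d ' ')
wget -q -O - "http://example.com/load.php?load=$load"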
I am having trouble understanding your question, but I will attempt to rephrase it and suggest a solution.
My guess is that you want to show the load averages of a remote server on a webpage, via PHP. With that assumption, let me show you an easier way to do that. This way requires that you have SSH access to the remote computer, and that your local computer can reach the remote computer with an SSH key.
Basically, you will use ssh to execute a command on the remote machine, then save the output (load averages) locally (somewhere that your web server can access them). Then you will include the load average file in your php script.
First, you need to get the load average of the remote computer and save it locally. To do so, run this command:
ssh [remote username]@[address of remote computer] "uptime" | awk -F'[a-z]:' '{ print $2}' > [path to where you want to save the load average]
Here is an example:
ssh jake@10.0.0.147 "uptime" | awk -F'[a-z]:' '{ print $2}' > /var/www/load_average.txt
Next, you need to set up your PHP script; it will look something like this:
<?php
include "load_average.txt";
?>
You should also set up a cron job to refresh the information regularly so that it stays up to date.
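For example, a crontab entry along these lines (the path and one-minute interval are only illustrative) would keep the file fresh:
* * * * * ssh jake@10.0.0.147 "uptime" | awk -F'[a-z]:' '{ print $2}' > /var/www/load_average.txt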
Related
Currently I am working with a local machine that does not have the finger command installed, and we do not have permission to install it either. However, there is a remote server that has it installed and can be used instead. I am using the finger command to get the first and last name of users. Here is the bash code:
#!/usr/bin/env bash
NAMES=("ssmith" "jnicol" "ahumph" "nkidma" "bbanne")
for name in "${NAMES[@]}"; do
theName=$(ssh -qX 123.45.67.89 finger "$name" | awk 'NR==1{if($7!="???") print $7, $8}')
arr+=("$theName") #Appending name returned from command to global array
done
The above code works, but it is super slow. Is there a simpler way to ssh to the remote server, run the command, and get the first and last names of all users in a single attempt, then append them to an array as shown above? There are hundreds of users in the system, and SSHing to the remote server for every single one of them is not going to be optimal.
Any help would be appreciated.
All, found the answer to this. I could do the following on my local machine and still get each user's first and last name.
# Field 5 of the passwd entry (GECOS) holds the user's full name
FULLNAME=$(getent passwd "$USER" | cut -d : -f 5)
Thanks all.
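If you do still need to go through the remote server, one untested idea is to push the whole list of usernames over a single ssh connection and let the remote side look each one up, roughly like this:
#!/usr/bin/env bash
# Sketch: one ssh call for all users instead of one per user (host and names taken from the question)
NAMES=("ssmith" "jnicol" "ahumph" "nkidma" "bbanne")
mapfile -t arr < <(printf '%s\n' "${NAMES[@]}" |
  ssh -qX 123.45.67.89 'while read -r u; do getent passwd "$u" | cut -d : -f 5; done')
Note that users missing on the remote side simply produce no output line, so arr can end up shorter than NAMES.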
I'm currently using the ad-hoc command ansible ubuntu -a "ls -l /var/run/reboot-required" to get a list of servers that require reboot. However, the end result is a list of all servers, and either the info about the indicated file or an error that the file does not exist.
I'm familiar enough with playbooks to create one that actually does the reboot, but I don't want that. I just want a nice (and relatively neat) list of servers that still require a reboot.
A more generic solution of getting a list of servers that meet some criteria (e.g. have a variable set) would also be quite helpful.
Not easy, because the proper way is to check for the file with the stat module, register the result in a variable, and build the list with when: var.stat.exists.
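A rough sketch of that playbook approach (task layout and the debug message are only illustrative):
- hosts: ubuntu
  tasks:
    - stat:
        path: /var/run/reboot-required
      register: reboot_required
    - debug:
        msg: "{{ inventory_hostname }} requires a reboot"
      when: reboot_required.stat.exists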
If you want to do it in one line and you don't mind using bash scripting, do:
ansible ubuntu -m stat -a "path=/var/run/reboot-required" -o | grep -v '{"exists": false}' | awk -F\| '{ print $1 }'
Hope it helps
I have a process (which I have put into an alias in .bash_profile) to get a log file from my remote ssh server, save it to my local machine and then empty the remote file.
At the moment, my process has these two commands:
scp admin@remote.co.za:public/proj/current/log/exceptions.log "exceptions $(date +"%d %b %Y").log"
to download the file to my local machine, and then
ssh admin@remote.co.za "> /public/proj/current/log/exceptions.log"
to clear the remote file. Doing it this way means I'm logging in via ssh twice. I want this to be as efficient as possible, so I want a way to log in once, do both operations, and then log out.
So if I can find a way to send the file to my local machine from the command-line of the server, I can do this:
ssh admin@remote.co.za "[GET FILE]; > /public/proj/current/log/exceptions.log"
Is there a way to do this? And if not, is there any other way to achieve my goal while logging in only once?
ssh admin@remote.co.za "cat /public/proj/current/log/exceptions.log &&
> /public/proj/current/log/exceptions.log" > "exceptions $(date +"%d %b %Y").log"
This works by catting the entire file to stdout, which flows through as the stdout of ssh, and then truncating the file remotely (assuming cat succeeded).
I have a bash script that automatically connects to a remote location and runs another script stored there. I want my autoconnect script (stored on my local PC) to capture specific output echoed by the remote script so I can store it in a separate log. I know that something will need to be placed in the remote script to redirect the output so the local script can catch it; I'm just not sure where to start. Help is appreciated!
In your local script, on your ssh line, you can redirect some of the output to a file with tee:
ssh ... | tee -a output.log
If you want to filter which lines go to the output.log file, you can use process substitution:
ssh .... | tee >(grep "Some things you want to filter." >> output.log)
Besides grep, you can use other commands as well, like awk.
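On the remote side, one option (just a sketch, using a made-up LOG: prefix and host name) is to tag the lines you want captured so the local filter can pick them out:
# in the remote script: prefix the lines meant for the separate log
echo "LOG: backup finished at $(date)"
echo "ordinary output that should not be captured"
# in the local script: keep only the tagged lines in output.log
ssh user@remote './remote_script.sh' | tee >(grep '^LOG: ' >> output.log)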
I've created a really long script which fully automates the installation and configuration of a web server in my company. While it runs, the script accesses a remote server with scp and ssh to download configuration files, and I'd like to have a secret file holding the password (it's always the same password) so that the script can use it without me entering it manually. Some lines from the script look like this:
/usr/bin/scp root@192.168.1.10:/etc/mail/sendmail.cf /etc/mail/
/usr/bin/scp -r root@192.168.1.10:/etc/yum.repos.d /etc/
/usr/bin/ssh root@192.168.1.10 'rpm -qa --queryformat "%{NAME}\n" >/tmp/sw.lst'
/usr/bin/scp root@192.168.1.10:/tmp/sw.lst /tmp/
/usr/bin/xargs yum -y install < /tmp/sw.lst
I know about the ssh-keygen and ssh-copy-id method, but the problem is that the script will run on a different machine every time, and I don't want to exchange keys before each run.
When I had to do something like that, I used expect and a wrapper script that would fetch the password from a file.
I.e., in my password file I'd have something like:
root@192.168.1.10 ThisIsMyPass
user@localhost thisIsMyOtherPass
and then have the wrapper script look the password up (it could be as simple as grep "root@192.168.1.10" ~/.passwords | cut -d ' ' -f2).
I'm guessing there are more appropriate methods, but with this one you only need to keep your wrapper and password file protected, and you can make your setup script public.
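As a concrete illustration (a sketch only; the wrapper name and argument layout are made up), the wrapper could be a small bash script that looks the password up and feeds it to scp through expect:
#!/usr/bin/env bash
# Hypothetical usage: ./pscp.sh root@192.168.1.10 /etc/mail/sendmail.cf /etc/mail/
target=$1 src=$2 dst=$3
# password file format: "user@host password", one entry per line
pass=$(grep "^$target " ~/.passwords | cut -d ' ' -f2)
expect <<EOF
spawn scp $target:$src $dst
expect "assword:"
send "$pass\r"
expect eof
EOF
The same pattern works for the plain ssh lines; you would just spawn ssh instead of scp.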