Connecting Two Bash Commands - bash

I have Ubuntu Linux. I found one command that will let me download unread message subjects from Gmail:
curl -u USERNAME:PASSWORD --silent "https://mail.google.com/mail/feed/atom" | tr -d '\n' | awk -F '<entry>' '{for (i=2; i<=NF; i++) {print $i}}' | sed -n "s/<title>\(.*\)<\/title.*name>\(.*\)<\/name>.*/\2 - \1/p"
...and then another command to let me send mail easily (once I installed the sendemail command via apt-get):
sendEmail -f EMAIL@DOMAIN.COM -v -t PHONE@SMS.COM -u "Gmail Notifier" -m test -s MAILSERVER:PORT -xu EMAIL@DOMAIN.COM -xp PASSWORD
(Note when in production I'll probably swap -v above with -q.)
So, if one command downloads one-line subjects, how can I pipe these into the sendEmail command?
For instance, I tried putting a pipe character between the two and using "$1" after the -m parameter, but when I had no unread emails it would still send me at least one empty message.
If you help me with this, I'll use this information to share on StackOverflow how to build a Gmail Notifier that one can hook up to SMS messages on their phone.

I think if you mix viraptor's and DigitalRoss's answers you get what you want. I created a sample test by making a fake file with the following input:
File contents:
foo
bar
baz
Then I ran this command:
% cat ~/tmp/baz | while read x; do if [[ $x != "" ]]; then echo "x: '$x'"; fi; done
This will only print lines that have content. I'm not familiar with sendEmail; does it need the body on stdin, or can you pass it on the command line?

You do know you can do that directly in gmail by using a filter and your SMS email gateway, right?
But back to the question...
You can get control in a shell script for command output with the following design pattern:
command1 | while read a b c restofline; do
: execute commands here
: command2
done
read puts the first word in a, the second in b, and the rest of the line in restofline. If the loop consists of only a single command, the xargs program will probably do what you want. Read in particular about its -I parameter, which lets you place the substituted argument anywhere in the command.
Sometimes the loop looks like ... | while read x; do, which puts the entire line into x.
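As a concrete illustration of how read splits a line across those variables:

```shell
# read assigns the first word to a, the second to b, the third to c,
# and everything left over to restofline
printf 'one two three four five\n' | while read a b c restofline; do
  echo "a=$a b=$b c=$c rest=$restofline"
done
# prints: a=one b=two c=three rest=four five
```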

Try this structure:
while read line
do
sendemailcommand ... -m "$line" ...
done < <(curlcommand)
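Putting that structure together with the question's two commands, an untested sketch (all the USERNAME/PASSWORD/EMAIL-style placeholders are the question's, not real values); the empty-line guard is what stops the blank SMS when there are no unread messages:

```shell
while read -r line; do
  [ -n "$line" ] || continue    # no subject line -> send nothing
  sendEmail -f EMAIL@DOMAIN.COM -q -t PHONE@SMS.COM \
    -u "Gmail Notifier" -m "$line" \
    -s MAILSERVER:PORT -xu EMAIL@DOMAIN.COM -xp PASSWORD
done < <(curl -u USERNAME:PASSWORD --silent "https://mail.google.com/mail/feed/atom" \
  | tr -d '\n' \
  | awk -F '<entry>' '{for (i=2; i<=NF; i++) print $i}' \
  | sed -n "s/<title>\(.*\)<\/title.*name>\(.*\)<\/name>.*/\2 - \1/p")
```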

I'd look at the xargs command, which provides all the features you need (as far as I can tell).
http://unixhelp.ed.ac.uk/CGI/man-cgi?xargs
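For instance, with GNU xargs, -I places each input line wherever the marker appears, and -r suppresses the command entirely on empty input, which is exactly the no-unread-mail case (echo stands in for the mail command here):

```shell
printf 'first subject\nsecond subject\n' \
  | xargs -r -I '{}' echo "sending: {}"
# prints:
# sending: first subject
# sending: second subject
```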

Maybe something like this:
curl_command > some_file
if [ "$(wc -l < some_file)" -gt 0 ] ; then
email_command < some_file
fi

Related

"WRITE" command works manually but not via script

My co-workers and I use the screen program on our Linux JUMP server to make the most of our screen space. With that, we have multiple screens set up so that messages can go to one while we work in another.
I have a script that verifies network device connectivity and sends messages to my co-workers regardless of whether anything is down or not.
The script first references a file with their usernames in it, then grabs the highest PTS number, which denotes the last screen session they activated, and puts it into the proper format in an external file, like so:
cat ./netops_techs | while read -r line; do
temp=$(echo $line)
temp2=$(who | grep $temp | sed 's/[^0-9]*//g' | sort -n -r | head -n1)
if who | grep $temp; then
echo "$temp pts/$temp2" >> ./tech_send
fi
done
Once that is done, it scans our network every 5 minutes and sends updates to the folks in the file "./tech_send", like so:
Techs=$(cat ./tech_send)
if [ ! -f ./Failed.log ]; then
echo -e "\nNo network devices down at this time."
for d in $Techs
do
cat ./no-down | write $d
done
else
# Writes downed buildings locally to my terminal
echo -e "\nThe following devices are currently down:"
echo ""
echo "IP Hostname Model Building Room Rack Users Affected" > temp_down.log
grep -f <(sed 's/.*/\^&\\>/' Failed.log) Asset-Location >> temp_down.log
cat temp_down.log | column -t > Down.log
cat Down.log
# This will send the downed buildings to the rest of NetOps
for d in $Techs
do
cat Down.log | write $d
done
fi
The issue is that when they are working in their main sectioned screen, the messages pop up in that active screen instead of the inactive one. If I send them a message manually, such as:
write jsmith pts/25
Test Test
and then press CTRL+D, it works fine even if they are in a different session. Via the script, though, it gives an error stating:
write: jsmith is logged in more than once; writing to pts/23
write: jsmith/pts/25 is not logged in
I have verified the "tech_send" file and it has the correct format for them:
jsmith pts/25
Would appreciate any insight on why this is happening.
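One thing worth checking with a sketch (echo stands in for write): for d in $Techs splits on every whitespace, so "jsmith" and "pts/25" arrive as two separate loop items and write only ever sees one of them at a time; reading the file line by line keeps the pair together:

```shell
# Hypothetical demonstration: consume tech_send two fields at a time
# so each write call would get both the username and the tty
printf 'jsmith pts/25\n' | while read -r user tty; do
  echo "would run: write $user $tty"
done
# prints: would run: write jsmith pts/25
```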

Writing a Bash script that takes a text file as input and pipes the text file through several commands

I keep text files with definitions in a folder. I like to convert them to spoken word so I can listen to them. I already do this manually by running a few commands to insert some pre-processing codes into the text files and then convert the text to spoken word like so:
sed 's/\..*$/[[slnc 2000]]/' input.txt inserts a control code after the first period
sed 's/$/[[slnc 2000]]/' input.txt inserts a control code at the end of each line
cat input.txt | say -v Alex -o input.aiff
Instead of having to retype these each time, I would like to create a Bash script that pipes the output of these commands to the final product. I want to call the script with the script name, followed by an input file argument for the text file. I want to preserve the original text file so that if I open it again, none of the control codes are actually inserted, as the only purpose of the control codes is to insert pauses in the audio file.
I've tried writing
#!/bin/bash
FILE=$1
sed 's/$/ [[slnc 2000]]/' FILE -o FILE
But I get hung up immediately as it says sed: -o: No such file or directory. Can anyone help out?
If you just want to use foo.txt to generate foo.aiff with control characters, you can do:
#!/bin/sh
for file; do
test "${file%.txt}" = "${file}" && continue
sed -e 's/\..*$/[[slnc 2000]]/' "$file" |
sed -e 's/$/[[slnc 2000]]/' |
say -v Alex -o "${file%.txt}".aiff
done
Call the script with your .txt files as arguments (eg, ./myscript *.txt) and it will generate the .aiff files. Be warned, if say overwrites files, then this will as well. You don't really need two sed invocations, and the sed that you're calling can be cleaned up, but I don't want to distract from the core issue here, so I'm leaving that as you have it.
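For what it's worth, the two sed passes can be merged into one invocation with two -e expressions; a quick check of what the control codes do to a sample line:

```shell
printf 'A word. More text here\n' \
  | sed -e 's/\..*$/[[slnc 2000]]/' -e 's/$/[[slnc 2000]]/'
# prints: A word[[slnc 2000]][[slnc 2000]]
```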
This will:
a} Make a list of your text files to process in the current directory, with find.
b} Apply your sed commands to each text file in the list, but only for the current use, allowing you to preserve them intact.
c} Call "say" with the edited files.
I don't have say, so I can't test that or the control codes; but as long as you have ed, the loop works. I've used it many times. I learned it through exposure to FORTH, a language that still permits unterminated loops. I used to have trouble remembering to invoke next at the end of the script to start it, but I got over that by defining my words (functions) first, FORTH-style, and then always placing my single-use commands at the end.
#!/bin/sh
next() {
[ -s stack ] && main
end
}
main() {
line=$(ed -s stack < edprint+.txt)
infile=$(cat "${line}" | sed 's/\..*$/[[slnc 2000]]/' | sed 's/$/[[slnc 2000]]/')
say "${infile}" -v Alex -o input.aiff
ed -s stack < edpop+.txt
next
}
end() {
rm -v ./stack
rm -v ./edprint+.txt
rm -v ./edpop+.txt
exit 0
}
find *.txt -type f > stack
cat >> edprint+.txt << EOF
1
q
EOF
cat >> edpop+.txt << EOF
1d
wq
EOF
next
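To see what the two ed scripts do in isolation (file names as in the script above):

```shell
# Build a two-line stack, print its top, pop it, and show what's left
printf 'one.txt\ntwo.txt\n' > stack
printf '1\nq\n'  > edprint+.txt   # address 1 prints the first line, q quits
printf '1d\nwq\n' > edpop+.txt    # 1d deletes the first line, wq saves
ed -s stack < edprint+.txt        # prints: one.txt
ed -s stack < edpop+.txt
cat stack                         # prints: two.txt
rm stack edprint+.txt edpop+.txt
```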

Shell script to fetch a value from a file and save it to other

I have some text in one file which I want to be copied to another file, using shell script.
This is the script -
#!/bin/sh
PROPERTY_FILE=/path/keyValuePairs.properties
function getValue {
FIELD_KEY=$1
FIELD_VALUE=`cat $PROPERTY_FILE | grep "$FIELD_KEY" | cut --complement -d'=' -f1`
}
SERVER_FILE=/path/FileToReplace.yaml
getValue "xyz.abc"
sed -i -e "s|PASSWORD|$FIELD_VALUE|g" $SERVER_FILE
keyValuePairs.properties:
xyz.abc=abs
FileToReplace.yaml:
someField:
  address: "someValue"
  password: PASSWORD
The goal of the script is to fetch "abs" from keyValuePairs.properties and substitute it for PASSWORD in FileToReplace.yaml.
The FileToReplace.yaml should look like
someField:
  address: "someValue"
  password: abs
Note: instead of "abs", the value could itself contain an '='. That should work too.
The current situation is that when I run the script, it updates FileToReplace.yaml as
someField:
  address: "someValue"
  password:
It is setting the value as empty.
Can someone please help me figure what's wrong with this script?
Note: whenever I execute the script, I get this error:
sh scriptToRun.sh
cut: illegal option -- -
usage: cut -b list [-n] [file ...]
cut -c list [file ...]
cut -f list [-s] [-d delim] [file ...]
If I use gcut, the code just works fine, but I can't use gcut (requirement issues). I need to fix this using cut.
There are a few issues with your script:
FIELD_VALUE is local to the getValue() function.
getValue() will match rows containing FIELD_KEY anywhere in the line (e.g. some.property=string.containing.xyz.abc)
getValue() could return multiple rows.
All occurrences of the string "PASSWORD" in the server file will be updated, not just the ones on the "password: PASSWORD" line.
If you can use bash instead of sh, this should resolve all of the issues:
#!/bin/bash
declare property_file=/path/keyValuePairs.properties
declare server_file=/path/FileToReplace.yaml
declare property="xyz.abc"
property_line=$(grep -m 1 "^${property}=" "${property_file}")
sed -i "s|^\(\s*password:\s*\)PASSWORD\s*\$|\1${property_line##*=}|" "${server_file}"
The original code which I posted worked. I was using the wrong file name in the shell (in my real code), which caused it not to read the value and hence set it to empty.
Replace the cut command with:
cut -d'=' -f2-
and it should work on all versions of cut.
-f2- means field 2 and all later. This is necessary to handle values containing '='s.
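A quick sanity check with a value that itself contains '=' characters:

```shell
# -f2- keeps everything after the first delimiter, so '='s in the value survive
echo 'xyz.abc=jdbc:mysql://host?a=1' | cut -d'=' -f2-
# prints: jdbc:mysql://host?a=1
```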
And yes, some characters will cause problems for the sed command. It's hard to get a robust solution without getting into trouble here. A python script may be the better choice.
If shell script is the only option, you could try something like this:
(sed -n -e '1,/PASSWORD/p' FileToReplace.yaml | head -n -1;
echo " password: ${FIELD_VALUE}";
sed -n -e '/PASSWORD/,$ p' FileToReplace.yaml) > FileToReplace.yaml.new \
&& mv FileToReplace.yaml.new FileToReplace.yaml
but it gets quite ugly. (print the file up to the line containing "PASSWORD", then echo the full password line, then print the rest of the file)
You can also use something like this:
cat << EOF > FileToCreate.yaml
someField:
  address: "someValue"
  password: ${FIELD_VALUE}
EOF
if keeping the old contents of the file is not important.

Lining up pipeline results alongside input (here, "ip" and whois grep results)

I need to perform a whois lookup on a file containing IP addresses and output both the country code and the IP address into a new file. In my command so far I find the IP addresses and get a unique copy that doesn't match allowed ranges. Then I run a whois lookup to find out who the foreign addresses are. Finally it pulls the country code out. This works great, but I can't get it to show me the IP alongside the country code, since that isn't included in the whois output.
What would be the best way to include the IP address in the output?
awk '{match($0,/[0-9]+\.[0-9]+\.[0-9]+\.[0-9]+/); ip = substr($0,RSTART,RLENGTH); print ip}' myInputFile \
| sort \
| uniq \
| grep -v '66.33\|66.128\|75.102\|216.106\|66.6' \
| awk -F: '{ print "whois " $1 }' \
| bash \
| grep 'country:' \
>> myOutputFile
I had thought about using tee, but am having trouble lining up the data in a way that makes sense. The output file should have both the IP address and the country code; it doesn't matter if they are in a single or double column.
Here is some sample input:
Dec 27 04:03:30 smtpfive sendmail[14851]: tBRA3HAx014842: to=, delay=00:00:12, xdelay=00:00:01, mailer=esmtp, pri=1681345, relay=redcondor.itctel.com. [75.102.160.236], dsn=4.3.0, stat=Deferred: 451 Recipient limit exceeded for this sender
Dec 27 04:03:30 smtpfive sendmail[14851]: tBRA3HAx014842: to=, delay=00:00:12, xdelay=00:00:01, mailer=esmtp, pri=1681345, relay=redcondor.itctel.com. [75.102.160.236], dsn=4.3.0, stat=Deferred: 451 Recipient limit exceeded for this sender
Thanks.
In general: Iterate over your inputs as shell variables; this then lets you print them alongside each output from the shell.
The below will work with bash 4.0 or newer (requires associative arrays):
#!/bin/bash
# ^^^^- must not be /bin/sh, since this uses bash-only features
# read things that look vaguely like IP addresses into associative array keys
declare -A addrs=( )
while IFS= read -r ip; do
case $ip in 66.33.*|66.128.*|75.102.*|216.106.*|66.6.*) continue;; esac
addrs[$ip]=1
done < <(grep -E -o '[0-9]+[.][0-9]+[.][0-9]+[.][0-9]+')
# getting country code from whois for each, printing after the ip itself
for ip in "${!addrs[@]}"; do
country_line=$(whois "$ip" | grep -i 'country:')
printf '%s\n' "$ip $country_line"
done
An alternate version which will work with older (3.x) releases of bash, using sort -u to generate unique values rather than doing that internal to the shell:
while read -r ip; do
case $ip in 66.33.*|66.128.*|75.102.*|216.106.*|66.6.*) continue;; esac
printf '%s\n' "$ip $(whois "$ip" | grep -i 'country:')"
done < <(grep -E -o '[0-9]+[.][0-9]+[.][0-9]+[.][0-9]+' | sort -u)
It's more efficient to redirect input and output for the script as a whole than to put a >> redirection after the printf itself (which would open the file before each print operation and close it again after, incurring a substantial performance penalty). That is why the suggested invocation for this script looks something like:
countries_for_addresses </path/to/logfile >/path/to/output

Difficulty echoing from a live MQTT feed

I am unable to see what I receive through the MQTT/mosquitto stream by echoing it.
My code is as follows:
#!/bin/bash
`mosquitto_sub -d -t +/# >>mqtt_log.csv`
mqtt_stream_variable=`sed '$!d' mqtt_log.csv`
echo "$mqtt_stream_variable"
The first line subscribes to the MQTT stream and appends the output to the mqtt_log.csv file. Then I run sed '$!d' mqtt_log.csv to assign the last line's value to the mqtt_stream_variable variable, which I later echo.
When I execute this, I don't see anything echoed, and I was curious how I could fix that. When I cat mqtt_log.csv there are things in there, so the mosquitto_sub -d -t +/# >>mqtt_log.csv part is working; it's just the echoing that is problematic.
Ideally, after the sed '$!d' mqtt_log.csv assignment I would like to play around with the values from mqtt_log.csv (as each line is a CSV string), and echoing would let me see what the mqtt_stream_variable variable holds.
The mosquitto_sub command never returns, so the script never gets past the first line to run sed at all (and even if it did, sed would read the file before new messages arrive and then exit).
How about something like this
#!/bin/bash
mosquitto_sub -d -t +/# | tee -a mqtt_log.csv
No need for all the subshells; a pipe will get you what you want.
The only other thing is: why the need for both wildcards in the topic? +/# should be the same as just # (you will probably need to wrap the # in quotes on its own).
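As a side note on sed '$!d': it deletes every line except the last, so it only produces output once its input has ended, which is why it suits a finished log file but not an endless live stream:

```shell
# On finite input, sed '$!d' yields only the final line
printf 'first\nsecond\nlast\n' | sed '$!d'
# prints: last
```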