How to get Primary MX IP from domain in Bash

I am trying to write a bash script that, given a domain name, finds its MX records, figures out which one is the primary (they are not always in order), and then finds that host's IP.
(When there is more than one primary MX, the first one found is fine.)
For example:
./findmxip.sh gmail.com
would give me 173.194.71.26. To do this I need to run host gmail.com, then find the primary MX in the results and run host on it to get its IP.

To get exactly 0 or 1 answers:
dig +short gmail.com mx | sort -n | nawk '{print $2; exit}' | dig +short -f -
You'll need a non-ancient dig that supports +short.
As noted, there may be more than one "primary" MX, since the preferences need not be unique. If you want all the IP addresses of all of the lowest-preference records, then:
dig +short oracle.com mx | sort -n |
nawk -v pref=65536 '($1<=pref) {pref=$1; print $2}' |
dig +short -f - | uniq
This does not handle the case where there is no MX record and the A record accepts email, an uncommon but perfectly valid configuration.
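If you need to cover that case as well, a rough sketch (untested; it simply reuses the same dig/nawk idiom as above) is to fall back to the domain's A record when no MX comes back:
DOMAIN=gmail.com
MX=$(dig +short "$DOMAIN" mx | sort -n | nawk '{print $2; exit}')
if [ -n "$MX" ]; then
    dig +short "$MX"
else
    # no MX record: fall back to the domain's own A record
    dig +short "$DOMAIN"
fi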
Sadly all the dig versions I've tested return 0 whether or not the domain exists (NXDOMAIN), and whether or not any MX records exist. You can catch a DNS time-out (rc=9), though. The related host command does return a non-zero rc on NXDOMAIN, but its behaviour is a little inconsistent: it's messy to script and the output is harder to parse.
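For example, a minimal sketch of catching that time-out case while still ignoring NXDOMAIN:
dig +short "$DOMAIN" mx > /dev/null
if [ $? -eq 9 ]; then
    # rc=9 is the "no reply from server" / time-out case mentioned above
    echo "DNS lookup timed out" >&2
    exit 1
fi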
A poor man's error-checking version (inspired by tripleee's comment) that might be slightly more robust depending on your host command is:
DOMAIN=gmail.com
if ! host -t any $DOMAIN >/dev/null 2>&1 ; then
    echo "no such domain"
elif ! host -t mx $DOMAIN >/dev/null 2>&1; then
    echo "no MX records"
else
    dig +short $DOMAIN mx | sort -n | nawk '{print $2; exit}' | dig +short -f -
fi
(Perversely, you may require an older version of host (bind-8.x) for the -t mx test to work; newer versions just return 0 instead.)
This is just about the point people start backing away nervously asking why you're not using perl/python/$MFTL.
If you really need to write a robust version in bash, check out the djbdns CLI tools and debugging tools which are rather easier to parse (though sadly don't set user exit codes either).
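For reference, a rough djbdns-based equivalent of the first one-liner (assuming the dnsmx and dnsip client tools are installed; untested here) might look like:
dnsmx gmail.com | sort -n | nawk '{print $2; exit}' | xargs dnsip
dnsmx prints "preference name" pairs, so the same sort/nawk trick picks the lowest preference, and dnsip then prints the address(es) for that name.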

Related

Check IP address off a list using ksh/bash

I have a list (text file) with the following data:
app1 example1.google.com
app2 example2.google.com
dev1 device1.google.com
cell1 iphone1.google.com
I want to check the ip address of the URLs/hostnames and update the text file with the gathered ip. Example:
app1 example1.google.com 192.168.1.10
app2 example2.google.com 192.168.1.55
dev1 device1.google.com 192.168.1.53
cell1 iphone1.google.com 192.168.1.199
You can use dig to get the IP (but the domains must exist). Not tested for IPv6.
#! /bin/bash
while read name url ; do
    ip=$(dig -4 $url | grep '^[^;]' | grep -o '\([0-9]*[.:]\)\+[0-9.:]*$')
    printf '%s %s %s\n' "$name" "$url" "$ip"
done < data.txt
If there are only two columns in the file, this might help:
awk '{"dig +short " $2 | getline ip ; print $1, $2, ip}' file
For each line we spawn a subshell (not really a good idea for zillions of records) and run "dig +short" in it (the shortest way I could think of to get only the IP address) with the FQDN found in the second column. Then we print the original columns plus the new one with the IP address. Output can be redirected to a new file with a single >; I wouldn't consider it safe to edit the original file in place.
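If you do run it over many records, one refinement (just a sketch) is to close() the command after each getline so awk doesn't accumulate open pipes, and to reset ip so a failed lookup doesn't silently reuse the previous line's address:
awk '{cmd = "dig +short " $2; ip = ""; cmd | getline ip; close(cmd); print $1, $2, ip}' file > file.resolved
Here file.resolved is just an example name for the new output file.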

Lining up pipeline results alongside input (here, "ip" and whois grep results)

I need to perform a whois lookup on a file containing IP addresses and output both the country code and the IP address into a new file. In my command so far I find the IP addresses and get a unique copy that doesn't match allowed ranges. Then I run a whois lookup to find out who the foreign addresses are. Finally it pulls the country code out. This works great, but I can't get it to show me the IP alongside the country code since that isn't included in the whois output.
What would be the best way to include the IP address in the output?
awk '{match($0,/[0-9]+\.[0-9]+\.[0-9]+\.[0-9]+/); ip = substr($0,RSTART,RLENGTH); print ip}' myInputFile \
| sort \
| uniq \
| grep -v '66.33\|66.128\|75.102\|216.106\|66.6' \
| awk -F: '{ print "whois " $1 }' \
| bash \
| grep 'country:' \
>> myOutputFile
I had thought about using tee, but am having trouble lining up the data in a way that makes sense. The output file should have both the IP address and the country code. It doesn't matter if they are a single or double column.
Here is some sample input:
Dec 27 04:03:30 smtpfive sendmail[14851]: tBRA3HAx014842: to=, delay=00:00:12, xdelay=00:00:01, mailer=esmtp, pri=1681345, relay=redcondor.itctel.com. [75.102.160.236], dsn=4.3.0, stat=Deferred: 451 Recipient limit exceeded for this sender
Dec 27 04:03:30 smtpfive sendmail[14851]: tBRA3HAx014842: to=, delay=00:00:12, xdelay=00:00:01, mailer=esmtp, pri=1681345, relay=redcondor.itctel.com. [75.102.160.236], dsn=4.3.0, stat=Deferred: 451 Recipient limit exceeded for this sender
Thanks.
In general: Iterate over your inputs as shell variables; this then lets you print them alongside each output from the shell.
The below will work with bash 4.0 or newer (requires associative arrays):
#!/bin/bash
# ^^^^- must not be /bin/sh, since this uses bash-only features
# read things that look vaguely like IP addresses into associative array keys
declare -A addrs=( )
while IFS= read -r ip; do
    case $ip in 66.33.*|66.128.*|75.102.*|216.106.*|66.6.*) continue;; esac
    addrs[$ip]=1
done < <(grep -E -o '[0-9]+[.][0-9]+[.][0-9]+[.][0-9]+')
# getting country code from whois for each, printing after the ip itself
for ip in "${!addrs[@]}"; do
    country_line=$(whois "$ip" | grep -i 'country:')
    printf '%s\n' "$ip $country_line"
done
An alternate version which will work with older (3.x) releases of bash, using sort -u to generate unique values rather than doing that internal to the shell:
while read -r ip; do
    case $ip in 66.33.*|66.128.*|75.102.*|216.106.*|66.6.*) continue;; esac
    printf '%s\n' "$ip $(whois "$ip" | grep -i 'country:')"
done < <(grep -E -o '[0-9]+[.][0-9]+[.][0-9]+[.][0-9]+' | sort -u)
It's more efficient to perform input and output redirection for the script as a whole than to put a >> redirection after the printf itself (which would open the file before each print operation and close it again after, incurring a substantial performance penalty), which is why the suggested invocation for this script looks something like:
countries_for_addresses </path/to/logfile >/path/to/output

Portable way to resolve host name to IP address

I need to resolve a host name to an IP address in a shell script. The code must work at least in Cygwin, Ubuntu and OpenWrt(busybox).
It can be assumed that each host will have only one IP address.
Example:
input
google.com
output
216.58.209.46
EDIT:
nslookup may seem like a good solution, but its output is quite unpredictable and difficult to filter. Here is the result of the command on my computer (Cygwin):
>nslookup google.com
Unauthorized answer:
Serwer: UnKnown
Address: fdc9:d7b9:6c62::1
Name: google.com
Addresses: 2a00:1450:401b:800::200e
216.58.209.78
I've no experience with OpenWRT or Busybox but the following one-liner should work with a base installation of Cygwin or Ubuntu:
ipaddress=$(LC_ALL=C nslookup $host 2>/dev/null | sed -nr '/Name/,+1s|Address(es)?: *||p')
The above works with both the Ubuntu and Windows versions of nslookup. However, it works most predictably when the DNS server replies with a single IP (v4 or v6) address; if more than one address is returned, only the first one is used.
Explanation
LC_ALL=C nslookup sets the LC_ALL environment variable when running the nslookup command so that the command ignores the current system locale and prints its output in the command's default language (English).
The 2>/dev/null avoids having warnings from the Windows version of nslookup about non-authoritative servers being printed.
The sed command looks for the line containing Name and then prints the following line after stripping the phrase Addresses: when there's more than one IP (v4 or 6) address -- or Address: when only one address is returned by the name server.
The -n option means lines aren't printed unless there's a p command, while the -r option means extended regular expressions are used (GNU sed is the default for Cygwin and Ubuntu).
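If you specifically want an IPv4 address even when the server returns an IPv6 one first (as in the Cygwin output above), a hedged variation is to take everything from the Name line onward and keep the first dotted quad; this is only a sketch and carries the same locale caveat:
ipaddress=$(LC_ALL=C nslookup $host 2>/dev/null | sed -n '/Name/,$p' | grep -E -o '([0-9]{1,3}\.){3}[0-9]{1,3}' | head -n 1)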
If you want something available out-of-the-box on almost any modern UNIX, use Python:
pylookup() {
    python -c 'import socket, sys; print(socket.gethostbyname(sys.argv[1]))' "$@" 2>/dev/null
}
address=$(pylookup google.com)
With respect to special-purpose tools, dig is far easier to work with than nslookup, and its short mode emits only literal answers -- in this case, IP addresses. To take only the first address, if more than one is found:
# this is a bash-specific idiom
read -r address < <(dig +short google.com | grep -E '^[0-9.]+$')
If you need to work with POSIX sh, or broken versions of bash (such as Git Bash, built with mingw, where process substitution doesn't work), then you might instead use:
address=$(dig +short google.com | grep -E '^[0-9.]+$' | head -n 1)
dig is available for Cygwin in the bind-utils package; as bind is the most widely used DNS server on UNIX, bind-utils (built from the same codebase) is available for almost all Unix-family operating systems as well.
Here's my variation that steals from earlier answers:
nslookup blueboard 2> /dev/null | awk '/Address/{a=$3}END{print a}'
This depends on nslookup returning matching lines that look like:
Address 1: 192.168.1.100 blueboard
...and only returns the last address.
Caveats: this doesn't handle non-matching hostnames at all.
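If you need at least a rough guard for that, one option (assuming your nslookup returns a non-zero exit status on NXDOMAIN, which not every implementation does) is:
if out=$(nslookup blueboard 2>/dev/null); then
    printf '%s\n' "$out" | awk '/Address/{a=$3}END{print a}'
else
    echo "lookup failed for blueboard" >&2
fi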
TL;DR: Option 2 is my preferred choice for an IPv4 address. Adjust the regex to get IPv6 and/or the awk to get both. A slight edit to option 2 is given in the EDIT below.
Well, a terribly late answer, but I think I'll share my solution, especially because the accepted answer didn't work for me on OpenWRT (no Python in the minimal setup) and the other answer errors out with "no address found after comma".
Option 1 (gives the last address from last entry sent by nameserver):
nslookup example.com 2>/dev/null | tail -2 | tail -1 | awk '{print $3}'
Pretty simple and straightforward; the piped commands don't really need an explanation.
In my tests this always gave an IPv4 address (because the IPv4 entry was always the last line, at least in my tests). However, I've read about the unexpected ordering behaviour of nslookup, so I had to find a way to make sure I get IPv4 even if the order is reversed - thanks, regex.
Option 2 (makes sure you get IPv4):
nslookup example.com 2>/dev/null | sed 's/[^0-9. ]//g' | tail -n 1 | awk -F " " '{print $2}'
Explanation:
nslookup example.com 2>/dev/null - look up given host and ignore STDERR (2>/dev/null)
sed 's/[^0-9. ]//g' - strips everything except digits, dots and spaces (sed's s command), leaving only the IPv4 parts
tail -n 1 - get last 1 line (alt, tail -1)
awk -F " " '{print $2} - Captures and prints the second part of line using " " as a field separator
EDIT: A slight modification based on a comment to make it actually more generalized:
nslookup example.com 2>/dev/null | printf "%s" "$(sed 's/[^0-9. ]//g')" | tail -n 1 | printf "%s" "$(awk -F " " '{print $1}')"
In the above edit, I'm using printf command substitution to take care of any unwanted trailing newlines.
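Since $( ) command substitution already strips trailing newlines on its own, a simpler way to get the same effect (just a sketch of the idea) is to wrap the whole pipeline once and print the result without a newline:
ip=$(nslookup example.com 2>/dev/null | sed 's/[^0-9. ]//g' | tail -n 1 | awk -F " " '{print $2}')
printf '%s' "$ip"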

bash script to perform dig -x

Good day. I was reading another post regarding resolving hostnames to IPs and only using the first IP in the list.
I want to do the opposite and used the following script:
#!/bin/bash
IPLIST="/Users/mymac/Desktop/list2.txt"
for IP in 'cat $IPLIST'; do
domain=$(dig -x $IP +short | head -1)
echo -e "$domain" >> results.csv
done < domainlist.txt
I would like to give the script a list of 1000+ IP addresses collected from a firewall log, and resolve the list of destination IP's to domains. I only want one entry in the response file since I will be adding this to the CSV I exported from the firewall as another "column" in Excel. I could even use multiple responses as semi-colon separated on one line (or /,|,\,* etc). The list2.txt is a standard ascii file. I have tried EOF in Mac, Linux, Windows.
216.58.219.78
206.190.36.45
173.252.120.6
What I am getting now:
The domainlist.txt is getting an exact duplicate of list2.txt while results.csv has nothing. No errors come up on the screen when I run the script either.
I am running Mac OS X with Macports.
Your script has a number of syntax and stylistic errors. The minimal fix is to change the quotes around the cat:
for IP in `cat $IPLIST`; do
Single quotes produce a literal string; backticks (or the much preferred syntax $(cat $IPLIST)) perform a command substitution, i.e. run the command and insert its output. But you should fix your quoting, and preferably read the file line by line instead. We can also get rid of the useless echo.
#!/bin/bash
IPLIST="/Users/mymac/Desktop/list2.txt"
while read IP; do
dig -x "$IP" +short | head -1
done < "$IPLIST" >results.csv
It seems that in your /etc/resolv.conf you have configured a nameserver which does not support reverse lookups, and that's why the responses are empty.
You can pass the DNS server which you want to use to the dig command. Let's say 8.8.8.8 (Google) for example:
dig @8.8.8.8 -x "$IP" +short | head -1
The command returns the domain with a . appended. If you want to remove that you can additionally pipe to sed:
... | sed 's/.$//'
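Putting it together for one of the sample addresses (the sed here anchors on a literal trailing dot):
dig @8.8.8.8 -x 216.58.219.78 +short | head -1 | sed 's/\.$//'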

Get Macbook screen size from terminal/bash

Does anyone know of any possible way to determine or glean this information from the terminal (in order to use in a bash shell script)?
On my Macbook Air, via the GUI I can go to "About this mac" > "Displays" and it tells me:
Built-in Display, 13-inch (1440 x 900)
I can get the screen resolution from the system_profiler command, but not the "13-inch" bit.
I've also tried with ioreg without success. Calculating the screen size from the resolution is not accurate, as this can be changed by the user.
Has anyone managed to achieve this?
I think you can only get the display model name, which holds a reference to the size:
ioreg -lw0 | grep "IODisplayEDID" | sed "/[^<]*</s///" | xxd -p -r | strings -6 | grep '^LSN\|^LP'
will output something like:
LP154WT1-SJE1
which depends on the display manufacturer. But as you can see, the first three digits in this model-name string imply the display size: 154 == 15.4''
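If you want to turn that model string into an approximate size automatically, here is a sketch built on the same ioreg pipeline; it assumes the panel name really does start with letters followed by three digits, which not every manufacturer honours:
panel=$(ioreg -lw0 | grep "IODisplayEDID" | sed "/[^<]*</s///" | xxd -p -r | strings -6 | grep '^LSN\|^LP')
# e.g. LP154WT1-SJE1 -> 15.4
size=$(echo "$panel" | sed 's/^[A-Za-z]*\([0-9][0-9]\)\([0-9]\).*/\1.\2/')
echo "${size}-inch"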
EDIT
Found a neat solution but it requires an internet connection:
curl -s http://support-sp.apple.com/sp/product?cc=`system_profiler SPHardwareDataType | awk '/Serial/ {print $4}' | cut -c 9-` |
sed 's|.*<configCode>\(.*\)</configCode>.*|\1|'
hope that helps
The next script:
model=$(system_profiler SPHardwareDataType | \
/usr/bin/perl -MLWP::Simple -MXML::Simple -lane '$c=substr($F[3],8)if/Serial/}{
print XMLin(get(q{http://support-sp.apple.com/sp/product?cc=}.$c))->{configCode}')
echo "$model"
will print for example:
MacBook Pro (13-inch, Mid 2010)
Or the same without perl but more command forking:
model=$(curl -s http://support-sp.apple.com/sp/product?cc=$(system_profiler SPHardwareDataType | sed -n '/Serial/s/.*: \(........\)\(.*\)$/\2/p')|sed 's:.*<configCode>\(.*\)</configCode>.*:\1:')
echo "$model"
It is fetched online from the Apple site by serial number, so you need an internet connection.
I've found that there seem to be several different Apple URLs for checking this info. Some of them seem to work for some serial numbers, and others for other machines.
e.g:
https://selfsolve.apple.com/wcResults.do?sn=$Serial&Continue=Continue&num=0
https://selfsolve.apple.com/RegisterProduct.do?productRegister=Y&country=USA&id=$Serial
http://support-sp.apple.com/sp/product?cc=$serial (last 4 digits)
https://selfsolve.apple.com/agreementWarrantyDynamic.do
However, the first two URLs are the ones that seem to work for me. Maybe it's because the machines I'm looking up are in the UK and not the US, or maybe it's due to their age?
Anyway, due to not having much luck with curl on the command line (The Apple sites redirect, sometimes several times to alternative URLs, and the -L option doesn't seem to help), my solution was to bosh together a (rather messy) PHP script that uses PHP cURL to check the serials against both URLs, and then does some regex trickery to report the info I need.
Once on my web server, I can now curl it from the terminal command line and it's bringing back decent results 100% of the time.
I'm a PHP novice so I won't embarrass myself by posting the script up in its current state, but if anyone's interested I'd be happy to tidy it up and share it on here (though admittedly it's a rather long-winded solution to what should be a very simple query).
This info really should be simply made available in system_profiler. As it's available through System Information.app, I can't see a reason why not.
For my bash script under GNU/Linux, I do the following to save the current resolution:
# Resolution Fix
echo `xrandr --current | grep current | awk '{print $8}'` >> /tmp/width
echo `xrandr --current | grep current | awk '{print $10}'` >> /tmp/height
sed -i 's/,//g' /tmp/height    # strip the trailing comma xrandr prints after the height
WIDTH=$(cat /tmp/width)
HEIGHT=$(cat /tmp/height)
rm /tmp/width /tmp/height
echo "$WIDTH"'x'"$HEIGHT" >> /tmp/Resolution
Resolution=$(cat /tmp/Resolution)
rm /tmp/Resolution
# Resolution Fix
And the following, in the same script, restores the resolution after exiting from some app or game on some OSes.
This executes the command directly:
ResolutionRestore=$(xrandr -s $Resolution)
If it doesn't execute, call the variable afterwards so that its content is executed:
$($ResolutionRestore)
And another way you can try is the following, for example:
RESOLUTION=$(xdpyinfo | grep -i dimensions: | sed 's/[^0-9]*pixels.*(.*).*//' | sed 's/[^0-9x]*//')
VRES=$(echo $RESOLUTION | sed 's/.*x//')
HRES=$(echo $RESOLUTION | sed 's/x.*//')
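For example, to rebuild the WIDTHxHEIGHT string from those two values and restore it with xrandr as in the snippet above:
Resolution="${HRES}x${VRES}"
xrandr -s "$Resolution"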
