I am trying to take an nmap scan result, determine the HTTP ports (http, https, http-alt, ...) and capture their IPs and ports in order to automatically perform web app scans.
I have my nmap results in grepable format and am using grep to delete any lines that do not contain the string "http", but I am now unsure how to proceed.
Host: 127.0.0.1 (localhost) Ports: 3390/open/tcp//dsc///, 5901/open/tcp//vnc-1///, 8000/open/tcp//http-alt/// Ignored State: closed (65532)
This is my current result. From this I can get the IP of hosts with an HTTP server open by using the cut command and taking the second field, which solves the first part of my problem.
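For reference, that step looks something like this (assuming the grepable output shown above is saved as results.gnmap):
grep http results.gnmap | cut -d ' ' -f 2    # second field is the host IP, e.g. 127.0.0.1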
But now I am looking for a way to only get (from the above example)
8000/open/tcp//http-alt///
(NB: I'm not looking to solve just this specific case; using
cut -f 3 -d "," will work for this example, but if the http server were in the first field it would not.)
After that I can use the cut command to get the port and then add it to a file with the IP, resulting in
127.0.0.1:8000
Could anyone advise a good way to do this?
Below is the code of my simple bash script for doing a basic scan of all ports, then doing a more advanced one based on the open ports found. The next step and objective is to automatically scan identified web apps with a directory scan and a nikto scan.
#!/bin/bash
echo "Welcome to the quick lil tool. This runs a basic nmap scan, collects open ports and does a more advanced scan. reducing the time needed"
echo -e "\nUsage: ./getPorts.sh [Hosts]\n"
if [ $# -eq 0 ]
then
echo "No argument specified. Usage: ./getPorts.sh [Host or host file]"
exit 1
fi
if [[ "$EUID" -ne 0 ]]; then
echo "Not running as root"
exit 1
fi
nmap -iL "$1" -p- -oA results
#Parse the .gnmap output and build a comma-separated list of all open ports
awk -F'[/ ]' '{h=$2; for(i=1;i<=NF;i++){if($i=="open"){print h,":",$(i-1)}}}' results.gnmap | awk -F ':' '{print $2}' | sed -z 's/\n/,/g;s/,$/\n/' >> ports.list
#more advanced nmap scan
ports=$(cat ports.list)
echo $ports
nmap -p $ports -sC -sV -iL "$1"
EDIT: Found a way. Not sure why I was so focused on using the gnmap format for this; if I use the regular .nmap format I can simply grep the lines containing http and use cut to get the first field.
(cat results.nmap | grep 'http' | cut -d "/" -f 1)
EDIT2: I realised the method mentioned in my first edit is not optimal when processing multiple results, as I then have a list of IPs from the .nmap and a list of ports from the .gnmap. I have found a good solution to my problem using a single file; see below:
#!/bin/bash
httpalt=$(cat test.gnmap | awk '/\/http-alt\// {for(i=5;i<=NF;i++)if($i~"/open/.+/http-alt/"){sub("/.*","",$i); print "http://"$2":"$i}}')
if [ -z "$httpalt" ]
then
echo "No http-alt servers found"
else
echo "http-alt servers found"
echo $httpalt
printf "\n"
fi
http=$(cat test.gnmap | awk '/\/http\// {for(i=5;i<=NF;i++)if($i~"/open/.+/http/"){sub("/.*","",$i);print "http://"$2":"$i}}')
if [ -z "$http" ]
then
echo "No http servers found"
else
echo "http servers found"
echo $http
printf "\n"
fi
https=$(cat test.gnmap | awk '/\/https\// {for(i=5;i<=NF;i++)if($i~"/open/.+/https/"){sub("/.*","",$i); print "https://"$2":"$i}}')
if [ -z "$https" ]
then
echo "No http servers found"
else
echo "https servers found"
echo $https
printf "\n"
fi
echo ----
printf "All ip:webapps \n"
webserver=$(echo "$httpalt $http $https" | sed -e 's/\s\+/,/g'|sed -z 's/\n/,/g;s/,$/\n/')
if [[ ${webserver::1} == "," ]]
then
webserver="${webserver#?}"
else
echo 0; fi
for webservers in $webserver; do
echo $webservers
done
echo $https
https=$(echo "$https" | sed -e 's/\s\+/,/g'|sed -z 's/\n/,/g;s/,$/\n/')
echo $https
mkdir https
mkdir ./https/nikto/
mkdir ./https/dirb/
for onehttps in ${https//,/ }
do
echo "Performing Dirb and nikto for https"
dirb $onehttps > ./https/dirb/https_dirb
nikto -url $onehttps > ./https/nikto/https_nitko
done
mkdir http
mkdir ./http/nikto
mkdir ./http/dirb/
for onehttp in ${http//,/ }
do
echo $onehttp
echo "Performing Dirb for http"
dirb $onehttp >> ./http/dirb/http_dirb
nikto -url $onehttp >> ./http/nikto/http_nikto
done
mkdir httpalt
mkdir httpalt/nikto/
mkdir httpalt/dirb/
for onehttpalt in ${httpalt//,/ }
do
echo "Performing Dirb for http-alt"
dirb $onehttpalt >> ./httpalt/dirb/httpalt_dirb
nikto -url $onehttpalt >> ./httpalt/nikto/httpalt_nikto
done
This will check for any http, https, and http-alt servers, store them in a variable, check for duplicates and remove any leading commas. It is far from perfect, but it is a good solution for now!
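For anyone who wants something more compact, here is a sketch of a single awk pass over the same .gnmap layout that covers all three service names (untested beyond the example line above, so treat it as a starting point):
awk '{for(i=5;i<=NF;i++) if($i ~ /\/open\/tcp\/\/(http|https|http-alt)\//) {split($i,f,"/"); s=(f[5]=="https")?"https":"http"; print s "://" $2 ":" f[1]}}' test.gnmap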
Just want to share a brilliant open source tool on GitHub that can be used to easily parse Nmap XML files.
https://github.com/honze-net/nmap-query-xml
I use some of its Python code to extract http/https URLs from the Nmap XML file.
# pip3 install python-libnmap
import sys

from libnmap.parser import NmapParser

def extract_http_urls_from_nmap_xml(file):
    try:
        report = NmapParser.parse_fromfile(file)
        urls = []
    except IOError:
        print("Error: Nmap XML file %s not found. Quitting!" % file)
        sys.exit(1)
    for host in report.hosts:
        for service in host.services:
            filtered_services = "http,http-alt,http-mgmt,http-proxy,http-rpc-epmap,https,https-alt,https-wmap,http-wmap,httpx"
            if (service.state == "open") and (service.service in filtered_services.split(",")):
                line = "{service}{s}://{hostname}:{port}"
                # Placeholders not present in the template above are kept from the
                # original template-based tool; their replace calls are no-ops here.
                line = line.replace("{xmlfile}", file)
                line = line.replace("{hostname}", host.address if not host.hostnames else host.hostnames[0])  # TODO: Fix naive code.
                line = line.replace("{hostnames}", host.address if not host.hostnames else ", ".join(list(set(host.hostnames))))  # TODO: Fix naive code.
                line = line.replace("{ip}", host.address)
                line = line.replace("{service}", service.service)
                line = line.replace("{s}", "s" if service.tunnel == "ssl" else "")
                line = line.replace("{protocol}", service.protocol)
                line = line.replace("{port}", str(service.port))
                line = line.replace("{state}", str(service.state))
                line = line.replace("-alt", "")
                line = line.replace("-mgmt", "")
                line = line.replace("-proxy", "")
                line = line.replace("-rpc-epmap", "")
                line = line.replace("-wmap", "")
                line = line.replace("httpx", "http")
                urls.append(line)
    return list(dict.fromkeys(urls))
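If it helps, here is some purely hypothetical bash glue showing how the function above could slot into a scanning workflow, assuming you save it as nmap_urls.py with a small main that prints one URL per line (neither that filename nor that main exists in the snippet above):
nmap -sV -oX results.xml -iL hosts.txt
python3 nmap_urls.py results.xml | while read -r url; do
safe=$(printf '%s' "$url" | tr -c '[:alnum:]' '_')   # make a filesystem-safe name from the URL
nikto -url "$url" > "nikto_${safe}.txt"
dirb "$url" > "dirb_${safe}.txt"
done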
printf "Host: 127.0.0.1 (localhost) Ports: 3390/open/tcp//dsc///, 5901/open/tcp//vnc-1///, 8000/open/tcp//http-alt/// Ignored State: closed (65532)" > file
cat file | tr -s ' ' | tr ',' '\n' | sed 's#^ ##g' > f2
string=$(sed -n '3p' f2 | cut -d' ' -f1)
It is only horizontal search which is difficult; vertical search is easy. You can get any string out of any text you like, as long as you can get the string onto its own line and then determine which line you need to print.
You only need complex regular expressions if you are relying exclusively on horizontal search. In almost all cases, as long as your substring is on its own line, cut can take you the rest of the way.
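Continuing that idea, a short sketch (using the file and f2 produced above) that pulls out the port and pairs it with the host IP, giving the ip:port format asked for in the question:
ip=$(cut -d ' ' -f 2 file)      # second field of the "Host:" line -> 127.0.0.1
port="${string%%/*}"            # strip everything from the first "/" onwards -> 8000
echo "$ip:$port"                # 127.0.0.1:8000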
I'm creating a tool that parses an input file to reconstruct a mailx command (the server that creates the data for mailx cannot send emails, so I need to store the data in a file so another server can rebuild the command and send it). I could output the whole command to a file and execute that file on the other server, but that's hardly secure or safe... anyone could intercept the file and insert malicious content that would be run as root - this parsing tool checks every minute for files to parse and email, using a systemd timer and service.
I have created the file, using 'markers / separators' with this format:
-------MESSAGE START-------
Email Body Text
Goes Here
-------MESSAGE END-------
-------SUBJECT START-------
Email Subject
-------SUBJECT END-------
-------ATTACHEMENT START-------
path to file to attach if supplied
-------ATTACHEMENT END-------
-------S OPTS START-------
list of mailx '-S' options, e.g. from=EMAILNAME <email@b.c> or sendwait etc., each one on a new line
-------S OPTS END-------
-------EMAIL LIST START-------
string of recipient emails comma separated eg. email1,email2,email3 etc..
-------EMAIL LIST END-------
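For what it's worth, a sketch of how the producing server might write one of these flag files (the path and addresses here are placeholders, not taken from the original setup):
cat > "<PATH TO LOCATION>/email_flags/alert_$(date +%s)" <<'EOF'
-------MESSAGE START-------
Email Body Text
Goes Here
-------MESSAGE END-------
-------SUBJECT START-------
Email Subject
-------SUBJECT END-------
-------EMAIL LIST START-------
email1@example.com,email2@example.com
-------EMAIL LIST END-------
EOF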
And I have a program to parse this file and rebuild, and run the mailx command:
#!/bin/bash
## Using systemd logging to journal for this as its now being called as part of a service
## See: https://serverfault.com/questions/573946/how-can-i-send-a-message-to-the-systemd-journal-froma-the-command-line (kkm answer)
start_time="$(date +[%c])"
exec 4>&2 2> >(while read -r REPLY; do printf >&4 '<3>%s\n' "$REPLY"; done)
echo >&4 "<5>$start_time -- Started gpfs_flag_email.sh"
trap_exit(){
exec 2>&4
}
trap 'trap_exit' EXIT
email_flag_path="<PATH TO LOCATION>/email_flags/"
mailx_message_start="-------MESSAGE START-------"
mailx_message_end="-------MESSAGE END-------"
mailx_subject_start="-------SUBJECT START-------"
mailx_subject_end="-------SUBJECT END-------"
mailx_attachement_start="-------ATTACHEMENT START-------"
mailx_attachement_end="-------ATTACHEMENT END-------"
mailx_s_opts_start="-------S OPTS START-------"
mailx_s_opts_end="-------S OPTS END-------"
mailx_to_email_start="-------EMAIL LIST START-------"
mailx_to_email_end="-------EMAIL LIST END-------"
no_attachment=false
no_additional_opts=false
additional_args_switch="-S "
num_files_in_flag_path="$(find $email_flag_path -type f | wc -l)"
if [[ $num_files_in_flag_path -gt 0 ]]; then
for file in $email_flag_path*; do
email_message="$(awk "/$mailx_message_start/,/$mailx_message_end/" $file | egrep -v -- "$mailx_message_start|$mailx_message_end")"
email_subject="$(awk "/$mailx_subject_start/,/$mailx_subject_end/" $file | egrep -v -- "$mailx_subject_start|$mailx_subject_end")"
email_attachment="$(awk "/$mailx_attachement_start/,/$mailx_attachement_end/" $file | egrep -v -- "$mailx_attachement_start|$mailx_attachement_end")"
email_additional_opts="$(awk "/$mailx_s_opts_start/,/$mailx_s_opts_end/" $file | egrep -v -- "$mailx_s_opts_start|$mailx_s_opts_end")"
email_addresses="$(awk "/$mailx_to_email_start/,/$mailx_to_email_end/" $file | egrep -v -- "$mailx_to_email_start|$mailx_to_email_end" | tr -d '\n')"
if [[ -z "$email_message" || -z "$email_subject" || -z "$email_addresses" ]]; then
echo >&2 "MISSING DETAILS IN INPUT FILE $file.... Exiting With Error"
exit 1
fi
if [[ -z "$email_attachment" ]]; then
no_attachment=true
fi
if [[ -z "$email_additional_opts" ]]; then
no_additional_opts=true
else
additional_opts_string=""
while read -r line; do
if [[ ! $line =~ [^[:space:]] ]]; then
continue
else
additional_opts_string="$additional_opts_string \"${additional_args_switch} '$line'\""
fi
done <<<"$(echo "$email_additional_opts")"
additional_opts_string="$(echo ${additional_opts_string:1} | tr -d '\n')"
fi
if [[ $no_attachment = true ]]; then
if [[ $no_additional_opts = true ]]; then
echo "$email_message" | mailx -s "$email_subject" $email_addresses
else
echo "$email_message" | mailx -s "$email_subject" $additional_opts_string $email_addresses
fi
else
if [[ $no_additional_opts = true ]]; then
echo "$email_message" | mailx -s "$email_subject" -a $email_attachment $email_addresses
else
echo "$email_message" | mailx -s "$email_subject" -a $email_attachment $additional_opts_string $email_addresses
fi
fi
done
fi
find $email_flag_path -type f -delete
exit 0
There is however an issue with the above that I just can't work out... the -S opts completely screw up the email headers and I end up with emails being sent to the wrong people (I have set reply-to and from options, but the email header is jumbled and the reply-to email ends up in the To: field), like this:
To: Me <a@b.com>, sendwait@a.lan , -S@a.lan <-s@a.lan>, another-email <another@b.com>
All I'm trying to do is rebuild the command as if I'd typed it in the CLI:
echo "EMAIL BODY MESSAGE" | mailx -s "EMAIL SUBJECT" -S "from=EMAILNAME <email#b.c>" -S "replyto=EMAILNAME <email#b.c>" -S sendwait my.email#b.com
I've tried quoting in ' ' and " ", quoting the other mailx parameters around it, etc. I have written other tools that pass variables as input arguments, so I just cannot understand how I'm screwing this up...
Any help would be appreciated...
EDIT
Thanks to Gordon Davisson's really helpful comments I was able to not only fix it but also understand the fix, by using an array and appropriately quoting the variables. The tip about using printf was really helpful in understanding what I was doing wrong and how to correct it :P
declare -a mailx_args_array
...
num_files_in_flag_path="$(find $email_flag_path -type f | wc -l)"
if [[ $num_files_in_flag_path -gt 0 ]]; then
for file in $email_flag_path*; do
....
mailx_args_array+=( -s "$email_subject" )
if [[ ! -z "$email_attachment" ]]; then
mailx_args_array+=( -a "$email_attachment" )
fi
if [[ ! -z "$email_additional_s_opts" ]]; then
while read -r s_opt_line; do
mailx_args_array+=( -S "$s_opt_line" )
done < <(echo "$email_additional_s_opts")
fi
mailx_args_array+=( "$email_addresses" )
echo "$email_message" | mailx "${mailx_args_array[#]}"
done
fi
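As a quick debugging aid (building on that printf tip), you can print the array shell-quoted to see exactly which words mailx will receive:
printf 'mailx arg: %q\n' "${mailx_args_array[@]}"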
I wrote a little bash script that I want to start automatically when I log in to my desktop.
The script shall always run in the background and periodically check for new incoming mail messages. When a new message arrives, the script shall pop up a notification via notify-send and display its content.
However, if I send myself an email (from another address) to check whether it's working, it seems the message has already been consumed, even though in the Gmail web interface (which I keep closed) the message is marked as unread. Obviously, since even the new message is not counted as new, the script doesn't fetch it.
I also switched off my Android phone, because I think it could interfere, and I'm sure I have no other mail clients running.
The output of the script is the following; note that between these two lines I sent myself a message:
254 messages, 0 new
255 messages, 0 new
The code follows:
#!/bin/bash
SERVER="imap.gmail.com"
while :
do
echo "1 login myusername mypassword" > /tmp/checkmail
echo "2 select inbox" >> /tmp/checkmail
echo "3 logout" >> /tmp/checkmail
response=$(openssl s_client -crlf -connect $SERVER:993 -quiet 2> /dev/null < /tmp/checkmail)
rm /tmp/checkmail
news=$(echo "$response" | grep RECENT | awk '{print $2}')
last=$(echo "$response" | grep EXISTS | awk '{print $2}')
echo "$last messages, $news new"
if [ "$news" != "0" ]
then
for (( i=0; i<$news; i++))
do
echo "fetching $i° message"
echo "1 login myusername mypassword" > /tmp/getmail
echo "2 select inbox" >> /tmp/getmail
echo "3 fetch $((last-i)) (body[1])" >> /tmp/getmail
echo "4 logout" >> /tmp/getmail
response=$(openssl s_client -crlf -connect $SERVER:993 -quiet 2> /dev/null < /tmp/getmail)
rm /tmp/getmail
content=$(echo "$response" | awk '/FETCH/{flag=1;next}/3 OK Success/{flag=0}flag')
notify-send -t 0 "New message" "$content"
done
fi
sleep 60
done
Thank you in advance
I experienced the very same problem with the Gmail IMAP server. I have therefore resorted to modifying your script, adding one extra IMAP command:
echo "1 login $myusername $mypassword" > /tmp/checkmail
echo "2 select inbox" >> /tmp/checkmail
echo "3 STATUS inbox (UNSEEN)" >> /tmp/checkmail
echo "4 logout" >> /tmp/checkmail
and then replacing your definition of $news with the one below
news=$(echo "$response" | grep UNSEEN | awk '{print $5}' | sed 's/)//' | tr -d '\r')
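For context, the untagged reply that this grep matches typically looks something like the line below, which is why field 5 holds the count and the closing parenthesis has to be stripped:
* STATUS "inbox" (UNSEEN 2)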
Note that removing the '\r' was required for the rest of the script to work.
I am nonetheless still struggling to make the next steps of your original script work on my setup.
I am trying to build a script to monitor any modifications to files on my FTP site. The script is given below. I have used wc -l to count the number of files in the directory and have defined a constant for the expected count, so that if there is any modification (for example my co-worker updating files on the FTP) it will send me a notification. I am trying to run this via cron, but the script just hangs after the count and doesn't give me the expected result. Is there anything wrong with the code? I am just a beginner in Bash; could anyone help me solve this?
#!/usr/bin/bash
curl ftp://Sterst:abh89TbuOc#############################/Test/| wc -l ;
read b;
a=9
if [ "$b" != "$a" ];
then
echo "FTP dir has modified mail" -s "dir notification" sni912#######.com;
fi
A couple of notes about your code:
#!/usr/bin/bash
curl ftp://Sterst:abh89TbuOc#############################/Test/| wc -l ;
read b;
That does not do what you think it does: the wc output goes to stdout, not into the read command. Do this instead: b=$( curl ... | wc -l )
a=9
if [ "$b" != "$a" ];
Since the wc output will have some extra whitespace, better to do a numeric comparison:
if (( a != b ))
then
echo "FTP dir has modified mail" -s "dir notification" sni912#######.com;
You probably mean this:
echo "FTP dir has modified" | mail -s "dir notification" sni912#######.com;
I would write this:
listing=$( curl ftp://... )
num=$( echo "$listing" | wc -l )
if (( num != 9 )); then
mail -s "ftp dir modification" email#example.com <<END
FTP directory modification
$listing
END
fi
My code
#!/usr/local/bin/bash
listing=$( curl ftp://username:password@ftp.com/test/ )
num=$( echo "$listing" | wc -l )
if (( num != 9 ));
then
mail -s "ftp dir modification" email#example.com
fi `
I am currently trying to make a shell script that will email me and other recipients if the website says it is down for maintenance. Currently I am using curl and grep, piping to a variable if grep sees the phrase "Down for Maintenance", but even when the website does not say that, it still outputs information. I want it so that if the phrase exists a variable is set to true, else it is false and the script just exits. BTW this is for a cronjob.
Here is what I have come up with so far. P.S. sorry for being such a noob.
## Sends an email if the website is down for maintenance
#RESPONSE = ''
curl websiteaddress.com | grep "Down for Maintenance" | read RESPONSE
if $RESPONSE
then
echo "Website is Down" | mail -s "Website is down for maintenance" email#address.com
end else
exit
Change:
curl websiteaddress.com | grep "Down for Maintenance" | read RESPONSE
if $RESPONSE
To:
curl websiteaddress.com | grep -q "Down for Maintenance"
if [ $? -eq 0 ]; then
echo "Website is Down" | mail -s "Website is down for maintenance" email@address.com
fi
The grep -q tells grep to operate in "quiet mode", meaning it won't output anything. Instead it will exit with a return code of zero if a match was found, 1 if no match was found. The if [ $? -eq 0 ] checks the return code of grep.
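A slightly more compact variant of the same idea (just a sketch; websiteaddress.com and the address are placeholders) folds grep's exit status into the if itself and silences curl's progress output:
if curl -s websiteaddress.com | grep -q "Down for Maintenance"; then
echo "Website is Down" | mail -s "Website is down for maintenance" email@address.com
fi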
Your RESPONSE is set correctly, but in another process, because of the pipe.
In your calling script it is unknown.
If you need the RESPONSE, you can do the following:
read RESPONSE <<< "$(curl websiteaddress.com | grep "Down for Maintenance")"