Sorted output, needs to have text inserted between strings - bash

I'm trying to add predefined text around each line of a sorted output and save the result to a new file.
I'm using a curl command to gather my info.
$ curl --user XXX:1234!## "http://......"
Then I use grep to find the IP addresses and sort them so each appears only once.
$ curl --user XXX:1234!## "http://......" | grep -E -o -m1 '([0-9]{1,3}[\.]){3}[0-9]{1,3}' | sort -u
I need to wrap each matched IP address, i.e. produce <my_text_predefined> ([0-9]{1,3}[\.]){3}[0-9]{1,3} <my_text_predefined>, and then save the result to a new file.
The script below only gets me the IP addresses:
$ curl --user XXX:1234!## "http://......" | grep -E -o -m1 '([0-9]{1,3}[\.]){3}[0-9]{1,3}' | sort -u
123.12.0.12
123.56.98.76

$ curl --user some_user:password "http://...." | grep -E -o -m1 '([0-9]{1,3}[\.]){3}[0-9]{1,3}' | sort -u | sed 's/.*/<prefix> & <suffix>/'
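With the sample IPs above, the sed replaces each whole line (& recalls the matched text), so this prints:
<prefix> 123.12.0.12 <suffix>
<prefix> 123.56.98.76 <suffix>
Append > new_file.txt to the end of the pipeline to save the result.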

So if we need to print some text for each IP, try xargs:
for i in {1..100}; do echo $i; done | xargs -n1 echo "Values are:"
If you need to take a decision based on each IP, put it in a loop:
for file in $(curl ...); do ...; done
and check $file or do something with it; a minimal sketch follows.
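A minimal sketch of that loop shape, reusing the pipeline from the question (the credentials and URL are placeholders):
for ip in $(curl --user some_user:password "http://...." | grep -E -o -m1 '([0-9]{1,3}[\.]){3}[0-9]{1,3}' | sort -u); do
    # echo stands in for whatever per-address decision or action you need
    echo "<my_text_predefined> $ip <my_text_predefined>"
done > new_file.txt
Word-splitting inside $( ) is safe here because IP addresses never contain spaces.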


Cron job creates empty files

I want to preface this by saying I am a newbie who picked up shell scripting two weeks ago.
Hey guys, I need help with something; I hope someone can point me in the right direction. I have a script that works when I run it from the command line, but every time I run it from a crontab the output is a few empty files. Does anyone know why?
Here's the code:
#!/bin/bash
#Provide an IP address as an argument to use nmap
#make sure to add the full range with (0-255 or 0/24) at the end
IPADDRESS=$(hostname -I | awk '{print $1}')
network-scan(){
    if [ "$1" ]
    then
        sudo nmap -sn "$1"
    else
        sudo nmap -sn 192.168.1.0-255
    fi
}
#Scan the whole network and print only the IP addresses minus your own
#Sends the IP addresses to a file
network-scan | grep -i 'Nmap scan report' | \
sed 's/\ /\n/g'|sed 's/(//g'|sed 's/)//g' | \
grep '[0-9]*\.[0-9]*\.[0-9]*\.[0-9]*' | grep -v ${IPADDRESS} > ip_addresses
#Scan the whole network and only prints the MAC addresses
#Sends the MAC addresses to a file
network-scan | grep -i 'MAC Address:' | \
awk '{print $3}' > mac_addresses
#Put the IP and MAC addresses in the same file
paste ip_addresses mac_addresses | \
column -s $'\t' -t > "scan_$(date +%d-%m-%Y_%H:%M:%S)"
#Notify that a file with the IP and MAC addresses has been created on the Desktop
echo "A file containing the results of the scan has been created on the Desktop"
exit 0
You are using
network-scan | grep
without passing any parameter, hence the network-scan function always runs
sudo nmap -sn 192.168.1.0-255
When you run it from the command line, are you passing a parameter?
For debugging, add echo "$IPADDRESS" inside the script and compare its output when executed from cron and from the command line.
network-scan | grep -i 'Nmap scan report' | \
sed 's/\ /\n/g'|sed 's/(//g'|sed 's/)//g' | \
grep '[0-9]*\.[0-9]*\.[0-9]*\.[0-9]*' | grep -v ${IPADDRESS}
Since you are getting empty output, validate the pipeline one stage at a time, re-appending each filter in turn, to find where the required output is being dropped.
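A minimal debugging sketch along those lines; the log path and the focus on cron's sparse environment are my assumptions, not part of the original answer:
#!/bin/bash
# Cron runs with a minimal environment (PATH is often just /usr/bin:/bin),
# so commands such as nmap may not be found. Set PATH explicitly and log
# stderr so failures show up instead of silently producing empty files.
export PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
exec 2>>/tmp/network-scan-debug.log
echo "IPADDRESS=$(hostname -I | awk '{print $1}')" >> /tmp/network-scan-debug.log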

Sending grep data iteratively using curl for output of tail -f

I am using grep to extract data from a log file. The log file is continuously gaining new rows, and I need to send the grepped data to a REST endpoint using curl. This is easy for a static file, but I cannot find a solution for a running log file. How can I handle this situation?
eg: tail -f <file> | grep "<string>" > ~/<fileName>.log
The above can put the data in a file. Need to send it using a POST curl.
Maybe using a function like:
send_data(){
    curl -s -k -X POST --header 'Content-Type: application/json' \
        --header 'Accept: application/json' \
        "http://${HOST}${PORT}/v1/notify" \
        -d "$1"
}
If tail -f <file> | grep "<string>" > ~/<fileName>.log is working for you, then you could do:
tail -f file | stdbuf -i0 -o0 -e0 grep "<string>" | xargs -n 1 -d $'\n' curl ...
or:
while IFS= read -r line; do
    curl ... "$line"
done < <(tail -f file | stdbuf -i0 -o0 -e0 grep "<string>")
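Putting the two pieces together, a minimal sketch that assumes the log lives at /var/log/app.log and that the endpoint accepts the raw line as the POST body (wrap it in a JSON payload yourself if the endpoint insists on it):
# send_data as defined in the question; each matching line is POSTed
# as soon as it appears.
while IFS= read -r line; do
    send_data "$line"
done < <(tail -f /var/log/app.log | stdbuf -i0 -o0 -e0 grep "<string>")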

Pipe grep response to a second command?

Here's the command I'm currently running:
curl 'http://test.com/?id=12345' | grep -o -P '(?<=content="2;url=).*?(?=")'
The response from this command is a URL, like this:
$ curl 'http://test.com/?id=12345' | grep -o -P '(?<=content="2;url=).*?(?=")'
http://google.com
I want to use whatever that URL is to essentially do this:
curl 'http://test.com/?id=12345' | grep -o -P '(?<=content="2;url=).*?(?=")' | curl 'http://google.com'
Is there any simple way to do this all in one line?
Use xargs with a placeholder for the output from stdin via the -I{} flag, as below. The -r flag ensures the curl command is not invoked on empty output from the preceding grep.
curl 'http://test.com/?id=12345' | grep -o -P '(?<=content="2;url=).*?(?=")' | xargs -r -I{} curl {}
A short description of the two flags, -I and -r, from the GNU xargs man page:
-I replace-str
Replace occurrences of replace-str in the initial-arguments with
names read from standard input.
-r, --no-run-if-empty
If the standard input does not contain any nonblanks, do not run
the command. Normally, the command is run once even if there is
no input. This option is a GNU extension.
Or, if you are looking for a pure-bash approach without other tools:
curl 'http://test.com/?id=12345' | grep -o -P '(?<=content="2;url=).*?(?=")' | while IFS= read -r line; do [ -n "$line" ] && curl "$line"; done

curl complex usage with pattern

I'm trying to get 2 files using curl based on some pattern but that doesn't seem to work:
Files:
SystemOut_15.04.01_21.12.36.log
SystemOut_15.04.01_15.54.05.log
curl -f -k -u "login:password" https://myserver/cgi-bin/logviewer/index.cgi?getlogfile=SystemOut_15.04.01_21.12.36.log'&'server=qwerty123.com'&'numlines=100000000'&'appenv=MBL%20-%20PROD'&'directory=/app/WAS/was85/profiles/node/logs/mbl-server1
I know there is the -A option, but it doesn't work since my file name is embedded inside the link.
How can I extract those 2 files using a pattern?
I solved it myself: one curl gets the list of logs on the web page, another downloads those files.
The code looks like:
for file in $(curl -f -k -u "user:pwd" https://selfservice.pwj.com/cgi-bin/logviewer/index.cgi?listdirectory=/app/smx_client_mob/data/log'&'appenv=MBL%20-%20PROD'&'server=xshembl04pap.she.pwj.com | \
    grep href | sed 's/.*href="//' | sed 's/".*//' | \
    sed 's/javascript:getLog//g' | sed "s/['();]//g" | \
    grep -i 'service' | grep '^[a-zA-Z].*'); do
    curl -o "$file" -f -k -u "user:pwd" https://selfservice.pwj.com/cgi-bin/logviewer/index.cgi?getlogfile="$file"'&'server=xshembl04pap.she.pwj.com'&'numlines=100000000'&'appenv=MBL%20-%20PROD'&'directory=/app/smx_client_mob/data/log
done
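As an aside, single-quoting the whole URL avoids escaping every & separately; the download request above could equally be written as:
curl -o "$file" -f -k -u "user:pwd" 'https://selfservice.pwj.com/cgi-bin/logviewer/index.cgi?getlogfile='"$file"'&server=xshembl04pap.she.pwj.com&numlines=100000000&appenv=MBL%20-%20PROD&directory=/app/smx_client_mob/data/log'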

Generating bash script arrays with elements containing spaces from commands

I have a script that logs in to a remote host to pull a directory listing to later present options to the user. It was all working perfectly, until some of the directories started having spaces in them. I have tried several syntaxes and googled the life out of this and I am now at the end of my tether. The original command was this:
SERVERDIRS=($(sshpass -p $PASS ssh -oStrictHostKeyChecking=no $USER@$SERVER ls -l --time-style="long-iso" $FROMFOLDER | egrep '^d' | awk '{print $8}'))
I first changed this code to be able to read the spaces, like this:
SERVERDIRS=($(sshpass -p $PASS ssh -oStrictHostKeyChecking=no $USER@$SERVER ls -l --time-style="long-iso" $FROMFOLDER | egrep '^d' | cut -d' ' -f8-))
However, this resulted in each word being treated as a separate element. I have tried many ways to solve this, including:
SERVERDIRS=($(sshpass -p $PASS ssh -oStrictHostKeyChecking=no $USER@$SERVER ls -d $FROMFOLDER* |rev| cut -d'/' -f1|rev|sed s/^/\"/g|sed s/$/\"/g))
SERVERDIRS=($(sshpass -p $PASS ssh -oStrictHostKeyChecking=no $USER@$SERVER ls -d $FROMFOLDER* |rev| cut -d'/' -f1|rev|sed 's/ /\\ /g'))
SERVERDIRS=(`sshpass -p $PASS ssh -oStrictHostKeyChecking=no $USER@$SERVER ls -d $FROMFOLDER* |rev| cut -d'/' -f1|rev|sed 's/ /\\ /g'`)
How can I resolve these directories in to separate elements correctly?
If you're trying to read one array value per line instead of space-separated, then $() syntax won't help. Try readarray (Bash 4):
readarray -t SERVERDIRS < <(sshpass -p $PASS ssh -oStrictHostKeyChecking=no $USER@$SERVER ls -l --time-style="long-iso" $FROMFOLDER | egrep '^d' | cut -d' ' -f8-)
or assign IFS and read with -d, -r, and -a set:
IFS=$'\n' read -d '' -r -a SERVERDIRS < <(sshpass -p $PASS ssh -oStrictHostKeyChecking=no $USER@$SERVER ls -l --time-style="long-iso" $FROMFOLDER | egrep '^d' | cut -d' ' -f8-)
or, really, any other answer to this SO question.
If you're unfamiliar with <() syntax, it's known as process substitution and will allow your variable to be set in your current environment rather than the instantly-discarded subshell that a pipe would create.
Bear in mind that this process is a little dangerous; filenames can also contain newlines, so it's usually much preferred to use find ... -print0.
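For completeness, a sketch of that null-delimited variant; it assumes Bash 4.4+ for readarray -d and find available on the remote host:
# NUL-delimited names survive spaces and newlines alike.
readarray -d '' -t SERVERDIRS < <(
    sshpass -p "$PASS" ssh -oStrictHostKeyChecking=no "$USER@$SERVER" \
        find "$FROMFOLDER" -maxdepth 1 -type d -print0
)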
If you only need to list directories, try this:
ls -d /usr/local/src/*/
or
ls -d /path/to/your/directory/*/
You can then loop through all directories:
#!/bin/bash
# Glob straight into an array; each directory becomes one element.
aa=( /usr/local/src/*/ )
for dir in "${aa[@]}"
do
    echo "$dir"
done
This works even if directory names contain spaces, because the glob expands directly into array elements without word-splitting (capturing the output of ls, by contrast, would split on the spaces).
