Write a shell script to copy one file to a number of servers - shell

I have searched on Google but can't find a working answer. I want a shell script that copies a file to several servers using a for loop; most of my attempts at this fail.

You can use two scripts:
1. A server list, containing the destination hostnames, one per line.
2. A copy script, which basically cats the server list and then executes an scp command to copy the file to each host. It can also accept parameters if your server list differs per application. Below is a sample:
Usage()
{
echo "Usage: $0 [-a application] [-l level]"
echo " where application = {a, b, c, d}"
exit 1
}

SERVER_LIST=a.txt

for HOST in $(grep -v '^#' "$SERVER_LIST" | cut -d: -f2)
do
# spawn, expect and send are expect commands, not shell commands,
# so run them through an expect heredoc
/usr/bin/expect <<EOF
spawn /usr/bin/scp FILE user@$HOST:destinationDirectory
expect "*password:*"
send "$PASSWORD\r"
expect eof
EOF
done
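If password-less (key-based) SSH authentication is set up, the expect wrapper can be dropped entirely. A minimal sketch; the "user" login and the file/destination arguments are placeholders:

```shell
#!/bin/sh
# copy_to_all CMD FILE LIST DEST
# Runs CMD (normally scp) once per host in LIST, where LIST holds one
# hostname per line and "#" comment lines are skipped.
# "user" is a placeholder login; adjust to taste.
copy_to_all() {
    cmd=$1 file=$2 list=$3 dest=$4
    grep -v '^#' "$list" | while read -r host
    do
        "$cmd" "$file" "user@$host:$dest"
    done
}

# Typical use, assuming key-based SSH auth:
#   copy_to_all scp ./app.conf a.txt /opt/app
```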

Bash script sshd group check

Hello, I have a script written in Python that saves the output of Bash script modules. I have to write a module that checks whether the user is in group 203 using
cat /etc/group | grep sshd
and writes the result to Excel. When I execute the command on the host, I receive this output:
staff:!:,sshd,sosnix,xo29321,siwja8211,293912,29314
sshd:!:203:sshd
In this module, "compliant" and "actual_value" are the rows from the Excel sheet.
My module code:
module_id="XX.TEST"
echo " === $module_id module === "
command_output=`cat /etc/group |grep sshd`
if [ "$command_output" = "cat /etc/group |grep sshd" ]; then
compliant="Yes"
actual_value="GUID 203"
else
compliant="No"
actual_value="N/A"
fi
# SCRIPT RESULT
echo :::$module_id:::$compliant:::$actual_value:::
echo " === End of $module_id module === "
and this script writes "No" in compliant and "N/A" in actual_value in my Excel sheet.
This is the correct behavior.
You are comparing two strings: the first is the result of executing cat /etc/group | grep sshd, and the second is the command itself, the literal string "cat /etc/group |grep sshd".
These strings are not equal, so the if takes the else branch and you get the output you mentioned.
Please refer to https://www.gnu.org/software/bash/manual/bash.html, section "6.4 Bash Conditional Expressions", for more information.
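To make the check meaningful, test grep's exit status instead of comparing the output to the command string. A minimal sketch, parameterized on the group file so it can be tried against a sample; the GID extraction assumes the standard colon-separated /etc/group layout shown above:

```shell
#!/bin/sh
# check_sshd GROUP_FILE
# Prints ":::<module>:::<compliant>:::<actual_value>:::", deciding
# compliance from grep's exit status rather than a string comparison.
check_sshd() {
    group_file=$1
    module_id="XX.TEST"
    if grep -q '^sshd:' "$group_file"; then
        compliant="Yes"
        # the third colon-separated field of the sshd line is the numeric GID
        actual_value="GID $(grep '^sshd:' "$group_file" | cut -d: -f3)"
    else
        compliant="No"
        actual_value="N/A"
    fi
    echo ":::$module_id:::$compliant:::$actual_value:::"
}

# Typical use: check_sshd /etc/group
```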

Re-direct output of a shell to a txt file

I have a script written and I want to include a function in it that silently logs the console output to a .txt file. The printf statements in my shell scripts use colors for certain characters.
A sample:
# Color block
G="\033[32m"
N="\033[0m"
R="\033[31m"
Y="\033[33m"
# MCS Check
mcs=$(cat /home/admin/service-health.txt | grep -i mcs | cut -d ' ' -f 5 | tr . " ")
if [ "$mcs" == "up " ]
then
printf "${Y}MCS${N} Service Status is\t\t |${G}UP${N}\n"
else
printf "${Y}MCS${N} Service Status is\t\t |${R}DOWN${N}\n"
fi
Console output for this will display the color.
This is not mandatory in the .txt logging.
I will then be emailing this .txt to an address using:
sendmail $vdp $eaddr < /home/admin/health-check.txt
I used this block as I want to redirect the output within the script itself:
sudo touch /home/admin/health-check.txt
exec > >(tee -i /home/admin/health-check.txt)
exec 2>&1
But since this is a colored output, I keep getting this in my email:
[33mGSAN[0m Service Status is |[32mUP[0m
[33mMCS[0m Service Status is |[32mUP[0m
[33mTomcat[0m Service Status is |[32mUP[0m
[33mScheduler[0m Service Status is |[32mUP[0m
[33mMaintenance[0m Service Status is |[32mUP[0m
VDP [33mAccess State[0m is |[32mFULL[0m
Any thoughts about stripping colors during redirect? I do not want to use sed to find and replace as this looks tedious.
Thanks.
You can redirect output using the > operator: printf "mytext" > out.txt writes "mytext" to the file out.txt.
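To keep colors on the console while the logged copy stays plain, one option (a sketch, and it does use a single sed substitution, though only one generic rule rather than per-color find-and-replace) is to strip ANSI escape sequences on the tee branch that writes the file:

```shell
#!/bin/sh
# strip_colors: remove ANSI SGR escape sequences (e.g. \033[32m, \033[0m)
# from stdin. printf '\033' builds the literal ESC byte portably.
strip_colors() {
    esc=$(printf '\033')
    sed "s/${esc}\[[0-9;]*m//g"
}

# In the health-check script the redirection could become, for example:
#   exec > >(tee >(strip_colors > /home/admin/health-check.txt)) 2>&1
# The terminal keeps its colors; the emailed .txt gets plain text.
```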

Persistent AWK Program

I have been tasked with writing a BASH script to filter log4j files and pipe them over netcat to another host. One of the requirements is that the script must keep track of what it has already sent to the server and not send it again due to licensing constraints on the receiving server (the product on the server is licensed on a data-per-day model).
To achieve the filtering I'm using AWK encapsulated in a BASH script. The BASH component works fine - it's the AWK program that's giving me grief when I try to get it to remember what has already been sent to the server. I do this by grabbing the time stamp of a line each time a line matches my pattern. At the end of the program the last time stamp is written to a hidden file in the current working directory. On successive runs of the program AWK reads this file into a variable. Now each time a line matches the pattern, its time stamp is also compared to the one in the variable. If it is newer it is printed, otherwise it is not.
Desired Output:
INFO 2012-11-07 09:57:12,479 [[artifactid].connector.http.mule.default.receiver.02] org.mule.api.processor.LoggerMessageProcessor: MsgID=5017f1ff-1dfa-48c7-a03c-ed3c29050d12 InteractionStatus=Accept InteractionDateTime=2012-08-07T16:57:33.379+12:00 Retailer=CTCT RequestType=RemoteReconnect
Hidden File:
2012-10-11 12:08:19,918
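This bookkeeping works because zero-padded "YYYY-MM-DD HH:MM:SS,mmm" timestamps sort lexicographically in time order, so a plain string comparison is enough. A minimal check of that assumption:

```shell
#!/bin/sh
# newer TSTAMP_A TSTAMP_B
# Zero-padded "YYYY-MM-DD HH:MM:SS,mmm" strings compare correctly as
# plain strings, so awk's string ">" doubles as a time comparison.
newer() {
    awk -v a="$1" -v b="$2" 'BEGIN { print (a > b) ? "newer" : "not newer" }'
}
```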
So that's the theory, now my issue.
The script works fine for contrived/trivial examples such as:
INFO 2012-11-07 09:57:12,479 [[artifactid].connector.http.mule.default.receiver.02] org.mule.api.processor.LoggerMessageProcessor: MsgID=5017f1ff-1dfa-48c7-a03c-ed3c29050d12 InteractionStatus=Accept InteractionDateTime=2012-08-07T16:57:33.379+12:00 Retailer=CTCT RequestType=RemoteReconnect
However, if I run it over a full-blown log file with stack traces etc. in it, the indentation levels appear to wreak havoc on my program. The first run of the program will produce the desired results - matching lines will be printed and the latest time stamp written to the hidden file. Running it again is when the problem crops up. The output of the program contains the indented lines from stack traces etc. (see the block below) and I can't figure out why. This then corrupts the hidden file, as the last matching line doesn't contain a time stamp and garbage is written to it, making any further runs pointless.
Undesired output:
at package.reverse.domain.SomeClass.someMethod(SomeClass.java:233)
at package.reverse.domain.processor.SomeClass.process(SomeClass.java:129)
at package.reverse.domain.processor.someClass.someMethod(SomeClassjava:233)
at package.reverse.domain.processor.SomeClass.process(SomeClass.java:129)
Hidden file after:
package.reverse.domain.process(SomeClass.java:129)
My awk program:
FNR == 1 {
CMD = "basename " FILENAME
CMD | getline FILE;
FILE = "." FILE ".last";
if (system("[ -f "FILE" ]") == 0) {
getline FIRSTLINE < FILE;
close(FILE);
print FIRSTLINE;
}
else {
FIRSTLINE = "1970-01-01 00:00:00,000";
}
}
$0 ~ EXPRESSION {
if (($2 " " $3) > FIRSTLINE) {
print $0;
LASTLINE=$2 " " $3;
}
}
END {
if (LASTLINE != "") {
print LASTLINE > FILE;
}
}
Any assistance with finding out why this is happening would be greatly appreciated.
UPDATE:
BASH Script:
#!/bin/bash
while getopts i:r:e:h:p: option
do
case "${option}"
in
i) INPUT=${OPTARG};;
r) RULES=${OPTARG};;
e) PATFILE=${OPTARG};;
h) HOST=${OPTARG};;
p) PORT=${OPTARG};;
?) printf "Usage: %s: -i <\"file1.log file2.log\"> -r <\"rules1.awk rules2.awk\"> -e <\"patterns.pat\"> -h <host> -p <port>\n" $0;
exit 1;
esac
done
#prepare expression with sed
EXPRESSION=`cat $PATFILE | sed ':a;N;$!ba;s/\n/|/g'`;
EXPRESSION="^(INFO|DEBUG|WARNING|ERROR|FATAL)[[:space:]]{2}[[:digit:]]{4}\\\\-[[:digit:]]{1,2}\\\\-[[:digit:]]{1,2}[[:space:]][[:digit:]]{1,2}:[[:digit:]]{2}:[[:digit:]]{2},[[:digit:]]{3}.*"$EXPRESSION".*";
#Make sure the temp file is empty
echo "" > .temp;
#input through awk.
for file in $INPUT
do
awk -v EXPRESSION="$EXPRESSION" -f $RULES $file >> .temp;
done
#send contents of file to splunk indexer over udp
cat .temp;
#cat .temp | netcat -t $HOST $PORT;
#cleanup temporary files
if [ -f .temp ]
then
rm .temp;
fi
Patterns File (The stuff I want to match):
Warning
Exception
Awk script as above.
Example.log
info 2012-09-04 16:00:11,638 [[adr-com-adaptor-stub].connector.http.mule.default.receiver.02] nz.co.amsco.interop.multidriveinterop: session not initialised
error 2012-09-04 16:00:11,639 [[adr-com-adaptor-stub].connector.http.mule.default.receiver.02] nz.co.amsco.adrcomadaptor.processor.comadaptorprocessor: nz.co.amsco.interop.exceptions.systemdownexception
nz.co.amsco.interop.exceptions.systemdownexception
at nz.co.amsco.adrcomadaptor.processor.comadaptorprocessor.getdeviceconfig(comadaptorprocessor.java:233)
at nz.co.amsco.adrcomadaptor.processor.comadaptorprocessor.process(comadaptorprocessor.java:129)
at org.mule.processor.chain.defaultmessageprocessorchain.doprocess(defaultmessageprocessorchain.java:99)
at org.mule.processor.chain.abstractmessageprocessorchain.process(abstractmessageprocessorchain.java:66)
at org.mule.processor.abstractinterceptingmessageprocessorbase.processnext(abstractinterceptingmessageprocessorbase.java:105)
at org.mule.processor.asyncinterceptingmessageprocessor.process(asyncinterceptingmessageprocessor.java:90)
at org.mule.processor.chain.defaultmessageprocessorchain.doprocess(defaultmessageprocessorchain.java:99)
at org.mule.processor.chain.abstractmessageprocessorchain.process(abstractmessageprocessorchain.java:66)
at org.mule.processor.AbstractInterceptingMessageProcessorBase.processNext(AbstractInterceptingMessageProcessorBase.java:105)
at org.mule.interceptor.AbstractEnvelopeInterceptor.process(AbstractEnvelopeInterceptor.java:55)
at org.mule.processor.AbstractInterceptingMessageProcessorBase.processNext(AbstractInterceptingMessageProcessorBase.java:105)
Usage:
./filter.sh -i "Example.log" -r "rules.awk" -e "patterns.pat" -h host -p port
Note that host and port are both unused in this version as the output is just thrown onto stdout.
So if I run this I get the following output:
info 2012-09-04 16:00:11,638 [[adr-com-adaptor-stub].connector.http.mule.default.receiver.02] nz.co.amsco.interop.multidriveinterop: session not initialised
error 2012-09-04 16:00:11,639 [[adr-com-adaptor-stub].connector.http.mule.default.receiver.02] nz.co.amsco.adrcomadaptor.processor.comadaptorprocessor: nz.co.amsco.interop.exceptions.systemdownexception
at nz.co.amsco.adrcomadaptor.processor.comadaptorprocessor.getdeviceconfig(comadaptorprocessor.java:233)
at nz.co.amsco.adrcomadaptor.processor.comadaptorprocessor.process(comadaptorprocessor.java:129)
If I run it again on the same unchanged file I should get no output however I am seeing:
nz.co.amsco.adrcomadaptor.processor.comadaptorprocessor.process(comadaptorprocessor.java:129)
I have been unable to determine why this is happening.
You didn't provide any sample input that could reproduce your problem so let's start by just cleaning up your script and go from there. Change it to this:
BEGIN{
expression = "^(INFO|DEBUG|WARNING|ERROR|FATAL)[[:space:]]{2}[[:digit:]]{4}-[[:digit:]]{1,2}-[[:digit:]]{1,2}[[:space:]][[:digit:]]{1,2}:[[:digit:]]{2}:[[:digit:]]{2},[[:digit:]]{3}.*Exception|Warning"
# Do you really want "(Exception|Warning)" in brackets instead?
# As written "Warning" on its own will match the whole expression.
}
FNR == 1 {
tstampFile = "/" FILENAME ".last"
sub(/.*\//,".",tstampFile)
if ( (getline prevTstamp < tstampFile) > 0 ) {
close(tstampFile)
print prevTstamp
}
else {
prevTstamp = "1970-01-01 00:00:00,000"
}
nextTstamp = ""
}
$0 ~ expression {
currTstamp = $2 " " $3
if (currTstamp > prevTstamp) {
print
nextTstamp = currTstamp
}
}
END {
if (nextTstamp != "") {
print nextTstamp > tstampFile
}
}
Now, do you still have a problem? If so, show us how you run the script, i.e. the bash command you are executing, and post some small sample input that reproduces your problem.
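The precedence pitfall that the comment in the script warns about is easy to demonstrate: in an ERE, | binds more loosely than concatenation, so without grouping the bare right-hand alternative matches on its own. A small sketch (the patterns are simplified stand-ins for the real expression):

```shell
#!/bin/sh
# matches PATTERN LINE -> "yes"/"no"
# "A.*Exception|Warning" matches either "A.*Exception" OR a bare "Warning";
# "A.*(Exception|Warning)" requires the "A.*" prefix in both cases.
matches() {
    printf '%s\n' "$2" | grep -Eq "$1" && echo yes || echo no
}
```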

Bash add to end of file (>>) if not duplicate line

Normally I use something like this for processes I run on my servers
./runEvilProcess.sh >> ./evilProcess.log
However I'm currently using Doxygen and it produces lots of duplicate output
Example output:
QGDict::hashAsciiKey: Invalid null key
QGDict::hashAsciiKey: Invalid null key
QGDict::hashAsciiKey: Invalid null key
So you end up with a very messy log
Is there a way I can only add the line to the log file if the line wasn't the last one added?
A poor example (pseudocode - I'm not sure how to do this in bash):
$previousLine = ""
$outputLine = getNextLine()
if($previousLine != $outputLine) {
$outputLine >> logfile.log
$previousLine = $outputLine
}
If the process returns duplicate lines in a row, pipe the output of your process through uniq:
$ ./t.sh
one
one
two
two
two
one
one
$ ./t.sh | uniq
one
two
one
If the logs are sent to the standard error stream, you'll need to redirect that too:
$ ./yourprog 2>&1 | uniq >> logfile
(This won't help if the duplicates come from multiple runs of the program - but then you can pipe your log file through uniq when reviewing it.)
Create a filter script (filter.sh):
while read -r line; do
if [ "$last" != "$line" ]; then
echo "$line"
last=$line
fi
done
and use it:
./runEvilProcess.sh | sh filter.sh >> evillog
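Equivalently, the consecutive-duplicate filter can be a one-line awk program, which avoids a shell loop entirely (a sketch; the function name is just for illustration):

```shell
#!/bin/sh
# dedup_consecutive: print a line only when it differs from the
# immediately preceding line (same behavior as uniq / filter.sh).
dedup_consecutive() {
    awk 'NR == 1 || $0 != prev { print } { prev = $0 }'
}

# Example: ./runEvilProcess.sh | dedup_consecutive >> evillog
```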

Compare a file's contents with command output, then execute command and append file

1. File
A file /etc/ssh/ipblock contains lines that look like this:
2012-01-01 12:00 192.0.2.201
2012-01-01 14:15 198.51.100.123
2012-02-15 09:45 192.0.2.15
2012-03-12 21:45 192.0.2.14
2012-04-25 00:15 203.0.113.243
2. Command
The output of the command iptables -nL somechain looks like this:
Chain somechain (2 references)
target prot opt source destination
DROP all -- 172.18.1.4 anywhere
DROP all -- 198.51.100.123 anywhere
DROP all -- 172.20.4.16 anywhere
DROP all -- 192.0.2.125 anywhere
DROP all -- 172.21.1.2 anywhere
3. The task at hand
First I would like to get a list A of IP addresses that are existent in the iptables chain (field 4) but not in the file.
Then I would like to get a list B of IP addresses that are existent in the file but not in the iptables chain.
IP addresses in list A should then be appended to the file in the same style (date, time, IP)
IP addresses in list B should then be added to the iptables chain with
iptables -A somechain -d IP -j DROP
4. Background
I was hoping to expand my awk-fu so I have been trying to get this to work with an awk script that can be executed without arguments. But I failed.
I know I can get the output from commands with the getline command so I was able to get the time and date that way. And I also know that one can read a file using getline foo < file. But I have only had many failed attempts to combine this all into a working awk script.
I realise that I could get this to work with an other programming language or a shell script. But can this be done with an awk script that can be ran without arguments?
I think this is almost exactly what you were looking for. It does the job, all in one file, and the code is pretty much self-explanatory.
It is easily adaptable and extendable.
USAGE:
./foo.awk CHAIN ip.file
foo.awk:
#!/usr/bin/awk -f
BEGIN {
CHAIN= ARGV[1]
IPBLOCKFILE = ARGV[2]
while((getline < IPBLOCKFILE) > 0) {
IPBLOCK[$3] = 1
}
command = "iptables -nL " CHAIN
command |getline
command |getline
while((command |getline) > 0) {
IPTABLES[$4] = 1
}
close(command)
print "not in IPBLOCK (will be appended):"
command = "date +'%Y-%m-%d %H:%M'"
command |getline DATE
close(command)
for(ip in IPTABLES) {
if(!IPBLOCK[ip]) {
print ip
print DATE,ip >> IPBLOCKFILE
}
}
print "not in IPTABLES (will be appended):"
# command = "echo iptables -A " CHAIN " -s " //use for testing
command = "iptables -A " CHAIN " -s "
for(ip in IPBLOCK) {
if(!IPTABLES[ip]) {
print ip
system(command ip " -j DROP")
}
}
exit
}
Doing 1&3:
comm -13 <(awk '{print $3}' /etc/ssh/ipblock | sort) <(iptables -nL somechain | awk '/\./{print $4}' | sort) | xargs -n 1 echo `date '+%Y-%m-%d %H:%M'` >> /etc/ssh/ipblock
Doing 2&4:
comm -23 <(awk '{print $3}' /etc/ssh/ipblock | sort) <(iptables -nL somechain | awk '/\./{print $4}' | sort) | xargs -I IP iptables -A somechain -d IP -j DROP
The command is constructed of the following building blocks:
Bash's process substitution feature: it is somewhat similar to a pipe, but is typically used when a program requires two or more input files as arguments. Bash creates a FIFO file which effectively "contains" the output of the given command. In our case that output is a list of IP addresses.
The output of the awk scripts is then passed to the comm program. Both awk scripts are pretty simple: they just print an IP address per line. In the first case all the IPs are in the third column (hence $3); in the second case they are in the fourth column, but it is necessary to get rid of the column header (the string "destination"), so the simple regex /\./ is used: it filters out every line that doesn't contain a dot.
comm requires both inputs to be sorted, so the output of each awk script is sorted using sort.
Now the comm program receives both lists of IP addresses. When no options are given, it prints three columns: lines unique to FILE1, lines unique to FILE2, and lines common to both. Passing -23 gives only the lines unique to FILE1; similarly, passing -13 outputs only the lines unique to FILE2.
xargs is basically a "foreach" loop in the shell: it executes a given command once per input line (thanks to -n 1 or -I). In the second command the invocation is the desired iptables call; in the first, date simply outputs the current time in the proper format for the file.
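The comm selection logic can be checked in isolation before wiring in iptables. A small bash sketch (the function names are just for illustration):

```shell
#!/bin/bash
# comm needs sorted input; process substitution supplies the sorting.
# -13 prints lines unique to the second list, -23 lines unique to the first.
unique_to_second() { comm -13 <(sort "$1") <(sort "$2"); }
unique_to_first()  { comm -23 <(sort "$1") <(sort "$2"); }
```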
