awk remote ssh command - bash

I have a bash script that runs a remote awk command, but I guess I haven't correctly escaped the special characters, since no file is generated on the remote server. Yet I get no error.
My variables are declared locally and can be used remotely without issue (other parts of the script confirm this).
ssh -q -t server '
logfiles=$(find /var/log/httpd/ -type f -name *access_log -cmin -'"$past"')
for log in $logfiles;
awk -vDate=\`date -d'now-'"$past"' minutes' +[%d/%b/%Y:%H:%M:%S\` ' { if \(\$4 > Date\) print \$0}' $log | sort |uniq -c |sort -n | tail | cut -d " " -f 11,15,16
done
'
Thank you!
EDIT1:
passing this script
#!/bin/bash
logfiles=$(find /var/log/httpd/ -type f -name *access_log -cmin -120)
for log in $logfiles; do
awk -vDate=`date -d'now-120 minutes' +[%d/%b/%Y:%H:%M:%S` ' { if ($4 > Date) print $0}' $log | sort |uniq -c |sort -n | tail | cut -d " " -f 11,15,16 > /root/httpd.log;
done
works when run like this:
ssh user@host < script.sh
When I run the same script from the console:
ssh -q -t $apache '
logfiles=$(find /var/log/httpd/ -type f -name *access_log -cmin -120)
for log in $logfiles; do
awk -vDate=`date -d'now-120 minutes' +[%d/%b/%Y:%H:%M:%S` ' { if ($4 > Date) print $0}' $log | sort |uniq -c |sort -n | tail | cut -d " " -f 11,15,16 > /root/httpd.log;
done'
-bash: syntax error near unexpected token `('
So I tried to escape the parentheses:
ssh -q -t $apache '
logfiles=$(find /var/log/httpd/ -type f -name *access_log -cmin -120)
for log in $logfiles; do
awk -vDate=`date -d'now-120 minutes' +[%d/%b/%Y:%H:%M:%S` ' { if \($4 > Date\) print $0}' /var/log/httpd/royalcanin_com.access_log | sort |uniq -c |sort -n | tail | cut -d " " -f 11,15,16 > /root/httpd.log;
done'
but then nothing is generated.
EDIT2:
The file is generated on the server, but empty, with this:
awk -vDate=\`date -d'now-120 minutes' +[%d/%b/%Y:%H:%M:%S\` ' { if '"($4 > Date)"' print $0}' $log | sort |uniq -c |sort -n | tail | cut -d " " -f 11,15,16 > /root/httpd.log;done'

You also need to escape your single quotes. For example, the first script…
ssh -q -t server '
logfiles=$(find /var/log/httpd/ -type f -name *access_log -cmin -'"$past"')
for log in $logfiles;
awk -vDate=\`date -d'now-'"$past"' minutes' +[%d/%b/%Y:%H:%M:%S\` ' { if \(\$4 > Date\) print \$0}' $log | sort |uniq -c |sort -n | tail | cut -d " " -f 11,15,16
done
'
… must really be written as:
ssh -q -t server '
logfiles=$(find /var/log/httpd/ -type f -name *access_log -cmin -'\''"$past"'\'')
for log in $logfiles; do
awk -vDate=\`date -d'\''now-'\''"$past"'\'' minutes'\'' +[%d/%b/%Y:%H:%M:%S\` '\'' { if \(\$4 > Date\) print \$0}'\'' $log | sort |uniq -c |sort -n | tail | cut -d " " -f 11,15,16
done
'
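Each `'\''` works by ending the current single-quoted string, adding an escaped literal quote, and starting a new single-quoted string. A minimal local sketch of the mechanism (no ssh needed):

```shell
# '\'' = close the quote, emit an escaped literal quote, reopen the quote
msg='it'\''s quoted'
echo "$msg"
# it's quoted
```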


How to return an MD5 and SHA1 value for multiple files in a directory using BASH

I am creating a BASH script that takes a directory as an argument and returns, on stdout, a list of all files in that directory with both the MD5 and SHA1 value of each file. The only files I'm interested in are those between 100 and 500K. So far I've gotten this far (section of script):
cd $1 &&
find . -type f -size +100k -size -500k -printf '%f \t %s \t' -exec md5sum {} \; |
awk '{printf "NAME:" " " $1 "\t" "MD5:" " " $3 "\t" "BYTES:" "\t" $2 "\n"}'
I'm getting a little confused when adding the SHA1 and am obviously leaving something out.
Can anybody suggest a way to achieve this?
Ideally I'd like the script to format in the following way
Name Md5 SHA1
(With the relevant fields underneath)
Your awk printf bit is overly complicated. Try this:
find . -type f -printf "%f\t%s\t" -exec md5sum {} \; | awk '{ printf "NAME: %s MD5: %s BYTES: %s\n", $1, $3, $2 }'
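For instance, with a hypothetical tab-separated line shaped like the `find -printf` output above (field 1 the name, field 2 the byte count, field 3 the hash):

```shell
# Hypothetical input: name<TAB>size<TAB>hash
printf 'file.txt\t2048\td41d8c\n' |
  awk '{ printf "NAME: %s MD5: %s BYTES: %s\n", $1, $3, $2 }'
# NAME: file.txt MD5: d41d8c BYTES: 2048
```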
Just read, line by line, the list of files output by find:
find . -type f |
while IFS= read -r l; do
echo "$(basename "$l") $(md5sum <"$l" | cut -d" " -f1) $(sha1sum <"$l" | cut -d" " -f1)"
done
It's better to use a zero-separated stream:
find . -type f -print0 |
while IFS= read -r -d '' l; do
echo "$(basename "$l") $(md5sum <"$l" | cut -d" " -f1) $(sha1sum <"$l" | cut -d" " -f1)"
done
You could speed things up with xargs, running multiple processes via its -P option:
find . -type f -print0 |
xargs -0 -n1 sh -c 'echo "$(basename "$1") $(md5sum <"$1" | cut -d" " -f1) $(sha1sum <"$1" | cut -d" " -f1)"' --
Consider adding -maxdepth 1 to find if you are not interested in files in subdirectories recursively.
It's easy to go from xargs to -exec:
find . -type f -exec sh -c 'echo "$1 $(md5sum <"$1" | cut -d" " -f1) $(sha1sum <"$1" | cut -d" " -f1)"' -- {} \;
Tested on repl.
Add those -size +100k -size -500k args to find to limit the sizes.
The | cut -d" " -f1 is used to remove the - that is output by both md5sum and sha1sum. If there are no spaces in the filenames, you could run a single cut process for the whole stream, which should be slightly faster:
find . -type f -print0 |
xargs -0 -n1 sh -c 'echo "$(basename "$1") $(md5sum <"$1") $(sha1sum <"$1")"' -- |
cut -d" " -f1,2,5
I also think that running a single md5sum and a single sha1sum process might be faster than spawning separate processes for each file, but such a method needs to store all the filenames somewhere. Below, a bash array is used:
IFS=$'\n' files=($(find . -type f))
paste -d' ' <(
printf "%s\n" "${files[#]}") <(
md5sum "${files[#]}" | cut -d' ' -f1) <(
sha1sum "${files[#]}" | cut -d' ' -f1)
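As a sketch of the same paste idea on a throwaway directory (mapfile, a bash builtin, fills the array more safely than IFS word splitting, though it still breaks on newlines in names):

```shell
dir=$(mktemp -d)
echo hello > "$dir/a.txt"
mapfile -t files < <(find "$dir" -type f)   # one filename per array element
paste -d' ' <(printf '%s\n' "${files[@]}") \
            <(md5sum  "${files[@]}" | cut -d' ' -f1) \
            <(sha1sum "${files[@]}" | cut -d' ' -f1)
rm -rf "$dir"
```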
Your find is fine; you want to join the results of two of those runs, one per hash. The command for that is join, which expects sorted inputs.
doit() { find -type f -size +100k -size -500k -exec $1 {} + |sort -k2; }
join -j2 <(doit md5sum) <(doit sha1sum)
and that gets you the raw data in sane environments. If you want pretty data, you can use the column utility:
join -j2 <(doit md5sum) <(doit sha1sum) | column -t
and add nice headers:
(echo Name Md5 SHA1; join -j2 <(doit md5sum) <(doit sha1sum)) | column -t
and if you're in an unclean environment where people put spaces in file names, protect against that by subbing in tabs for the field markers:
doit() { find -type f -size +100k -size -500k -exec $1 {} + \
| sed 's, ,\t,'| sort -k2 -t$'\t' ; }
join -j2 -t$'\t' <(doit md5sum) <(doit sha1sum) | column -ts$'\t'
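A throwaway demonstration of the join pattern (list files written to disk instead of process substitution; the .txt filter keeps the list files out of find's own results):

```shell
dir=$(mktemp -d); cd "$dir"
echo hello > a.txt
doit() { find . -name '*.txt' -type f -exec "$1" {} + | sort -k2; }
doit md5sum  > md5.list
doit sha1sum > sha1.list
join -j2 md5.list sha1.list | column -t   # name, md5, sha1 per line
cd /; rm -rf "$dir"
```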

I want my script to echo "$1" into a file literally

This is part of my script
#!/bin/bash
echo "ls /SomeFolder | grep $1 | xargs cat | grep something | grep .txt | awk '{print $2}' | sed 's/;$//';" >> script2.sh
This echoes everything nicely into my script except $1 and $2. Instead it outputs the values of those variables, but I want it to literally read "$1" and "$2". Help?
Escape it:
echo "ls /SomeFolder | grep \$1 | xargs cat | grep something | grep .txt | awk '{print \$2}' | sed 's/;\$//';" >> script2.sh
Quote it:
echo "ls /SomeFolder | grep "'$'"1 | xargs cat | grep something | grep .txt | awk '{print "'$'"2}' | sed 's/;"'$'"//';" >> script2.sh
or like this:
echo 'ls /SomeFolder | grep $1 | xargs cat | grep something | grep .txt | awk '\''{print $2}'\'' | sed '\''s/;$//'\'';' >> script2.sh
Use a quoted here document:
cat << 'EOF' >> script2.sh
ls /SomeFolder | grep $1 | xargs cat | grep something | grep .txt | awk '{print $2}' | sed 's/;$//';
EOF
Basically you want to prevent expansion, i.e. take the string literally. You may want to read the BashFAQ entry on quotes.
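The quoting of the delimiter is what switches expansion off; a quick contrast:

```shell
x=EXPANDED
cat << EOF      # unquoted delimiter: $x is expanded
$x
EOF
cat << 'EOF'    # quoted delimiter: body is taken literally
$x
EOF
```

The first heredoc prints EXPANDED, the second prints $x.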
First, you'd never write this (see https://mywiki.wooledge.org/ParsingLs and http://porkmail.org/era/unix/award.html; you don't need greps+seds+pipes when you're using awk):
ls /SomeFolder | grep $1 | xargs cat | grep something | grep .txt | awk '{print $2}' | sed 's/;$//'
you'd write this instead:
find /SomeFolder -mindepth 1 -maxdepth 1 -type f -name "*$1*" -exec \
awk '/something/ && /.txt/{sub(/;$/,"",$2); print $2}' {} +
or if you prefer using print | xargs instead of -exec:
find /SomeFolder -mindepth 1 -maxdepth 1 -type f -name "*$1*" -print0 |
xargs -0 awk '/something/ && /.txt/{sub(/;$/,"",$2); print $2}'
and now to append that script to a file would be:
cat <<'EOF' >> script2.sh
find /SomeFolder -mindepth 1 -maxdepth 1 -type f -name "*$1*" -print0 |
xargs -0 awk '/something/ && /.txt/{sub(/;$/,"",$2); print $2}'
EOF
Btw, if you want the . in .txt to be treated literally instead of as a regexp metachar meaning "any character", then you should use \.txt instead of .txt.
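A quick illustration of the difference: in a regexp, `.` matches any character, so a name like `report_atxt` slips through the unescaped pattern:

```shell
printf 'report_atxt\nnotes.txt\n' | awk '/.txt/'    # prints both lines
printf 'report_atxt\nnotes.txt\n' | awk '/\.txt/'   # prints only notes.txt
```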

Perform a CAT in FOR and SSH

I do not have much experience with shell scripts, so I need your help. I have the following query: I need to cat the files that I list, but I have not managed to figure out where to place the command. Thank you:
read date
echo -e "RECORDINGS"
for e in $Rec
do
sshpass -p password ssh user@server find $e "-type f -mtime -10 -exec ls -gGh --full-time {} \;" | cut -d ' ' -f 4,7 | grep $date | awk -F " " '{print $2}'
done
Ignoring that much of the code here is outright dangerous --
find_on_server() {
local e_q
printf -v e_q '%q ' "$1"
sshpass -p password ssh user@server "bash -s $e_q" <<'EOF'
e=$1
find "$e" -type f -mtime -10 -exec ls -gGh --full-time {} \;
EOF
# ^^^ the above line MUST NOT BE INDENTED
}
find_on_server "$e" | cut -d ' ' -f 4,7 | grep $date | awk -F " " '{print $2}'
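The printf '%q' step is what makes passing $e_q through the remote shell safe: it quotes the value so that one extra round of re-parsing reconstructs it exactly. A local sketch of the round trip (bash's printf builtin):

```shell
f='file with spaces; $(dangerous)'
printf -v q '%q' "$f"     # e.g. file\ with\ spaces\;\ \$\(dangerous\)
eval "set -- $q"          # one round of re-parsing, as the remote shell would do
[ "$1" = "$f" ] && echo round-trip-ok
```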

To get \n instead of n in echo -e command in shell script

I am trying to get the output for the echo -e command as shown below
Command used:
echo -e "cd \${2}\nfilesModifiedBetweenDates=\$(find . -type f -exec ls -l --time-style=full-iso {} \; | awk '{print \$6,\$NF}' | awk '{gsub(/-/,\"\",\$1);print}' | awk '\$1>= '$fromDate' && \$1<= '$toDate' {print \$2}' | tr \""\n"\" \""\;"\")\nIFS="\;" read -ra fileModifiedArray <<< "\$filesModifiedBetweenDates"\nfor fileModified in \${fileModifiedArray[#]}\ndo\n egrep -w "\$1" "\$fileModified" \ndone"
cd ${2}
Expected output:
cd ${2}
filesModifiedBetweenDates=$(find . -type f -exec ls -l --time-style=full-iso {} \; | awk '{print $6,$NF}' | awk '{gsub(/-/,"",$1);print}' | awk '$1>= '20140806' && $1<= '20140915' {print $2}' | tr "\n" ";")
IFS=; read -ra fileModifiedArray <<< $filesModifiedBetweenDates
for fileModified in ${fileModifiedArray[@]}
do
egrep -w $1 $fileModified
done
Original output:
cd ${2}
filesModifiedBetweenDates=$(find . -type f -exec ls -l --time-style=full-iso {} \; | awk '{print $6,$NF}' | awk '{gsub(/-/,"",$1);print}' | awk '$1>= '20140806' && $1<= '20140915' {print $2}' | tr "n" ";")
IFS=; read -ra fileModifiedArray <<< $filesModifiedBetweenDates
for fileModified in ${fileModifiedArray[@]}
do
egrep -w $1 $fileModified
done
How can I handle the "\" in this?
For long blocks of text, it's much simpler to use a quoted here document than to try to embed a multi-line string in a single argument to echo or printf.
cat <<"EOF"
cd ${2}
filesModifiedBetweenDates=$(find . -type f -exec ls -l --time-style=full-iso {} \; | awk '{print $6,$NF}' | awk '{gsub(/-/,"",$1);print}' | awk '$1>= '20140806' && $1<= '20140915' {print $2}' | tr "\n" ";")
IFS=; read -ra fileModifiedArray <<< $filesModifiedBetweenDates
for fileModified in ${fileModifiedArray[@]}
do
egrep -w $1 $fileModified
done
EOF
You'd better use printf to have better control:
$ printf "tr %s %s\n" '"\n"' '";"'
tr "\n" ";"
As you can see, we put the format string in double quotes (printf "text %s %s") and then supply the values that replace the %s placeholders.
If you really have to use echo, then escape the \:
$ echo -e 'tr "\\n" ";"'
tr "\n" ";"
Interesting read: Why is printf better than echo?

Find TXT files and show Total Count of records of each file and Size of each file

I need to find the row count and size of each TXT file.
It needs to search all the directories and show the result as:
FileName|Cnt|Size
ABC.TXT|230|23MB
Here is some code:
v_DIR=$1
echo "the directory to cd is "$1
x=`ls -l $0 | awk '{print $9 "|" $5}'`
y=`awk 'END {print NR}' $0`
echo $x '|' $y
Try something like
find -type f -name '*.txt' -exec bash -c 'lines=$(wc -l "$0" | cut -d " " -f1); size=$(du -h "$0" | cut -f1); echo "$0|$lines|$size"' {} \;
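The inner bash -c body, run on a throwaway file to show the two pieces (wc -l for the row count, du -h for a human-readable size; the exact size shown depends on filesystem block size):

```shell
tmp=$(mktemp)
printf 'row1\nrow2\nrow3\n' > "$tmp"
lines=$(wc -l < "$tmp")            # 3
size=$(du -h "$tmp" | cut -f1)     # e.g. 4.0K
echo "$tmp|$lines|$size"
rm -f "$tmp"
```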
