Bash script builds correct $cmd but fails to execute complex stream - bash

This short script scrapes some log files daily to create a simple extract. It works from the command line, and when I echo $cmd and copy/paste the output, that works too. But it breaks when I try to execute it from the script itself.
I know this is a nightmare of patterns that I could probably improve, but am I missing something simple to just execute this correctly?
#!/bin/bash
priorday=$(date --date yesterday +"%Y-%m-%d")
outputfile="/home/CCHCS/da14/$priorday""_PROD_message_processing_times.txt"
cmd="grep 'Processed inbound' /home/rules/care/logs/RootLog* | cut -f5,6,12,16,18 -d\" \" | grep '^"$priorday"' | sed 's/\,/\./' | sed 's/ /\t/g' | sed -r 's/([0-9]+\-[0-9]+\-[0-9]+)\t/\1 /' | sed 's/ / /g' | sort >$outputfile"
printf "command to execute:\n"
echo $cmd
printf "\n"
$cmd
output:
./make_log_extract.sh command to execute: grep 'Processed inbound' /home/rules/care/logs/RootLog.log /home/rules/care/logs/RootLog.log.1
/home/rules/care/logs/RootLog.log.10
/home/rules/care/logs/RootLog.log.11
/home/rules/care/logs/RootLog.log.12
/home/rules/care/logs/RootLog.log.2
/home/rules/care/logs/RootLog.log.3
/home/rules/care/logs/RootLog.log.4
/home/rules/care/logs/RootLog.log.5
/home/rules/care/logs/RootLog.log.6
/home/rules/care/logs/RootLog.log.7
/home/rules/care/logs/RootLog.log.8
/home/rules/care/logs/RootLog.log.9 | cut -f5,6,12,16,18 -d" " | grep
'^2014-01-30' | sed 's/\,/./' | sed 's/ /\t/g' | sed -r
's/([0-9]+-[0-9]+-[0-9]+)\t/\1 /' | sed 's/ / /g' | sort
/home/CCHCS/da14/2014-01-30_PROD_message_processing_times.txt
grep: 5,6,12,16,18: No such file or directory

As grebneke comments, do not store the command and then execute it.
What you can do is execute it directly but print it first; see Bash: Print each command before executing?
priorday=$(date --date yesterday +"%Y-%m-%d")
outputfile="/home/CCHCS/da14/$priorday""_PROD_message_processing_times.txt"
set -o xtrace # <-- set printing mode "on"
grep 'Processed inbound' /home/rules/care/logs/RootLog* | cut -f5,6,12,16,18 -d' ' | grep "^$priorday" | sed 's/\,/\./' | sed 's/ /\t/g' | sed -r 's/([0-9]+\-[0-9]+\-[0-9]+)\t/\1 /' | sed 's/ / /g' | sort > "$outputfile"
set +o xtrace # <-- revert to normal
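If you really do want to reuse the command, a more robust pattern than storing it in a string is to wrap the pipeline in a function. This is only a sketch, using the same $priorday and $outputfile variables and the same pipeline (minus the no-op sed 's/ / /g'):
# Sketch: a function sidesteps the quoting problems of a command stored in a string.
extract_times() {
    grep 'Processed inbound' /home/rules/care/logs/RootLog* \
        | cut -f5,6,12,16,18 -d' ' \
        | grep "^$priorday" \
        | sed 's/,/./' \
        | sed 's/ /\t/g' \
        | sed -r 's/([0-9]+-[0-9]+-[0-9]+)\t/\1 /' \
        | sort > "$outputfile"
}
extract_times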

Related

Grep Spellchecker

I am trying to write a simple shell script that takes a text file as input and checks all non-punctuated words against a dictionary (english.txt). It should return all non-matching (misspelled) words. I am using grep but it does not seem to successfully match all the lines in english.txt. I have included my code below.
#!/bin/bash
cat $1 |
tr ' \t' '\n\n' |
sed -e "/'/d" |
tr -d '[:punct:]' |
tr -cd '[:alpha:]\n' |
sed -e "/^$/d" |
grep -v -i -w -f english.txt
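No answer is shown here, but one common culprit worth checking (an assumption, not something stated in the question) is that english.txt has Windows line endings, in which case grep -f looks for words ending in a carriage return and matches almost nothing. A sketch with the dictionary cleaned first:
#!/bin/bash
# Sketch: strip carriage returns from the dictionary before handing it to grep -f.
# english.unix.txt is a hypothetical cleaned copy of english.txt.
tr -d '\r' < english.txt > english.unix.txt
cat "$1" |
tr ' \t' '\n\n' |
sed -e "/'/d" |
tr -d '[:punct:]' |
tr -cd '[:alpha:]\n' |
sed -e "/^$/d" |
grep -v -i -w -f english.unix.txt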

How to use cat in a pipe

I have the following command:
cat httpd.conf | grep AuthUserFile | cut -d" " -f4 | sed -e 's|["'\'']||g'
the output of this is:
/etc/httpd/secure/htpasswd.training
I did:
cat httpd.conf | grep AuthUserFile | cut -d" " -f4 | sed -e 's|["'\'']||g'| cat
However this just returned:
/etc/httpd/secure/htpasswd.training
I want to cat the contents of the file. How do I do this?
Piping to xargs cat will take the filename from stdin and pass it as an argument to cat, which prints the file.
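For example, applied to the pipeline from the question (assuming it is fed by cat httpd.conf as above):
# xargs takes the filename from stdin and hands it to cat as an argument,
# so cat prints the file's contents rather than the filename.
cat httpd.conf | grep AuthUserFile | cut -d" " -f4 | sed -e 's|["'\'']||g' | xargs cat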
Alternatively try: cat $( some command printing a filename ).
You could try surrounding your initial command with backticks as the argument to cat, like so:
cat `cat httpd.conf | grep AuthUserFile | cut -d" " -f4 | sed -e 's|["'\'']||g'`

echo -e cat: argument line too long

I have a bash script that merges a huge list of text files and filters the result. However, I hit an 'argument line too long' error because of the huge file list.
echo -e "`cat $dir/*.txt`" | sed '/^$/d' | grep -v "\-\-\-" | sed '/</d' | tr -d \' | tr -d '\\\/<>(){}!?~;.:+`*-_ͱ' | tr -s ' ' | sed 's/^[ \t]*//' | sort -us -o $output
I have seen some similar answers here, and I know I could fix it by using find and cat-ing the files first. However, I would like to know the best way to keep this as a one-liner using echo -e and cat, without breaking the code and without hitting the 'argument line too long' error. Thanks.
First, with respect to the most immediate problem: Using find ... -exec cat -- {} + or find ... -print0 | xargs -0 cat -- will prevent more arguments from being put on the command line to cat than it can handle.
The more portable (POSIX-specified) alternative to echo -e is printf '%b\n'; this is available even in configurations of bash where echo -e prints -e on output (as when the xpg_echo and posix flags are set).
However, if you use read without the -r argument, the backslashes in your input string are removed, so neither echo -e nor printf %b will be able to process them later.
Fixing this can look like:
while IFS= read -r line; do
    printf '%b\n' "$line"
done \
    < <(find "$dir" -name '*.txt' -exec cat -- '{}' +) \
    | sed [...]
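The find ... -print0 | xargs -0 cat -- form mentioned above slots into the same loop (the trailing sed [...] is the same placeholder):
# Same loop, fed by the -print0 / xargs -0 variant instead of -exec.
find "$dir" -name '*.txt' -print0 | xargs -0 cat -- | while IFS= read -r line; do
    printf '%b\n' "$line"
done | sed [...]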
Note that the echo and cat are not really needed here; grep can read the files itself (though a very long glob can still hit the same limit):
grep -v '^$' "$dir"/*.txt | grep -v "\-\-\-" | sed '/</d' | tr -d \' \
    | tr -d '\\\/<>(){}!?~;.:+`*-_ͱ' | tr -s ' ' | sed 's/^[ \t]*//' \
    | sort -us -o "$output"
If you think about it some more you can probably get rid of a lot more stuff and turn it into a single sed and sort, roughly:
sed -e '/^$/d' -e '/---/d' -e '/</d' -e 's,['\''\\/<>(){}!?~;.:+`*_ͱ-],,g' \
    -e 's/  */ /g' -e 's/^[ \t]*//' "$dir"/*.txt | sort -us -o "$output"

" echo" in bash script empty file

I have a problem with the echo command: I need to export data to CSV, but the file comes out empty.
#!/bin/bash
while read domain
do
ownname= whois $domain | grep -A 1 -i "Administrative Contact:" |cut -d ":" -f 2 | sed 's/ //' | sed -e :a -e '$!N;s/ \n/,/;ta'
echo -e "$ownname" >> test.csv
done < dom.txt
You need to use command substitution to store a command's output in a shell variable:
#!/bin/bash
while read domain; do
    ownname=$(whois $domain | grep -A 1 -i "Administrative Contact:" | cut -d ":" -f 2 | sed 's/ //' | sed -e :a -e '$!N;s/ \n/,/;ta')
    echo -e "$ownname" >> test.csv
done < dom.txt
PS: I haven't tested all the piped commands.
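A slightly more defensive variant of the same loop (only a sketch; the whois pipeline itself is still untested, as noted above) uses read -r, IFS=, and quoted expansions:
#!/bin/bash
# Sketch: read -r keeps backslashes intact, IFS= preserves leading/trailing
# whitespace, and quoting "$domain" protects unusual input lines.
while IFS= read -r domain; do
    ownname=$(whois "$domain" | grep -A 1 -i "Administrative Contact:" | cut -d ":" -f 2 | sed 's/ //' | sed -e :a -e '$!N;s/ \n/,/;ta')
    printf '%s\n' "$ownname" >> test.csv
done < dom.txt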

Use each line of piped output as parameter for script

I have an application (myapp) that gives me multiline output
result:
abc|myparam1|def
ghi|myparam2|jkl
mno|myparam3|pqr
stu|myparam4|vwx
With grep and sed I can get my parameters as below:
myapp | grep '|' | sed -e 's/^[^|]*//' | sed -e 's/|.*//'
But I then want these myparamx values to be passed as parameters to a script that is executed once for each of them.
myscript.sh myparam1
myscript.sh myparam2
etc.
Any help greatly appreciated
Please see xargs. For example:
myapp | grep '|' | sed -e 's/^[^|]*//' | sed -e 's/|.*//' | xargs -n 1 myscript.sh
Maybe this can help:
myapp | awk -F"|" '{ print $2 }' | while read -r line; do /path/to/myscript.sh "$line"; done
I like the xargs -n 1 solution from Dark Falcon, and while read is the classic tool for this kind of thing, but just for completeness:
myapp | awk -F'|' '{print "myscript.sh", $2}' | bash
As a side note, for extracting the second |-delimited field you could also use cut:
myapp | cut -d'|' -f2 # -f2 => second field (cut fields are numbered from 1)
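Combining that with the xargs approach from the first answer gives a compact equivalent:
# Extract the second |-separated field and run myscript.sh once per value.
myapp | grep '|' | cut -d'|' -f2 | xargs -n 1 myscript.sh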
