Using xargs with qmHandle to remove bounce messages [closed]

Closed. This question is off-topic. It is not currently accepting answers.
Closed 10 years ago.
I have almost 100,000 spam messages in my bounce folder for qmail.
I've been trying to use this command:
find * | xargs -tl `qmHandle -d$1`
But with no success so far. I've tried multiple variations, and I don't have GNU parallel on my machine.
I did try:
find * | xargs qmHandle -d
But xargs inserts a space between the option and its argument, producing:
qmHandle -d 133893

Found it!
find * | xargs -tl -I {} qmHandle -d{}
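An alternative sketch, if qmHandle accepts multiple -d options in one invocation (an assumption here): prefix each queue ID with -d using sed, so no -I substitution is needed. Since qmHandle isn't available outside a qmail box, echo stands in as a dry run:

```shell
# Build "-dID" tokens so xargs can batch them onto one command line.
# `echo` shows the command that would run; drop it to execute for real.
printf '%s\n' 133893 133894 | sed 's/^/-d/' | xargs echo qmHandle
```

If qmHandle only takes one -d per call, add -n1 to xargs so it runs once per ID.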

Related

Cut all till the end [closed]

I have output in the pattern below:
["snaptuda-shv-22-lla1.example.com","snaptuza-shv-22-lla1.example.com","snaptuservice-proxy-shv-22-lla1.example.com"]
I used the command below to strip out the domains within the double quotes:
cut -d"\"" -f2 file.txt
I got only the first domain, which was
snaptuda-shv-22-lla1.example.com
What I need is all the domains to the end of the file. How can I achieve this?
Your input is JSON. For parsing JSON there is jq:
jq -r '.[]' filename
Or, if the input arrives via a pipe:
echo '["snaptuda-shv-22-lla1.example.com",...]' | jq -r '.[]'
snaptuda-shv-22-lla1.example.com
snaptuza-shv-22-lla1.example.com
snaptuservice-proxy-shv-22-lla1.example.com
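If jq isn't installed, a rough sketch with tr and cut also works for this particular single-line input (assuming no commas or quotes embedded inside the names, and reading the question's file.txt):

```shell
# Split the array on commas so each element lands on its own line,
# then take the text between the first pair of double quotes.
tr ',' '\n' < file.txt | cut -d'"' -f2
```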

bash shell execution [closed]

I am using
sed -s n v
Nothing works for me
The -i flag behaves differently across sed implementations: GNU sed accepts a bare -i, while BSD/macOS sed requires an argument (e.g. -i '').
tr ']' '[' < file > temp
mv temp file
The above should work for you.
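A portable way to get the effect of an in-place edit without relying on any particular sed's -i is the same write-to-a-temp-file-and-move pattern, using the ]-to-[ substitution above purely for illustration:

```shell
# Works with any POSIX sed; substitute your own pattern and file name.
sed 's/]/[/g' file > file.tmp && mv file.tmp file
```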

How to get the count of identical rows from two tab-delimited files with awk? [closed]

I have two tab-delimited files
File1.tab
100 ABC
300 CDE
File2.tab
399 GSA
300 CDE
I want an awk command that returns 1, because the row '300 CDE' is common to both files.
I almost hate to encourage laziness by answering a question with so little effort put into it, but did you try grep?
$: grep -c -f File1.tab File2.tab
1
If lines are unique per file you can use grep
grep -f File1.tab File2.tab | wc -l
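Since the question asks for awk specifically, one common sketch is to load the first file's lines into an array, then count the second file's lines that appear in it:

```shell
# NR==FNR is true only while reading the first file: store each whole
# line as an array key. For the second file, count lines seen before.
awk 'NR==FNR { seen[$0]; next } $0 in seen { n++ } END { print n + 0 }' File1.tab File2.tab
```

Unlike grep -f, this compares whole lines literally rather than treating File1's lines as regex patterns.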

Remove space in a file efficiently [closed]

I have a very big file and I want to remove the space character in the file.
sed can be used, but it is very slow.
Is there any command that uses a fixed string instead of a regex to replace or remove spaces?
You can use the tr command (see its man page for more information). To delete plain spaces:
tr -d ' ' < filename
It also supports character classes. E.g., to delete all whitespace:
tr -d '[:space:]' < filename
To delete all horizontal whitespace (spaces and tabs):
tr -d '[:blank:]' < filename

Handling a special case during tail -f logging [closed]

I am tailing the logs to find whether there is any Exception, as shown below:
tail -f flexi.log | grep "Exception" --color
This works fine, but I don't want to log a line if it contains DataNotAvailableException.
That exception occurs frequently and I don't want to see it.
Is this possible?
Just add another grep to filter out the DataNotAvailableException lines:
tail -f flexi.log | grep "Exception" --color | grep -v "DataNotAvailableException"
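One caveat when chaining greps after tail -f: the first grep buffers its output when writing into a pipe, so matches may appear only in bursts. GNU grep's --line-buffered flag flushes each line immediately. The filtering logic is shown below on static input, since tail -f never terminates:

```shell
# Static stand-in for the live tail -f stream; the two-grep filter
# keeps Exception lines but drops the DataNotAvailableException ones.
printf 'FooException\nDataNotAvailableException\nBarException\n' \
  | grep --line-buffered "Exception" \
  | grep -v "DataNotAvailableException"
```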
