Redirection operator working in shell prompt but not in script - shell

I have a file called out1.csv which contains tabular data.
When I run the command in the terminal it works:
cat out1.csv | grep -v ^$ | grep -v ^- > out2.csv
It reads the file, filters out blank lines and lines starting with -, and redirects the output to out2.csv.
But when I put the same command in a script it does not work.
I have even tried echoing:
echo " `cat out1.csv | grep -v ^$ | grep -v ^- > out2.csv` " > out2.csv
I have also tried to specify full paths of the files. But no luck.
In the script, the command runs, but according to debug mode the output is not redirected to the file.
What am I missing?

The issue turned out to be not with this script but with the SQL script it was calling before this command. Both commands are actually correct.

You're redirecting twice.
The command in backticks writes to the file and prints nothing.
You then take that nothing and write it to the file, overwriting what the backticked command had just put there.
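A runnable sketch of that ordering, with hypothetical demo file names:
printf 'a\n-b\n\nc\n' > demo_in.csv # sample input with a blank line and a dash line to filter out
echo " `cat demo_in.csv | grep -v ^$ | grep -v ^- > demo_out.csv` " > demo_out.csv
cat demo_out.csv # only the spaces from echo remain; the filtered data was truncated away by the outer >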

One way to do it in the script is the same as you do in the console (the better way):
cat out1.csv | grep -v ^$ | grep -v ^- > out2.csv # no need to echo it or put the command in backticks
The other way, closer to what you are trying, is:
echo " `cat out1.csv | grep -v ^$ | grep -v ^- > out2.csv` " # don't redirect the output again to out2.csv

Related

Filtering command output and printing to a file?

I am currently running this bash line, command -option | grep -A 1 --color 'string1\|string2', to filter the output of a process. Instead of printing the filtered output on the console, how can I print the output to a file?
I tried: command -option | grep -A 1 'string1\|string2' >> test.txt but it didn't print anything to the file.
I also tried adding the extended regular expression option: command -option | grep -E -A 1 'string1|string2' >> test.txt but I still got an empty file.
Apparently the issue was buffering: grep block-buffers its output when it is not writing to a terminal. Buffering line by line solves the problem.
command -option | grep --line-buffered -A 1 'string1\|string2' >> test.txt
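If a filter in the middle of a pipeline has no --line-buffered option of its own, the same fix can be applied from outside with stdbuf; a sketch, assuming GNU coreutils:
command -option | stdbuf -oL grep -A 1 'string1\|string2' >> test.txt # stdbuf -oL forces line-buffered stdout on the command it launches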

Multiple outputs in a single-line shell command with pipes only

For example:
ls -l -d */ | wc -l | awk '{print $1}' | tee /dev/tty | ls -l
This shell command prints the result of wc and of ls -l on a single line, but it uses tee.
Is it possible to use one shell command line to achieve multiple outputs without using “&&”, “||”, “>”, “>>”, “<”, “;”, “&”, tee, or a temp file?
When you want the output of date and ls -rtl | head -1 on one line, you can use
echo "$(date): $(ls -rtl | head -1)"
Yes, you can write to multiple files with awk, which is not on the list of things you want to avoid:
echo hi | awk '{print > "a.txt"; print > "b.txt"}'
Then check a.txt and b.txt.
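The same trick can also duplicate a line to the terminal and a file at once, since the > inside an awk program is awk's own operator rather than a shell redirection; a minimal sketch:
echo hi | awk '{print; print > "a.txt"}' # the bare print goes to stdout, the second copy goes to the file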

Redirecting piped command into a file in bash

I'm trying to do the following:
ping some.server.com | grep -Po '(?<=\=)[0-9].\.[0-9]' >> file.dat
i.e. I run a command (ping), grep part of the output and redirect the result of grep into a file to be inspected later. While the command itself works (i.e. the part before '>>'), nothing gets written into the file.
How do I do this correctly?
Use the --line-buffered argument. Without it, grep block-buffers its output when writing to a file rather than a terminal, and because ping never exits, the buffer is never flushed to file.dat.
ping some.server.com | grep --line-buffered -Po '(?<=\=)[0-9].\.[0-9]' >> file.dat
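As a simple check, while the ping is still running you can watch the results arrive from a second terminal:
tail -f file.dat # a new round-trip value should appear for each ping reply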

nslookup: capture stderr in a variable and display it

In a shell script I am running nslookup on a number of URLs.
Sometimes a URL returns a "can't resolve" error. I need to capture those errors in a variable.
Here is the nslookup code, which captures the returned IP addresses:
output=$(nslookup "$URL" | grep Add | grep -v '#' | cut -f 3 -d ' ' | awk 'NR>1' )
Now, in the same output variable, I want to capture the error
nslookup: can't resolve
I am capturing stdout in a file.
I have tried different versions of redirection, 2>&1 and others, but the error does not get assigned to the variable. I do not want the error redirected to a separate file; I want it recorded in the output variable above.
As long as you are using awk, you can simplify things considerably:
nslookup "$URL" 2>&1 |
awk -e '/Add/ && !/#/ && NR > 1 {print $2}'
-e '/resolve|NXDOMAIN/ { print "error" }'
where one line has been broken into three for clarity. I cannot reproduce the problem you say you have with 2>&1, nor do I believe it should fail.
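Since the goal was to record the result in a variable, a sketch of the same awk program inside a command substitution (this assumes GNU awk, which provides the -e option used above):
output=$(nslookup "$URL" 2>&1 |
  awk -e '/Add/ && !/#/ && NR > 1 {print $2}' \
      -e '/resolve|NXDOMAIN/ { print "error" }')
echo "$output" # IP addresses, or "error" when resolution failed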
The redirection of stderr works when you use
output=$(nslookup "$URL" 2>&1 | grep Add | grep -v '#' | cut -f 3 -d ' ' | awk 'NR>1')
but it is futile, since you filter it out immediately with grep Add. You need to rethink your logic and what you really want. Maybe a better approach is
output=$(nslookup "$URL" 2>&1)
case $output in
(nslookup:*) ;;
(*) output=$(echo "$output" | grep Add | ...);;
esac
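Putting the pieces together with the full filter pipeline shown earlier ($URL stands for one of your URLs):
output=$(nslookup "$URL" 2>&1)
case $output in
(nslookup:*) ;; # resolution failed: $output keeps the "can't resolve" text
(*) output=$(echo "$output" | grep Add | grep -v '#' | cut -f 3 -d ' ' | awk 'NR>1');;
esac
echo "$output"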

redirect to /dev/null: No such file or directory

I would like to know if somebody could help me with this error:
wc: Files/Unused-CMA_host.txt: No such file or directory
The file doesn't exist, so I want to redirect the output to /dev/null.
I tried this, > /dev/null 2>&1, which works in other cases, but not here:
wc -l Files/Unused-CMA_host.txt | awk '{print $1}' > /dev/null 2>&1
Does somebody know why?
thanks.
Redirections apply to individual components of a pipeline, not to the pipeline as a whole. In your example, you only redirect awk's standard output. To redirect the standard error and standard output of the entire pipeline would require a command group such as
{ wc -l Files/Unused-CMA_host.txt | awk '{print $1}' ; } > /dev/null 2>&1
However, if the file doesn't exist, there won't be any standard output. Perhaps you want to redirect standard error to suppress the error message? Then you can simply use
wc -l Files/Unused-CMA_host.txt 2> /dev/null | awk '{print $1}'
In the case of a non-existent file, awk will simply read from an empty stream and do nothing.
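Equivalently, a subshell groups the pipeline for redirection; the difference is that ( ... ) forks a separate shell process, while { ...; } runs in the current one:
( wc -l Files/Unused-CMA_host.txt | awk '{print $1}' ) > /dev/null 2>&1 # same effect as the brace group above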
