Bash output from expect script to two different files - bash

I am trying to output to two different files using tee. My first file will basically be the output of tail -f /myfile, and my second file will be a subset of the first. I have seen suggestions online to use
tee >(proc1) >(proc2)
I have tried the above, but both my files are blank.
Here is what I have so far:
myscript.sh
ssh root@server 'tail -f /my/dir/text.log' | tee >(/mydir/my.log) >(grep 'string' /mydir/my.log > /mydir/mysecond.log)
myexpect.sh
#!/usr/bin/expect -f
set pass password
spawn /my/dir/myscript.sh
expect {
"key fingerprint" {send "yes/r"; exp_contiue}
"assword: " {send "$pass\r"}
}
interact

In your script, there are a couple of problems with the usage of tee:
tee >(/mydir/my.log) can be replaced with tee /mydir/my.log, since tee writes to stdout as well as to the files it is given, i.e. /mydir/my.log.
grep 'string' /mydir/my.log > /mydir/mysecond.log: as mentioned, tee also writes to stdout, so there is no need to grep the string from the file; you can grep from stdout directly by adding another stage to the pipeline.
So the whole command should be modified as follows:
ssh root@server 'tail -f /my/dir/text.log | tee /mydir/my.log | grep --line-buffered "string" > /mydir/mysecond.log'
Edit:
For your follow-up question:
The command hangs because tail -f keeps waiting for the growing file to produce more output. If you don't want the command to hang, remove the -f option from tail.
Depending on whether -f is used, there are two ways to get grep to write its output file (both sketched below):
For the plain tail case: grep writes the file normally once tail exits.
For the tail -f case: --line-buffered makes grep line-buffer its output, so matches show up in the file as they arrive instead of waiting for a full block.
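As a rough sketch of both variants (assuming the same paths as above; the -n 100 is just an example line count, adjust to your setup):
# one-shot: tail exits, so grep flushes its buffer and the file is written normally
ssh root@server 'tail -n 100 /my/dir/text.log | tee /mydir/my.log | grep "string" > /mydir/mysecond.log'
# follow mode: tail never exits, so force grep to flush each matching line as it arrives
ssh root@server 'tail -f /my/dir/text.log | tee /mydir/my.log | grep --line-buffered "string" > /mydir/mysecond.log'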

Related

Exclude specific lines from sftp connection from logfile

I have multiple scripts that connect to an sftp server and put/get files. Recently the sftp admins added a large header to all logins, which has blown up my log files; I'd like to exclude it, since it inflates the file sizes and makes the logs somewhat unreadable.
As is, the commands are of the following format:
sftp user@host <<EOF &>> ${LOGFILE}
get ...
put ...
exit
EOF
Now, I've tried grepping out all the new lines, which all start with a pipe (they basically made a box to put them in).
sftp user@host <<EOF | grep -v '^|' &>> ${LOGFILE}
get ...
put ...
exit
EOF
This excludes the lines from ${LOGFILE} but sends them to stdout instead, which means they end up in another log file, where we also don't want them (these scripts are called by a scheduler). Oddly, it also seems to filter out the first line of the connection attempt output,
Connecting to <host>...
and redirects that to stdout as well. Not the end of the world, but I do find it odd.
How can I filter the lines beginning with | so they don't show anywhere?
In
sftp user@host <<EOF &>> ${LOGFILE}
You are redirecting stdout and stderr to the logfile for appending data (&>>). But when you use
sftp user@host <<EOF | grep -v '^|' &>> ${LOGFILE}
you are only redirecting stdout to grep, leaving the stderr output of sftp to pass through untouched. Finally, you are redirecting the stdout and stderr of grep to the logfile.
In fact, you are interested in redirecting both stdout and stderr of sftp, so you can use something like:
sftp user@host <<EOF |& grep -v '^|' >> ${LOGFILE}
or, in older versions of bash, using the specific redirection instead of the shorthand:
sftp user@host <<EOF 2>&1 | grep -v '^|' >> ${LOGFILE}
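As a toy illustration of the difference (the echo lines and out.log are just placeholders, not your sftp job):
# stderr bypasses the pipe, so "oops" escapes grep and ends up wherever the parent's stderr goes
{ echo "| banner"; echo "oops" >&2; } | grep -v '^|' >> out.log
# stderr is merged into stdout before the pipe, so grep sees and filters the combined stream
{ echo "| banner"; echo "oops" >&2; } 2>&1 | grep -v '^|' >> out.log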

Redirecting "tail -f" output to COPY command

I'm trying to redirect the output of tail -f -n 1 to the Postgres COPY command; the requirement is to execute the COPY command for every line of output from the tail command.
I came up with the following:
tail -f -n 1 <source_file> | xargs -n 1 psql -c 'copy <table_name> from stdin'
but this is not working, as the output of the tail command is passed as an argument to the psql command rather than fed to its stdin.
Also the more generic:
tail -f -n 1 <source_file> | psql -tc "copy <table_name> from stdin"
is not working either, as the COPY command performs a commit at the end of the stream and not for every single row.
I realized that the issue is that COPY is not intended to accept an open-ended stream: the data only becomes available when the COPY command finishes (end of data from the program), and in this specific case the program never terminates because it is intended to work as a continuous consumer.
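If one insert per incoming line is really required, a crude workaround (sketched here with the same <source_file> and <table_name> placeholders) is to run a separate COPY for each line, at the cost of one psql invocation and one commit per row:
tail -f -n 1 <source_file> | while IFS= read -r line; do
    # feed exactly one line to a fresh COPY, which ends and commits immediately
    printf '%s\n' "$line" | psql -c "copy <table_name> from stdin"
done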

tee command piped in a grep and redirected to file

I would like to use the following command in bash:
(while true; do date; sleep 1;done) | tee out.out 2>&1 | grep ^[A-Z] >log.log 2>&1 &
unfortunately, until it is finished (for example by killing the parent process of the sleep command), the file log.log is empty, but the file out.out has the expected content.
I first want to understand what's happening,
and then I would like to fix it.
In order to fix this, you need to make grep line-buffered. This might depend on the implementation, but on BSD grep (shipped with Mac OS X), you simply need to add the --line-buffered option to grep:
(while true; do date; sleep 1;done) | tee out.out 2>&1 | grep --line-buffered ^[A-Z] >log.log 2>&1 &
From the grep man page:
--line-buffered
Force output to be line buffered. By default, output is line buffered when standard output is a terminal and block buffered otherwise.
You can actually validate that behavior by outputting to STDOUT instead:
(while true; do date; sleep 1;done) | tee out.out 2>&1 | grep ^[A-Z] 2>&1 &
In that case, you don't need to buffer by line explicitly, because that's the default. However, when you redirect to a file, you must explicitly set that behaviour.
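A quick way to confirm the difference (file names are just examples):
# terminal 1: start the pipeline with line-buffered grep
(while true; do date; sleep 1; done) | tee out.out | grep --line-buffered '^[A-Z]' > log.log &
# terminal 2: lines should now appear in log.log roughly once per second
tail -f log.log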

redirect stdin and stdout using tee and keep previous std

How can I both write to a file and display to screen using pipe with tee?
This command actually does it; the problem is that it truncates and rewrites the file each time, so tail -f gives me a "file truncated" error.
ls -al | tee file.txt
The -a option of tee is what you are looking for:
-a, --append
append to the given FILEs, do not overwrite
so your line would be:
ls -al | tee -a file.txt
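Putting that together with the tail -f use case (a minimal sketch; file.txt is just an example name):
# terminal 1: follow the file; it is no longer truncated out from under tail
tail -f file.txt
# terminal 2: each run appends and still prints to the screen
ls -al | tee -a file.txt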

Print STDOUT in the middle of 2 Pipes in Solaris(bash)

I have the same issue. This command:
./somescript.sh > ../log/scriptlog.log
requires the output of a command to go to stdout, but inside the script the output is piped to mailx:
command | mailx -s "Subject" recipient@somedomain.tld
What I would like to do is something like:
command | tee > /dev/stdout | mailx -s "Subject" recipient@somedomain.tld
where the output of the command goes to stdout (to be redirected into the ../log/scriptlog.log file)
and also into stdin for the mailx command.
Any way to do that?
tee already sends its input to stdout, so you can simply add it to the pipeline:
... | tee -a log/scriptlog.log | ...
Alternatively, duplicate stdout onto another file descriptor and have tee write to that:
exec 3>&1
command | tee /dev/fd/3 | mailx ...
or, using process substitution:
command | tee >(mailx ...)
I'll try process substitution. To clarify, I have a cron'd shell script. The cron entry is similar to:
/usr/script/myscript.sh > /usr/log/myscript.log
inside the script is a line similar to:
command | mailx -s "Subject" recipient
Since stdout from 'command' is being piped into the mailx command, it does not appear in the log file 'myscript.log', but I want it to.
I tried capturing it into a variable but the line feeds appear to be lost that way. I could use a temporary file, but I was hoping for something more elegant.
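For reference, a minimal sketch of how the cron'd script could look with process substitution (command, the subject, and the recipient are placeholders from the thread):
#!/bin/bash
# /usr/script/myscript.sh -- cron redirects its stdout to /usr/log/myscript.log
# tee copies the command's output to stdout (and thus to the log file),
# while the process substitution feeds the same data to mailx's stdin
command | tee >(mailx -s "Subject" recipient)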
