How to open a file after writing to it using Bash (in one line)

Is there a way to open or cat a file that's been newly created? e.g.
cat $(cat 1210.hc.vcf | head -n1 > ok.txt )
open $(cat 1210.hc.vcf | head -n1 > ok.txt )

You can use tee to write to stdout and the file.
head -n 1 1210.hc.vcf | tee ok.txt
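If the goal is to actually open the file in an application after writing it (the question mentions open, which on macOS launches the default viewer; on Linux xdg-open plays a similar role), chaining with && keeps it to one line. A minimal sketch using the file names from the question:
head -n 1 1210.hc.vcf > ok.txt && open ok.txt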

What comes to mind:
cat 1210.hc.vcf | head -n1 > ok.txt; cat !$
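Note that !$ is Bash history expansion for the last word of the previous command (ok.txt here), so it only works in an interactive shell. A sketch of an equivalent one-liner that also works inside a script, using a plain variable instead of history expansion (file names taken from the question):
out=ok.txt; head -n1 1210.hc.vcf > "$out" && cat "$out"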

Related

Bash Script - tail to file

I have the following in a bash script, watcher.sh:
grep ERROR $ExampleLogFile > $ErrorLogFile
When I run this, it successfully copies the lines from ExampleLogFile that contain ERROR to ErrorLogFile.
I need to make it continually monitor ExampleLogFile for changes and write the matching lines to ErrorLogFile.
I was thinking of doing the following, but this doesn't work:
tail -f grep ERROR $ExampleLogFile > $ErrorLogFile
It does write some of the lines, but they're not the ones containing ERROR:
tail: grep: No such file or directory
tail: ERROR: No such file or directory
Any advice please.
You can use the tee command here:
tail -f $ExampleLogFile | grep --line-buffered ERROR | tee $ErrorLogFile
It will write to the file and print to stdout at the same time.
You need:
while :; do grep ERROR $ExampleLogFile > $ErrorLogFile; sleep 2; done
This should achieve what you want without needing the tail command.
If the log file is ever cleared, though, this will not work as you might expect, because each pass overwrites $ErrorLogFile with only the entries currently present in $ExampleLogFile.
You can arrange the tail/grep in a pipe
tail -f $ExampleLogFile | grep ERROR > $ErrorLogFile
Remember that this command will never exit by itself (tail will continue to look for additional data). You will have to arrange for some other exit condition (e.g., timeout, explicit kill, etc).
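For example, coreutils timeout can provide such an exit condition; a sketch that stops the monitoring after ten minutes (the 600-second limit is just an illustrative value):
timeout 600 tail -f "$ExampleLogFile" | grep ERROR > "$ErrorLogFile"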
tail -f $ExampleLogFile | grep --line-buffered ERROR > $ErrorLogFile
or, to be paranoid:
stdbuf -oL tail -f $ExampleLogFile | stdbuf -oL grep --line-buffered ERROR > $ErrorLogFile
But most probably you want to include existing lines too. In that case:
tail -n +1 -f $ExampleLogFile | grep --line-buffered ERROR > $ErrorLogFile

Catch output of several piped commands

Until today I was always able to find answers to all my bash questions, but now I'm stuck. I am testing a 20TB RAID6 configuration on an LSI 9265.
I wrote a script to create files from /dev/urandom, and I am writing a second one to calculate the md5 of all the files, with two add-ons:
One is to use the time command to measure the md5sum execution time.
The second is to use the pv command to show the progress of each md5sum command.
My command looks like this:
filename="2017-03-13_12-38-08"
/usr/bin/time -f "real read %E" pv $filename | md5sum | sed "s/-/$filename /"
This is an example terminal printout:
/usr/bin/time -f "real read %E" pv $i | md5sum | sed "s/-/$i/"
1GiB 0:00:01 [ 551MiB/s] [==================================================================================================>] 100%
real read 0:01.85
f561af8cc0927967c440fe2b39db894a 2017-03-13_12-38-08
And I want to log it to a file. All my attempts using 2>&1, tee, and brackets failed. I know pv uses stderr, but that doesn't help me find a solution. I can only catch "f561af8cc0927967c440fe2b39db894a 2017-03-13_12-38-08_done", which is not enough.
This is the solution:
(time pv -f $filename | md5sum | sed "s/-/$filename/") 2>&1 | tee output.log
or, equivalently, printing only to output.log and not to the terminal:
(time pv -f $filename | md5sum | sed "s/-/$filename/") > output.log 2>&1
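If you prefer not to spawn a subshell, Bash brace grouping should work just as well; a sketch using the same file name:
{ time pv -f "$filename" | md5sum | sed "s/-/$filename/"; } 2>&1 | tee output.log
The grouping matters either way: time and pv both write to stderr, so the 2>&1 has to apply to the whole pipeline, not just the last command.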

redirect stdin and stdout using tee and keep previous std

How can I both write to a file and display to the screen using a pipe with tee?
This command actually does it; the problem is that it overwrites the file, and tail -f gives me a "file truncated" error.
ls -al | tee file.txt
The -a option of tee is what you are looking for:
-a, --append
append to the given FILEs, do not overwrite
so your line would be:
ls -al | tee -a file.txt
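A quick way to see the difference, assuming a second terminal is already running tail -f file.txt:
ls -al | tee file.txt     # truncates the file; the running tail may report "file truncated"
ls -al | tee -a file.txt  # appends; tail simply prints the new lines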

Using output of head -1

When I write
ls | head -1
the output is
file.txt
When I write
ls | head -1 > output.txt or
echo `ls | head -1` > output.txt
the file output.txt contains
^[[H^[[2Jfile.txt
This causes me trouble because I need to use the output of head -1 as an argument to another command.
How can I achieve this?
Possibly your ls is aliased to something like ls --color=always. Try /bin/ls | head -1 > output.txt
These are probably terminal escape codes (here a cursor-home/clear-screen sequence). Your ls setup seems to be broken; normally such codes should only be emitted when the output goes directly to a terminal.
ls --color=never | head -1
should fix the issue.
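To check whether an alias or function is responsible, and to bypass it for a single invocation, something along these lines should work (the backslash suppresses alias expansion):
type ls                       # shows whether ls is an alias, a function, or the real binary
\ls | head -1 > output.txt    # runs ls without alias expansion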

bash output redirect prob

I want to count the number of lines output by a command in a bash script, i.e.
COUNT=ls | wc -l
But I also want the script to print the original output from ls. How can I get this done? (My actual command is not ls and it has side effects, so I can't run it twice.)
The tee(1) utility may be helpful:
$ ls | tee /dev/tty | wc -l
CHANGES
qpi.doc
qpi.lib
qpi.s
4
info coreutils "tee invocation" includes the following example, which is perhaps more instructive of tee(1)'s power:
wget -O - http://example.com/dvd.iso \
  | tee >(sha1sum > dvd.sha1) \
        >(md5sum > dvd.md5) \
  > dvd.iso
That downloads the file once, sends output through two child processes (as started via bash(1) process substitution) and also tee(1)'s stdout, which is redirected to a file.
ls | tee tmpfile | first command
cat tmpfile | second command
tee is a good way to do that, but you can do something simpler:
ls > __tmpfile
cat __tmpfile | wc -l
cat __tmpfile
rm __tmpfile
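If the count is needed in a variable, as the question suggests, command substitution combined with the /dev/tty trick from above should cover both requirements; a sketch (replace ls with the real command; /dev/tty only works when a terminal is attached):
count=$(ls | tee /dev/tty | wc -l)
echo "line count: $count"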
