Shell script output to Console and Log on same line? - shell

Is it possible to combine these two lines of shell script into one?
My two lines of code echo the same string but one goes into the Console and the other into a log file.
echo "Starting scriptr" `date '+%T'` >> script.log
echo "Starting script" `date '+%T'`
Thanks

Use tee:
echo "Starting scriptr" `date '+%T'` | tee script.log
In order to append to the log file, say tee -a
Quoting from man tee:
tee - read from standard input and write to standard output and files
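So the two original lines collapse into one; with -a the log keeps appending, as in the original >> version:
echo "Starting script" `date '+%T'` | tee -a script.log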

Related

How to capture shell script output

I have a Unix shell script. I have put -x in the shebang to see all the execution steps. Now I want to capture these in one log file on a daily basis.
Please see the script below.
#!/bin/ksh -x
logfile=path.log.date
print "copying file" | tee $logfile
scp -i key source destination | tee -a $logfile
exit 0
The first line of a shell script is known as the shebang; it tells the system which interpreter should execute the script.
Because it starts with #, the interpreter itself treats that line as a comment.
To capture the output, redirect it when you run the script:
ksh -x scriptname >> output_file
Note: this will log what your script is doing, line by line.
There are two cases: either you are using ksh as your shell and do the I/O redirection accordingly, or you are using some other shell to execute a .ksh script and the I/O redirection is done by that shell. The following method should work for most shells.
$ cat somescript.ksh
#!/bin/ksh -x
printf "Copy file \n";
printf "Do something else \n";
Run it:
$ ./somescript.ksh 1>some.log 2>&1
some.log will contain,
+ printf 'Copy file \n'
Copy file
+ printf 'Do something else \n'
Do something else
In your case, there is no need to specify a logfile or use tee inside the script. It would look something like this:
#!/bin/ksh -x
printf "copying file\n"
scp -i key user@server /path/to/file
exit 0
Run it:
$ ./myscript 1>/path/to/logfile 2>&1
1>logfile sends stdout to the logfile, and 2>&1 then redirects stderr to the same place, so both streams end up in the logfile.
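Note that the order of the two redirections matters; a quick sketch with the same hypothetical script name:
./myscript 1>/path/to/logfile 2>&1   # stdout goes to the logfile, then stderr joins it: both are logged
./myscript 2>&1 1>/path/to/logfile   # stderr still points at the terminal; only stdout is logged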
I would prefer explicitly redirecting the output (including stderr with 2>, because set -x sends its trace to stderr).
This keeps the shebang short, and you don't have to cram the redirection and filename-building into it.
#!/bin/ksh
logfile=path.log.date
exec >> $logfile 2>&1 # redirecting all output to logfile (appending)
set -x # switch on debugging
# now start working
echo "print something"

how to log several commands from a shell script to a log file

I have several commands in a shell file that I would like to log and show on screen, but some command results I do NOT want to show on screen and some I do - all of them need to be logged.
I can use tee or > or >> etc..
script.sh | tee -a logfile
but that does not allow me to pick and choose what shows on screen and what goes into logs.
Example script - what I have now (each line is handled differently and looks inefficient):
echo "setting date" | tee log.txt    # show on screen and log
date | tee -a log.txt                # screen and log
echo "setting name" | tee -a log.txt # show on screen
who am i >> log.txt                  # only log
I have several commands like this, and am wondering if there is an efficient way to append to the log only, and/or append to the log while showing on screen.
Or do I have to modify and make a call on each line?
Users will not be able to modify this script.
When you do not want to specify for each command how it should be logged, decide what you want to use most of the time. Is it tee? Try this:
function doit {
    echo "Normal (both)"
    # Some more commands without explicit redirection go to both stdout and the logfile
    echo "only stdout" >&3
    echo "Only logfile" >> $0.log
}
exec 3>&1            # save the original stdout (the screen) as fd 3
echo " === Screen output"
doit | tee -a $0.log # doit's stdout goes through tee; fd 3 bypasses it
echo " === Content file"
cat $0.log
Output
=== Screen output
only stdout
Normal (both)
=== Content file
Only logfile
Normal (both)
Thank you for your responses. After a lot of additional research, I decided to pick portions from the answer at "How do I write stderr to a file while using 'tee' with a pipe?" - the post from Anthony.

Read full stdin until EOF when stdin comes from `cat` bash

I'm trying to read the full stdin into a variable:
script.sh
#!/bin/bash
input=""
while read line
do
    echo "$line"
    input="$input""\n""$line"
done < /dev/stdin
echo "$input" > /tmp/test
When I run ls | ./script.sh or mostly any other commands, it works fine.
However it doesn't work when I run cat | ./script.sh, enter my message, and then hit Ctrl-C to exit cat.
Any ideas ?
I would stick to the one-liner
input=$(cat)
Of course, Ctrl-D should be used to signal end-of-file.
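A minimal sketch of the script rewritten this way (keeping the /tmp/test path from the question; the printf lines just show what was captured):
#!/bin/bash
# Read everything from stdin until EOF (Ctrl-D when typing interactively).
input=$(cat)
printf '%s\n' "$input"             # echo the captured input back
printf '%s\n' "$input" > /tmp/test # and save it, as in the original script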

Redirect stdin in a script to another process

Say I have a bash script that gets some input via stdin. Now in that script I want to launch another process and have that process get the same data via its stdin.
#!/bin/bash
echo STDIN | somecommand
Now the "echo STDIN" thing above is obviously bogus, the question is how to do that? I could use read to read each line from stdin, append it into a temp file, then
cat my_temp_file | somecommand
but that is somehow kludgy.
When you write a bash script, the standard input is automatically inherited by any command within it that tries to read it, so, for example, if you have a script myscript.sh containing:
#!/bin/bash
echo "this is my cat"
cat
echo "I'm done catting"
And you type:
$ myscript.sh < myfile
You obtain:
this is my cat
<... contents of my file...>
I'm done catting
Can tee help you?
echo 123 | (tee >( sed s/1/a/ ) >(sed s/3/c/) >/dev/null )
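Putting the two ideas together, here is a minimal sketch for the original question; cat -n is only a stand-in for "somecommand", and copy_of_stdin.txt is a hypothetical filename:
#!/bin/bash
# The script's own stdin is inherited by the pipeline below: tee keeps a copy
# in a file while the same data flows on to the other command's stdin.
tee copy_of_stdin.txt | cat -n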

Command prompt appearing during execution of a bash script

This is a simplified version of a script I use:
In its simplified version it should read the file input line by line, print it to standard output, and also write it to a file log.
input file:
asas
haha
asha
hxa
The script (named simple):
#!/bin/bash
FILE=input
logfile="log"
exec > >(tee "$logfile") # redirect the output to a file but keep it on stdout
exec 2>&1
DONE=false
until $DONE; do
    read || DONE=true
    [[ ! $REPLY ]] && continue # checks if a line is empty
    echo "----------------------"
    echo $REPLY
done < "$FILE"
echo "----------------------"
echo ">>> Finished"
The output (on console):
-bash-3.2$ ./simple
-bash-3.2$ ----------------------
asas
----------------------
haha
----------------------
asha
----------------------
hxa
----------------------
>>> Finished
At this time I need to press enter to terminate the script. Notice that a command prompt -bash-3.2$ showed up during execution.
I checked that those lines are to blame:
exec > >(tee "$logfile") # redirect the output to a file but keep it on stdout
exec 2>&1
Without them the output is as expected:
-bash-3.2$ ./simple
----------------------
asas
----------------------
haha
----------------------
asha
----------------------
hxa
----------------------
>>> Finished
-bash-3.2$
What is more, I don't need to press Enter to terminate the script.
Unfortunately, I need to save the output both to the console (stdout) and to a log file.
How can this be fixed?
You can use tee on the echo lines directly.
For example:
[ben@lappy ~]$ echo "echo and save to log" | tee -a example.log
echo and save to log
[ben@lappy ~]$ cat example.log
echo and save to log
The -a argument to tee will append to the log.
If you just need the script to pause and wait for user input, use read:
read -rp "Press Enter to continue..."
What's happening is that tee "$logfile" is being run asynchronously. When you use process substitution like that, the main script doesn't wait for the process to exit.
So the until loop runs, the main script exits, the shell prints the prompt, and then tee prints its output.
You can see this more easily with:
echo Something > >(sleep 5; cat)
You'll get a command prompt, and then 5 seconds later Something will appear.
There was a thread about this behavior in comp.unix.shell a couple of years ago.
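One common workaround (a sketch, not the only possible fix) is to drop the two exec lines and pipe the body of the script into tee instead, so the script cannot exit before tee does:
#!/bin/bash
FILE=input
logfile="log"
{
    while read -r REPLY; do
        [[ ! $REPLY ]] && continue # skip empty lines
        echo "----------------------"
        echo "$REPLY"
    done < "$FILE"
    echo "----------------------"
    echo ">>> Finished"
} 2>&1 | tee "$logfile"
The while read loop here replaces the DONE flag from the original; the key change is that tee now sits at the end of a normal pipeline, so the shell waits for it before printing the prompt.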
