Cannot log terminal output using complex cat command, error "input file is output file" - bash

I'm trying to write a shell script containing the command below, logging its output to a file:
cat "$(ls -rt | tail -n1)" >> Run.log
but I get an error:
Run.log: input file is the output file
The command works fine when run directly in the terminal, printing the contents of the most recently modified file in the current directory.
What am I doing wrong in the shell script?
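A likely cause: once the script has appended to Run.log, Run.log itself becomes the newest file in the directory, so "$(ls -rt | tail -n1)" expands to Run.log, and cat is asked to read from and write to the same file. A minimal sketch that excludes the log file from the candidates (the filenames here are assumptions):

```shell
#!/bin/sh
# Pick the most recently modified file, skipping the log itself,
# so cat never reads from and writes to Run.log at the same time.
latest=$(ls -rt | grep -v '^Run\.log$' | tail -n1)
if [ -n "$latest" ]; then
    cat "$latest" >> Run.log
fi
```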

Related

Batch (Windows) alternative for Bash | tee -a log.txt

I need to get stdout echoed onto the command prompt and appended into a file.
I have looked on Google, but there is nothing relevant there. I have tried:
echo "foo" | tee -a log.txt
It should echo foo and append it to a file. Instead I get
'tee' is not recognized as an internal or external command,
operable program or batch file.
I don't need the tee command itself; I just need stdout echoed onto the command prompt and appended to a file.
Pass what you're trying to run into PowerShell, like this:
powershell "echo foo | tee -a foo.txt"

How to save command output to a file in windows and show output on terminal also?

I want to save output of a command in a file as well as display output on terminal.
For saving output to a file I am using following command:
>> ps -ef > D:\temp.txt
But this saves the output to the file without displaying it on the terminal. Which command should I use to meet both requirements?
Use MinGW, and then use the following command:
ps -ef 2>&1 | tee D:\temp.txt
This will work.

Issue with scheduling in Linux

I scheduled a script using the at scheduler in Linux.
The job ran fine, but the echo statements which I had redirected to a file are nowhere to be found.
The at scheduling command is as follows:
at -f /app/data/scripts/func_test.sh >> /app/data/log/log.txt 2>&1 -v 09:50
Can anyone point out what the issue with the above command is? I cannot see any echo statements from the script in the log.txt file.
To include shell syntax like I/O redirection, you'll need to either fold it into your script, or pass the input to at via standard input, like so:
at -v 09:50 <<EOF
sh /app/data/scripts/func_test.sh >> /app/data/log/log.txt 2>&1
EOF
If func_test.sh is already executable, you can omit the sh from the beginning of the command; it's there to ensure that you are passing a valid command line to at.
You can also simply ensure that your script itself redirects all its output to a specific log file. As an example,
#!/bin/bash
echo foo
echo bar
becomes
#!/bin/bash
{
echo foo
echo bar
} >> /app/data/log/log.txt 2>&1
Then you can simply run your script with at using
at -f /app/data/scripts/func_test.sh -v 09:50
with no output redirection, because the script itself already redirects all its output to that file.

how to log all the command output to one single file in bash scripting [duplicate]

This question already has an answer here:
Output bash script into file (without >> )
In GNU/Linux I want to log all command output to one particular file.
Say in the terminal, I type
echo "Hi this is a dude"
It should be recorded in the file specified earlier, without using redirection on every command.
Use the script command, which records everything displayed on your terminal:
$ script x1
Script started, file is x1
$ echo "Hi this is a dude"
Hi this is a dude
$ echo "done"
done
$ exit
exit
Script done, file is x1
Then, the contents of file x1 are:
Script started on Thu Jun 13 14:51:29 2013
$ echo "Hi this is a dude"
Hi this is a dude
$ echo "done"
done
$ exit
exit
Script done on Thu Jun 13 14:51:52 2013
You can easily edit out your own commands and the start/end lines afterwards using basic shell tools (grep -v, especially if your Unix prompt has a distinctive substring pattern).
Commands launched from the shell inherit the file descriptor to use for standard output from the shell. In your typical interactive shell, standard output is the terminal. You can change that by using the exec command:
exec > output.txt
Following that command, the shell itself will write its standard output to a file called output.txt, and any command it spawns will do likewise, unless otherwise redirected. You can always "restore" output to the terminal using
exec > /dev/tty
Note that your shell prompt and text you type at the prompt continue to be displayed on the screen (since the shell writes both of those to standard error, not standard output).
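When the script is not attached to a terminal (e.g. when run from cron or at), /dev/tty may not be available; a minimal sketch that instead saves the original stdout on a spare file descriptor and restores it afterwards (the file name output.txt is just an example):

```shell
exec 3>&1           # duplicate the current stdout to file descriptor 3
exec > output.txt   # from here on, stdout goes to output.txt
echo "this goes to the file"
exec >&3 3>&-       # restore the saved stdout and close fd 3
echo "this goes to the original stdout again"
```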
Output redirection can be achieved in bash with >; see the bash manual's section on redirection for more info. You can also group several commands and redirect their combined output in one go:
{ command1 ; command2 ; command3 ; } > outfile.txt
You can run any program with redirected output and all its output will go to a file, for example:
$ ls > out
$ cat out
Desktop
Documents
Downloads
eclipse
Firefox_wallpaper.png
...
So, if you want to open a new shell session with redirected output, just do so:
$ bash > outfile
will start a new bash session redirecting all of stdout to that file.
$ bash &> outfile
will redirect all of stdout AND stderr to that file (meaning you will no longer see prompts show up in your terminal).
For example:
$ bash > outfile
$ echo "hello"
$ echo "this is an outfile"
$ cd asdsd
bash: cd: asdsd: No such file or directory
$ exit
exit
$ cat outfile
hello
this is an outfile
$
$ bash &> outfile
echo "hi"
echo "this saves everythingggg"
cd asdfasdfasdf
exit
$ cat outfile
hi
this saves everythingggg
bash: line 3: cd: asdfasdfasdf: No such file or directory
$
If you want to see the output and have it written to a file (say for later analysis) then you can use the tee command.
$ echo "hi this is a dude" | tee hello
hi this is a dude
$ ls
hello
$ cat hello
hi this is a dude
tee is a useful command because it allows you to store everything that goes into it as well as displaying it on the screen. Particularly useful for logging the output of scripts.
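Note that tee truncates its output file by default; to accumulate a log across multiple runs, pass -a to append (run.log here is just an example name):

```shell
echo "run 1" | tee -a run.log   # creates run.log if it does not exist
echo "run 2" | tee -a run.log   # appends instead of truncating
```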

Bash Redirect to a file

I am trying to redirect the output of a command to a file. The command I am using, zypper, downloads packages from the internet:
zypper -x -n in geany >> log.txt
The command gradually prints output to the console, but with the redirection above the output is written to the file all at once, after the command finishes executing. How do I redirect the output as it appears on the terminal, rather than having it all written at the end?
Not with bash itself, but via the tee command:
zypper -x -n in geany | tee log.txt
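If the program still buffers its output when writing into a pipe, forcing line-buffered stdout can help; a sketch using stdbuf from GNU coreutils (the printf line is a runnable stand-in, substitute the real zypper invocation):

```shell
# real use: stdbuf -oL zypper -x -n in geany | tee log.txt
# -oL forces line-buffered stdout, so tee sees each line as it is produced
stdbuf -oL printf 'installing...\ndone\n' | tee log.txt
```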
&>>FILE COMMAND
will append both the stdout and stderr of COMMAND to FILE.
In your case:
&>>log.txt zypper -x -n in geany
If you want to pipe a command through a filter, you must ensure that the command writes to standard output (file descriptor 1) -- if it writes to standard error (file descriptor 2), you have to redirect the 2 to 1 before the pipe. Take into account that only stdout passes through a pipe.
So you have to do:
2>&1 COMMAND | FILTER
If you want to grep the output and at the same time keep it in a log file, you have to duplicate it with tee, and use a filter like ... | tee log-file | grep options
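As a runnable sketch of that last pattern (the messages and the log-file name are made up for illustration):

```shell
# duplicate the stream: everything lands in log-file,
# while grep filters what reaches the terminal
printf 'ok: step 1\nERROR: step 2 failed\n' | tee log-file | grep -i error
```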

Resources