I have a script which prompts the user to select options like 'y' or 'n'.
If 'y' is selected, the script proceeds with further execution, and if 'n' is selected, it stops.
I want the output of this script to be redirected to a log file, so I used the command below:
./script stop >> script_RUN.log 2>&1
The problem is that the script starts running but never prompts for 'y' or 'n'; the prompt text is written to script_RUN.log instead.
How can I make the script prompt the user for options and redirect the rest of the execution to script_RUN.log?
You can try using the tee command instead:
./script stop | tee script_RUN.log
NOTE:
Only stdout is saved; stderr still goes to the terminal unless you add 2>&1 before the pipe.
EDIT:
If you don't want to see the output on the console at all, just redirect tee's output to /dev/null.
For example:
./script stop | tee script_RUN.log > /dev/null
The above line writes the output to the log file but does NOT print it to the console.
This works exactly as it should: you are redirecting stdout and stderr from the very start, so the prompt goes to the log file too. Instead, redirect inside the script, after the prompt. I think this would be helpful for you:
redirect COPY of stdout to log file from within bash script itself
Related
In an interactive bash script I use
exec > >(tee -ia logfile.log)
exec 2>&1
to write the script's output to a logfile. However, if I ask the user to input something, this is not written to the file:
read UserInput
Also, I issue commands with $UserInput as a parameter. These commands are also not written to the logfile.
The logfile should contain everything my script does, i.e. what the user entered interactively and also the resulting commands along with their output.
Of course I could use set -x and/or echo "user input: $UserInput", but this would also be sent to the screen. I don't want to see anything on the screen except what my script or the commands it runs print.
How can this be done?
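One way to get the input into the logfile without printing anything extra on the screen is to append it to the file directly, bypassing the tee'd stdout. A sketch under that assumption (note that because tee writes to the same file concurrently, the exact ordering of lines in the logfile is not guaranteed):

```shell
#!/bin/bash
logfile=logfile.log
exec > >(tee -ia "$logfile") 2>&1   # screen + logfile, as in the question

read -r UserInput
# Append the input only to the logfile, not to the terminal:
printf 'user input: %s\n' "$UserInput" >> "$logfile"
```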
I am trying to modify a shell script someone created for Unix. This script is mostly run on backend servers with no human interaction, but I needed to make another version that allows users to input information, so I am adapting the old version for user input. The biggest issue I am running into is getting both error logs and echoed messages saved to a log file. The script has a lot of them, and I want them shown on the terminal as well as sent to the specified log file, to be looked into later.
What I have is this:
exec 1> ${LOG} 2>&1
This line sends pretty much everything to the log file. That is all good, but people also enter information interactively, and it sends everything to the log file, including the echo needed for the prompt. This line is at the beginning of the script. After reading more about stderr and stdout, I tried:
exec 2>&1 1>>${LOG}
exec 1 | tee ${LOG}
but I only get this error when running it: "./bash_pam.sh: line 39: exec: 1: not found"
I have gone over sites such as this one to solve the issue, but I do not understand why it does not print to both. Depending on where I insert it, it either sends everything to the log file and nothing to the terminal, or everything to the terminal and nothing is preserved in the log.
EDIT: Some of the proposed solutions mention that certain fixes work in bash but not in /bin/sh.
If you would like all output to be printed to the console while also being written to logfile.txt, run your script like this:
bash your_script.sh 2>&1 | tee -a logfile.txt
Or calling it within the file:
<bash_command> 2>&1 | tee -a logfile.txt
tee's -a option makes it append to logfile.txt instead of overwriting it.
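If you only want selected messages (not the interactive prompts) mirrored to both the terminal and the log, another option is a small helper function instead of a global exec redirection. A sketch that should also work in plain /bin/sh (the function name and file path are my own):

```shell
#!/bin/sh
LOG=/tmp/bash_pam.log

log() {
    # Print the message to the terminal AND append it to the log file.
    echo "$@" | tee -a "$LOG"
}

printf 'Enter your name: '     # goes to the terminal only
read name
log "User entered: $name"      # goes to both terminal and log
```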
I have a shell script that enables a BLE device scan with the following command:
timeout 10s hcitool lescan
By executing this script (say ble_scan), I can see the nearby devices shown on the terminal.
However, when I redirect it to a file and the terminal:
./ble_scan | tee test.log
I can't see the nearby devices on the screen anymore, or in the log file.
./ble_scan 2>&1 | tee test.log
The above redirection doesn't help either. Where did I go wrong?
If the command behaves differently when its output is a file, you can run it within script(1):
script test.log
#=> Script started, output file is test.log
./ble_scan
# lots of output here
exit
#=> Script done, output file is test.log
Note that the file will include terminal-specific characters like carriage returns not normally captured in output redirects.
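If those carriage returns get in the way later, they can be stripped from the typescript file afterwards, for example:

```shell
# Remove carriage returns that script(1) captured from the terminal:
tr -d '\r' < test.log > test_clean.log
```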
I want to implement a shell script that runs the command xyz and stores its output in a variable, while at the same time forwarding the command's output to the shell script's stdout.
This is because I want to launch this script via launchd, let launchd automatically log the script's output, but also let the script push the individual command's output to the web. The script should not simply buffer the command's output and print it after it has run, but forward it in real time.
Is something like this possible, and if so, how do you implement it?
You are looking for this command (note there must be no $ before the variable name in an assignment):
VAR=$(echo 'test' | tee /dev/tty)
test
echo $VAR
test
I believe there is no way to save the log into a shell variable and avoid buffering the command's output at the same time. The alternative is to save log messages to a file using tee(1). For example:
LOGFILE=/path/to/logfile
run_and_log() {
"$@" | tee -a "$LOGFILE"
}
run_and_log xyz
I have a bash script that has set -x in it. Is it possible to redirect the debug prints of this script and all its output to a file? Ideally I would like to do something like this:
#!/bin/bash
set -x
(some magic command here...) > /tmp/mylog
echo "test"
and get the
+ echo test
test
output in /tmp/mylog, not in stdout.
This is what I've just googled, and I remember using it myself some time ago...
Use exec to redirect both standard output and standard error of all commands in a script:
#!/bin/bash
logfile=$$.log
exec > "$logfile" 2>&1
For more redirection magic check out Advanced Bash Scripting Guide - I/O Redirection.
If you also want to see the output and debug on the terminal in addition to in the log file, see redirect COPY of stdout to log file from within bash script itself.
If you want to handle the destination of the set -x trace output independently of normal STDOUT and STDERR, see bash storing the output of set -x to log file.
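Putting this together for the set -x case in the question, a minimal complete sketch:

```shell
#!/bin/bash
logfile=/tmp/mylog
exec > "$logfile" 2>&1   # stdout and the stderr trace both go to the file
set -x
echo "test"
# The logfile now contains both the "+ echo test" trace line and "test".
```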
The -x output goes to stderr, so to log it, do:
set -x
exec 2>/tmp/mylog
To redirect stderr and stdout, appending to the file:
exec &>> $LOG_FILE_NAME
To overwrite the file instead:
exec &> $LOG_FILE_NAME
In my case, the script was being called multiple times from elsewhere, and I wasn't seeing everything, so I did an append instead, and it worked:
exec 1>>FILENAME 2>&1
set -x
To avoid confusion, be sure to delete FILENAME before each run.