When I run this in terminal:
cd 1st_flask_app_1/
python3 app.py
I get the output:
* Running on http://127.0.0.1:5000/ (Press CTRL+C to quit)
When I try to run the same commands in a Jupyter notebook with the %%bash cell magic, I get no printed output, but the web app still starts up and I can visit it. If I then stop the cell, I get the output:
Process is interrupted.
So it looks like the Jupyter notebook with the %%bash cell magic prints the final command output, but not intermediate output. Is there any way to also print the intermediate output?
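For reference, the notebook cell is roughly this (a reconstruction from the commands above):
%%bash
cd 1st_flask_app_1/
python3 app.py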
For some reason it looks like stdout/stderr are not properly attached to your bash process...
Could you try redirecting your command's stdout and stderr to a log file in append mode?
python3 app.py >> application.log 2>&1
Or you could also pipe it through tee:
python3 app.py 2>&1 | tee -a application.log
I hope this helps you!
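For example, the notebook cell could look like this (a sketch, reusing the directory from the question); since tee writes to application.log as output arrives, you could then run tail -f application.log from a separate terminal to watch the intermediate output live:
%%bash
cd 1st_flask_app_1/
python3 app.py 2>&1 | tee -a application.log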
I am trying to modify a shell script someone created for Unix. This script is mostly used to run on backend servers with no human interaction; however, I needed to make another script that allows users to input information, so it is just a modification of the old version for user input. The biggest issue I am running into is getting both error logs and echoes saved to a log file. The script has a lot of them, and I want them shown on the terminal as well as sent to the specified log file, to be looked into later.
What I have is this:
exec 1> ${LOG} 2>&1
This line pretty much sends everything to the log file. That is all good, but I also have people entering information into the script, and it sends everything to the log file, including the echoes needed for the prompts. This line is also at the beginning of the script. After reading more about stderr and stdout, I tried:
exec 2>&1 1>>${LOG}
exec 1 | tee ${LOG}
but I only get this error when running it: "./bash_pam.sh: line 39: exec: 1: not found"
I have gone over sites such as this one to solve the issue, but I do not understand why it does not print to both. However I insert it, it either only sends everything to the log location and not to the terminal, or it sends everything to the terminal but nothing is preserved in the log.
EDIT: Some of the proposed solutions mention that certain fixes will work in bash, but not in /bin/sh.
If you would like all output to be printed to the console while also being written to logfile.txt, run your script like this:
bash your_script.sh 2>&1 | tee -a logfile.txt
Or, calling it from within the script:
<bash_command> 2>&1 | tee -a logfile.txt
The -a option to tee appends to logfile.txt instead of overwriting it.
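If the redirection has to live inside the script itself (as with the exec line in the question), one bash-only sketch uses process substitution; per the EDIT above, note that this works in bash but not in plain /bin/sh:
#!/bin/bash
LOG=/tmp/bash_pam.log            # hypothetical log path
exec > >(tee -a "${LOG}") 2>&1   # stdout and stderr now go to both the terminal and the log
echo "Enter your name:"          # the prompt appears on screen and is also logged
read name
This way prompts stay visible to the user while everything is still preserved in the log file.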
I'd like to understand why, when I execute the following command in my terminal, it works, but when I run it through a script it doesn't.
The command when I run it in my terminal:
./tparente & ps --no-headers -C tparente -o rss,vsz >> "mem_results"
The mem_result file has the rss and vsz written in it.
The command when I run it through my script is slightly modified; it is written like this:
sh ~/Documents/tparente & ps --no-headers -C tparente -o rss,vsz >> "mem_results"
There's an echo command that writes some text to mem_results before the aforementioned command; that works.
And if I remove the --no-headers flag, it writes the header to the file but not the results.
I know the script runs, because it produces a file at the end.
This has been bugging me for a couple hours now.
Thank you
Alex.
I think I may have found the answer.
After trying a couple of configurations of the command line, this one works:
./tparente & ps --no-headers -C tparente -o rss,vsz >> "mem_results"
The difference is subtle (there's no "sh")
This line is from the script; what I noticed is that when I tried to run the script on its own and ran a ps command in another terminal, the tparente process was not there. I don't know why, but my instinct told me to remove the sh, and I did, and it works.
If anyone has a proper explanation go ahead :)
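A plausible explanation (an assumption, not verified against the original system): ps -C matches processes by command name, and a program launched via sh shows up under the name sh rather than tparente, so -C tparente finds nothing to report:
./tparente & ps --no-headers -C tparente -o rss,vsz                # process is named "tparente": matched
sh ~/Documents/tparente & ps --no-headers -C tparente -o rss,vsz   # process is named "sh": nothing matched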
I know this has been asked many times, but I can't find a suitable answer for my case.
I run a backup script using rsync from cron, and I would like to see all output, errors or not, from all the script's commands. I must write the command inside the script itself, and I do not want to see the output in my shell.
I have been trying with no success. Below is part of the script.
#!/bin/bash
.....
BKLOG=/mnt/backup_error_$now.txt
# Log everything to log file
# something like
exec 2>&1 | tee $BKLOG
# OR
exec &> $BKLOG
I have been adding all kinds of exec | tee $BKLOG at the beginning of the script, with &> and 2>&1 added at various parts of the command line, but all failed. I either get an empty log file or an incomplete one. I need to see in the log file what rsync has done, and the error if the script failed before syncing.
Thank you for your help. My shell is zsh, so any solution in zsh is welcome.
To redirect all stdout/stderr to a file, place this line at the top of your script:
BKLOG=/mnt/backup_error_$now.txt
exec &> "$BKLOG"
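A minimal sketch of the top of such a backup script, assuming $now is set in the elided part and using a hypothetical rsync invocation:
#!/bin/bash
now=$(date +%Y-%m-%d)              # assumption: however $now is actually set in your script
BKLOG=/mnt/backup_error_$now.txt
exec &> "$BKLOG"                   # from here on, stdout and stderr both go to $BKLOG
rsync -av /source/ /destination/   # hypothetical rsync command; its output and errors land in the log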
I have a tftp script here that, when run, just hangs and brings me to a blank line (which tells me it's hanging). I can quit the script with Ctrl+C...
#!/bin/bash
hostname=$1;
filename=$2;
tftp <</dev/null
mode binary
get $hostname:$filename
quit
I have also tried to add EOF at the end of the script, but that doesn't work either.
Here is my command line...
$ ./tftpShell.sh host1 myFileName >/home/aayerd200/tftpoutput.txt 2>/home/aayerd200/tftperror.log
So when I run the script, it just leaves me on a blank line. However, it does actually do the work it should with get; I do get the file I want.
Of course host1 and myFileName are actual fields that I replaced here for security.
How can I stop this script? I believe it is just tftp hanging, based on the output of $ ps -u aayerd200 (or, when the script is run by PHP, $ ps -u daemon).
You have /dev/null as the here-document "delimiter". Try some set of characters, like EOF, that has no meaning to the shell, and terminate the here-doc:
tftp <<-EOF
mode binary
get $hostname:$filename
quit
EOF
Okay so I just made this a background process by appending & to the end of the command. Then I ran $ echo $! for the PID. Then I ran $ kill PID.
That was my solution to this, for now at least.
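As a sketch, that workaround looks something like this (paths shortened from the question):
./tftpShell.sh host1 myFileName >tftpoutput.txt 2>tftperror.log &
pid=$!        # $! holds the PID of the most recent background job
kill "$pid"   # once the transfer is done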
When I run nohup some_command &, the output goes to nohup.out; man nohup says to look at info nohup which in turn says:
If standard output is a terminal, the command's standard output is appended to the file 'nohup.out'; if that cannot be written to, it is appended to the file '$HOME/nohup.out'; and if that cannot be written to, the command is not run.
But if I already have one command using nohup with output going to nohup.out and I want to run another nohup command, can I redirect the output to nohup2.out?
nohup some_command &> nohup2.out &
and voilà.
Older syntax for Bash version < 4:
nohup some_command > nohup2.out 2>&1 &
For some reason, the above answer did not work for me; with the trailing &, I did not return to the command prompt after running it as I expected. Instead, I simply tried:
nohup some_command > nohup2.out&
and it works just as I want it to. Leaving this here in case someone else is in the same situation. Running Bash 4.3.8 for reference.
The above methods will truncate your output file whenever you rerun the nohup command.
To append output to a user-defined file, you can use >> with nohup:
nohup your_command >> filename.out &
This command will append all output to your file without removing the old data.
Since file handles point to inodes (which are stored independently of file names) on Linux/Unix systems, you can rename the default nohup.out to any other filename at any time after starting nohup something &. So one could also do the following:
$ nohup something&
$ mv nohup.out nohup2.out
$ nohup something2&
Now something adds lines to nohup2.out and something2 to nohup.out.
My start.sh file:
#!/bin/bash
nohup forever -c php artisan your:command >>storage/logs/yourcommand.log 2>&1 &
There is only one important thing: the first command must be nohup and the second command must be forever; the -c parameter belongs to forever, while the 2>&1 & part is for nohup. After running this line you can log out of your terminal, log back in, and run forever restartall. Voilà... You can restart, and you can be sure that if the script halts, forever will restart it.
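As a usage sketch (assuming the forever CLI is installed, e.g. globally via npm):
$ bash start.sh        # start the job; nohup keeps it alive after logout
$ exit                 # log out of the terminal
$ forever restartall   # after logging back in: restart every script forever manages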
I <3 forever