Why can't I redirect stdout of a python script to a file - bash

I am starting my service running on a Raspberry Pi 2 (Raspbian) using a command in rc.local which looks like this:
python3.4 /home/pi/SwitchService/ServiceStart.py >/home/pi/SwitchService/log &
python3.4 /home/pi/test.py >/home/pi/log2 &
For some reason I don't see any text in the log file of my service although the script prints to stdout.
The two scripts look like this:
test.py
print("Test")
ServiceStart.py
from Server import Server
print("Test")
if __name__ == "__main__":
server = Server()
Because I couldn't get the bash solution to work, I tried this other solution to see whether it works for me. It behaves exactly the same as the bash-based method: my service writes nothing to the log file, although the empty file is created.

First, make sure that your script is actually running. Many schedulers and startup routines don't have PATH set, so it may not be finding python3.4. Try modifying the command to include the full path (e.g. /full/path/python3.4).
Secondly, it's not recommended to run long-running scripts in rc.local without putting them in the background (the documentation even states this). The Raspberry Pi waits for the commands to finish before continuing to boot, so if a command runs forever, your Raspberry Pi may never finish booting.
Lastly, assuming the previous two issues have been taken care of, make sure that your program isn't buffering output too aggressively. You can try flushing stdout to see if that helps.
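The buffering point is easy to demonstrate: when stdout is redirected to a file it is block-buffered rather than line-buffered, so a short-lived or backgrounded script can exit before anything reaches the disk. A minimal sketch of the effect and the fix (`print(..., flush=True)`, or running the script with `python3 -u`); the temp file stands in for the redirected log:

```python
import os
import tempfile

# Writing to a file-backed stream is block-buffered: the text sits in a
# buffer (typically 8 KiB) until it fills, the stream is flushed, or the
# file is closed.
fd, path = tempfile.mkstemp()
os.close(fd)
with open(path, "w") as f:
    print("Test", file=f)                  # buffered, not yet on disk
    size_before_flush = os.path.getsize(path)
    f.flush()                              # what flush=True does per call
    size_after_flush = os.path.getsize(path)

os.remove(path)
```

Running the whole interpreter with `python3 -u` (or `PYTHONUNBUFFERED=1`) achieves the same without touching the script.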


Getting jupyter output in Google Colab

We need to see the output (stdout) that is generated by the Jupyter notebook in Google Colab. Doing some investigation, it seems that the output is piped all the way up to the main process:
root 1 0 0 Jun27 ? 00:00:00 /bin/bash -e /datalab/run.sh
The pipeline that the output runs through seems to be as follows:
/usr/bin/python2 /usr/local/bin/jupyter-notebook .....
/tools/node/bin/node /datalab/web/app.js
node /tools/node/bin/forever ..... /datalab/web/app.js
/bin/bash -e /datalab/run.sh
Any ideas on how I could access it?
I just discovered that forever doesn't forward the output from app.js. forever list suggests that the output is going to /content/.forever/BQBW.log, which doesn't exist. I still don't understand why, nor whether this is really where stdout ends up.
Use wurlitzer. Here's a full example:
https://colab.research.google.com/drive/1jpAOdWJDCh_YzmqidGnlYHHCFODNKQkB
This notebook:
Saves a C file that prints to stdout.
Compiles it as a shared library.
Loads the shared library into the running Python backend.
Uses wurlitzer to capture the output when invoking the library.
(I realize you're unblocked, but leaving this answer here to hopefully help future travelers)
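For the curious, wurlitzer's core trick is file-descriptor redirection: C-level code writes to fd 1/2, which Python's sys.stdout never sees, so the descriptors themselves are swapped for pipes. A stdlib-only sketch of the same idea, capturing a C `puts()` via ctypes (libc loading as shown works on Linux/macOS; wurlitzer itself additionally drains the pipes from a background thread so large outputs don't block):

```python
import ctypes
import os

libc = ctypes.CDLL(None)  # the process's own libc (Linux/macOS)

def capture_c_stdout(func, bufsize=65536):
    """Run func() with fd 1 pointing at a pipe; return the captured bytes.
    Sketch only: output larger than the pipe buffer would block."""
    read_end, write_end = os.pipe()
    saved_fd = os.dup(1)          # remember the real stdout
    os.dup2(write_end, 1)         # C-level writes now go into the pipe
    try:
        func()
        libc.fflush(None)         # push out anything in C stdio buffers
    finally:
        os.dup2(saved_fd, 1)      # restore the real stdout
        os.close(saved_fd)
        os.close(write_end)
    data = os.read(read_end, bufsize)
    os.close(read_end)
    return data

captured = capture_c_stdout(lambda: libc.puts(b"hello from C"))
```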
Wurlitzer uses a thread to flush its pipes, and I guess your C++ code was crashing before its pipe was flushed.
https://colab.research.google.com/drive/1i6x882Dn6E5PwaptVQ4ADGyEvBZAHm7i shows an example where TF C++ code emits dev placement to stderr and then is killed before execution completes. Flushing quickly results in all the output showing up before the kernel is killed, but leaving it at the default (0.2s) results in partial or no output showing up.
If the output you want is an assert/FATAL message right before process death, wurlitzer's approach is unlikely to work for you, and running as a subprocess is likely to be a faster iteration path: write the code you'd have in the cell out to a file (e.g. using %%writefile) and then run a subprocess python like:
!python3 file.py
Any stdout/stderr the subprocess emits (whether from python code writing to sys.std{out,err} or C++ code writing to fd={1,2}) should show up in the cell's output.
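A sketch of that pattern using the subprocess module directly (an inline `-c` script stands in for the file written by %%writefile):

```python
import subprocess
import sys

# Run code in a child interpreter; everything it writes to stdout/stderr,
# whether from Python or from native code, comes back through the pipes.
result = subprocess.run(
    [sys.executable, "-c",
     "import sys; print('from python'); sys.stderr.write('from stderr\\n')"],
    capture_output=True,
    text=True,
)
```

In a notebook you would typically omit capture_output so the child inherits the cell's streams and the output appears live.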
Is the output you're looking for coming from your code or from Jupyter itself?
If it's Jupyter, it takes a little work to enable logging -- here's a full example: https://colab.research.google.com/drive/1q2mhsj4bwwdQK-KZIxrIIKed8O11MQl0
I ended up writing a C++ wrapper for cout and py::print that can be used to enable or disable Python printing. Pretty disgusting, given that I needed to change my entire C++ source to use the wrapper instead of std::cout.

Ruby run external program stops script

I have a ruby script that midway through I need it to run another program.
After running the program, the rest of the script doesn't get run. For example:
# some ruby that gets run
exe = "Something.exe"
system(exe)
# some ruby that doesn't run
I have also tried using Open3.popen2e(cmd) and Open3.popen3(cmd), but it's the same.
Can anyone help me understand what is happening here and how to fix it?
note: I'm using windows
Try to run Something.exe in a new Thread:
Thread.new { system("Something.exe") }
In case you want to run your Something.exe asynchronously and continue without waiting for it to finish, you can use spawn or multithreading.
pid = spawn('Something.exe')
Process.detach(pid)
According to this previous answer, this should work on Windows as well (while fork or other methods don't).
In this article you can find several examples using system, exec, fork, spawn and Thread on Unix.
I cannot reproduce it, but it could be worth seeing whether system("start Something.exe") works on Windows like system("cmd &") works on UNIX. You can refer to the start documentation here.

Init infinite loop on bootup (shell/Openwrt)

I've been trying to generate an infinite loop in OpenWrt, and I've succeeded:
#!/bin/sh /etc/rc.common
while true
do
# Code to run
sleep 15
done
This code works like a charm if I execute it as ./script. However, I want it to start on its own when I turn on my router. I've placed the script in /etc/init.d and made it executable with chmod +x script.
Regardless, the program doesn't start running at all. My guess is that I shouldn't execute this script on boot up but have a script that calls this other script. I haven't been able to work this out.
Any help would be appreciated.
As I have messed with OpenWrt init scripts in previous projects, I would like to contribute to Rich Alloway's answer (for the ones who will likely drop here from a Google search). His answer only covers "traditional SysV style init scripts", as mentioned in the Init Scripts page he linked.
There is a new process management daemon, procd, that you might find in your OpenWrt version. Sadly, its documentation has not been completed yet; see Procd Init Scripts.
There are minor differences, as pointed out in their documentation:
procd expects services to run in the foreground,
a different shebang line: #!/bin/sh /etc/rc.common,
explicitly opt in to procd with USE_PROCD=1,
start_service() instead of start().
A simple init script for procd would look like:
#!/bin/sh /etc/rc.common
# START is the run order of your script; make it high so it runs after the other init scripts
START=100
USE_PROCD=1
start_service() {
    procd_open_instance
    procd_set_param command /path/to/your/command -with -arguments
    procd_close_instance
}
I posted a blog post about it a while ago that might help.
You need to have a file in /etc/rc.d/ with an Sxx prefix in order for the system to execute the script at boot time. This is usually accomplished by having the script in /etc/init.d and a symlink in /etc/rc.d pointing to the script.
The S indicates that the script should run at startup while the xx dictates the order that the script will run. Scripts are executed in naturally increasing order: S10boot runs before S40network and S50cron runs before S50dropbear.
Keep in mind that the system may not continue to boot with the script that you have shown here!
/etc/init.d/rcS calls each script sequentially and waits for the current one to exit before calling the next script. Since your script is an infinite loop, it will never exit and rcS may not complete the boot process.
Including /etc/rc.common will be more useful if you use functions in your script like start(), stop(), restart(), etc and add START and STOP variables which describe when the script should be executed during boot/shutdown.
Your script can then create or remove the symlink itself, enabling and disabling itself at boot time: /etc/init.d/myscript enable and /etc/init.d/myscript disable
See also OpenWRT Boot Process and Init Scripts
-Rich Alloway (RogueWave)

GPS python 2 program crashes bash file execution, Raspberry Pi

I have recently encountered a problem with executing python scripts on the Raspberry Pi using bash files. I have two programs; the first need not be mentioned, as there are no problems running it. The second, however, reads GPS data from my USB GPS receiver, and once it has been executed from a terminal-based bash file, it ends the infinite-loop bash file. It is in python 2, and the first program is in python 3. I cannot combine them into one program, as the GPS program is not supported in python 3 due to modules not being updated. The code for the GPS program can be found here.
The only difference my code has is that I use a while loop to make it execute 10 times instead of infinitely. I want to prevent this program ending the bash file prematurely so it runs infinitely. I've tried running the bash file from another bash file so if a crash happens the script would be restarted but it seems that the terminal no longer takes any inputs after the GPS program has finished executing. Any help on this would be appreciated greatly!
My bash file looks like this:
#!/bin/sh
while true; do
    echo "Running programs"
    sh ./programtimer.sh
    sleep 20
done
programtimer.sh simply runs the two python programs, the GPS program being run second.
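One way around the shell loop being taken down is to move the restart logic into a Python 3 supervisor that runs the GPS program as a subprocess and restarts it when it dies. A hedged sketch of the idea; the `sys.executable` one-liner below stands in for the python 2 GPS script, and the run count is bounded here only so the example terminates:

```python
import subprocess
import sys
import time

def supervise(cmd, max_runs=3, delay=0):
    """Run cmd repeatedly, collecting exit codes, so a crash in the child
    cannot take the supervising process down with it."""
    codes = []
    for _ in range(max_runs):
        codes.append(subprocess.run(cmd).returncode)
        time.sleep(delay)
    return codes

# A stand-in child that always "crashes" with exit code 1:
codes = supervise([sys.executable, "-c", "import sys; sys.exit(1)"],
                  max_runs=2)
```

In the real setup you would pass `["python2", "/path/to/gps_program.py"]` as cmd, use `max_runs` of your choosing (or loop forever), and keep the 20-second delay from the bash file.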

Redirecting cmd stdout and stderr back to parent

If I create a process from a cmd prompt using the start command (opening a new cmd) is it possible to redirect the stdout and stderr from that process back to the calling cmd?
If you want the output of the STARTed process to appear in the parent command console, then simply use the START /B option.
If you want to process the output of your command, then you should use FOR /F ... in ('someCommand') DO ... instead.
OK. I have yet to find a straightforward answer to this question. I didn't want to bog down my question with what I thought unnecessary detail but seeing as I'm being criticized for the lack of this I'll expand a bit here.
I want to automate the updating of FWs on our production line, so I have a python app that gets the FWs from FTP and then uses the processor's flash tool, via a python subprocess command, to upload them to the board's flash. This works for all but one of the tools.
The one tool seems to have a problem when it's not running in its own terminal, so I prepend start to the subprocess command string, which allows it to run OK in its own terminal. However, I need the output from this other terminal for logging reasons.
A possible solution was to log stdout and stderr to a file using >> or wintee and then poll this file via a thread in my original app (perhaps a rather convoluted solution to the question). However, the tool also has a separate problem where it doesn't like any std redirection, so this doesn't work for me.
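For tools that do tolerate a pipe on stdout, the logging half of this can be done from Python without start at all, by streaming the child's output line by line as it appears. A sketch, with an inline script standing in for the flash tool:

```python
import subprocess
import sys

# Stand-in for the flash tool's command line:
cmd = [sys.executable, "-c", "print('flashing...'); print('done')"]

lines = []
with subprocess.Popen(cmd, stdout=subprocess.PIPE, text=True) as proc:
    for line in proc.stdout:       # yields each line as the child emits it
        lines.append(line.rstrip())
```

Each captured line can then be handed straight to the logger while the tool is still running.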