stream stdout from other program without for loop - go

I have a program that compiles and runs another program and pipes its stdout back to itself for printing. Since that program doesn't terminate, I need to stream its stdout:
// boilerplate omitted
func stream(stdoutPipe io.ReadCloser) {
	buffer := make([]byte, 1000)
	for {
		n, err := stdoutPipe.Read(buffer)
		if n > 0 {
			os.Stdout.Write(buffer[:n])
		}
		if err == io.EOF {
			stdoutPipe.Close()
			break
		}
	}
}
func main() {
	command := exec.Command("go", "run", "my-program.go")
	stdoutPipe, _ := command.StdoutPipe()
	_ = command.Start()
	go stream(stdoutPipe)
	do_my_own_thing()
	command.Wait()
}
It works, but how do I do the same thing without writing the read loop myself? Is there a library function that does this?

You can give the exec.Cmd an io.Writer to use as stdout. The variable your own program uses for stdout (os.Stdout) is also an io.Writer.
command := exec.Command("go", "run", "my-program.go")
command.Stdout = os.Stdout
command.Start()
command.Wait()
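If you do still want the pipe (for example so your own code can sit between the child and your stdout), io.Copy is the library function that replaces the hand-written read loop. A minimal sketch of that variant, reusing the question's do_my_own_thing placeholder:
command := exec.Command("go", "run", "my-program.go")
stdoutPipe, err := command.StdoutPipe()
if err != nil {
	log.Fatal(err)
}
if err := command.Start(); err != nil {
	log.Fatal(err)
}
go func() {
	// io.Copy loops internally until the pipe hits EOF, so no manual for loop is needed
	io.Copy(os.Stdout, stdoutPipe)
}()
do_my_own_thing()
command.Wait()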

Related

How to redirect os.Stdout to io.MultiWriter() in Go?

I am writing a test function that exercises a Go program interacting with a command-line program, that is:
os.Stdout -> cmd.Stdin
cmd.Stdout -> os.Stdin
I could use pipes to connect that I/O, but I would also like a log of the data passing through the pipes.
I tried io.MultiWriter, but it is not an *os.File and cannot be assigned to os.Stdout.
I have found some samples that use a lot of pipes and io.Copy, but io.Copy is not interactive. How can I connect stdout to an io.MultiWriter through a pipe?
logfile, err := os.Create("stdout.log")
r, w, _ := os.Pipe()
mw := io.MultiWriter(os.Stdout, logfile, w)
cmd.Stdin = r
os.Stdout = mw // <- error in this line
The error message is like:
cannot use mw (variable of type io.Writer) as type *os.File in assignment:
As an alternative solution, you can mimic the MultiWriter by using a separate goroutine to read what has been written to stdout, capture it, and write it both to the file and to the original stdout.
package main

import (
	"bufio"
	"fmt"
	"os"
	"time"
)

func main() {
	originalStdout := os.Stdout // Backup of the original stdout
	r, w, _ := os.Pipe()
	os.Stdout = w

	// Use a separate goroutine so reading the pipe doesn't block
	go func() {
		f, _ := os.Create("stdout.log")
		defer f.Close()
		scanner := bufio.NewScanner(r)
		for scanner.Scan() {
			s := scanner.Text() + "\r\n"
			f.WriteString(s)
			originalStdout.WriteString(s)
		}
	}()

	// Test
	c := time.NewTicker(time.Second)
	for {
		select {
		case <-c.C:
			fmt.Println(time.Now())
			fmt.Fprintln(os.Stderr, "This is on Stderr")
		}
	}
}
I figured out a workaround with a pipe, TeeReader and MultiWriter.
The setup is a test function that tests the interaction of a Go program with a Python program through stdin and stdout:
main.stdout -> pipe -> TeeReader -> (client.stdin, MultiWriter(logfile, stdout))
client.stdout -> MultiWriter(pipe -> main.stdin, logfile, stdout)
I will try to add more explanation later.
func Test_Interactive(t *testing.T) {
	var tests = []struct {
		file string
	}{
		{"Test_Client.py"},
	}
	for tc, tt := range tests {
		fmt.Println("---------------------------------")
		fmt.Printf("Test %d, Test Client:%v\n", tc+1, tt.file)
		fmt.Println("---------------------------------")
		// Define external program
		client := exec.Command("python3", tt.file)
		// Define log file
		logfile, err := os.Create(tt.file + ".log")
		if err != nil {
			panic(err)
		}
		defer logfile.Close()
		out := os.Stdout
		defer func() { os.Stdout = out }() // Restore original Stdout
		in := os.Stdin
		defer func() { os.Stdin = in }() // Restore original Stdin
		// Create a pipe connecting os.Stdout to client.Stdin
		gr, gw, _ := os.Pipe()
		// Connect os.Stdout to the writer side of the pipe
		os.Stdout = gw
		// Create a MultiWriter that writes to the logfile and the original stdout at the same time
		gmw := io.MultiWriter(out, logfile)
		// Create a TeeReader that reads from the reader side of the pipe and copies to the MultiWriter;
		// replace client.Stdin with the TeeReader
		client.Stdin = io.TeeReader(gr, gmw)
		// Create a pipe to connect client.Stdout to os.Stdin
		cr, cw, _ := os.Pipe()
		// Create a MultiWriter for the client's stdout
		cmw := io.MultiWriter(cw, logfile, out)
		client.Stdout = cmw
		// Connect os.Stdin to the other end of the pipe
		os.Stdin = cr
		// Start client
		client.Start()
		// Start main
		main()
		// Check the test program's error
		if err := client.Process.Release(); err != nil {
			if exiterr, ok := err.(*exec.ExitError); ok {
				if status, ok := exiterr.Sys().(syscall.WaitStatus); ok {
					log.Printf("Exit Status: %d", status.ExitStatus())
					t.Errorf("Tester return error\n")
				}
			} else {
				log.Fatalf("cmd.Wait: %v", err)
				t.Errorf("Tester return error\n")
			}
		}
	}
}

Thread-safe operation with Stdout and Stderr (exec.Cmd)

I have code and it works correctly, but it isn't thread-safe (https://play.golang.org/p/8EY3i1Uk_aO); a race happens on these lines:
stdout := cmd.Stdout.(*bytes.Buffer).String()
stderr := cmd.Stderr.(*bytes.Buffer).String()
I rewrote it this way:
readout, _ := cmd.StdoutPipe()
readerr, _ := cmd.StderrPipe()
The link: https://play.golang.org/p/htbn2zXXeQk
I don't like that MultiReader is used here, because I cannot separate the stdout data from the stderr data:
r, _ := bufio.NewReader(io.MultiReader(readout, readerr)).ReadString('\n')
Also, the second example doesn't work (it is commented out in the code). I expected stdout not to be empty (like here: https://play.golang.org/p/8EY3i1Uk_aO).
How can I keep the logic of the first example but make it thread-safe?
You have to pump cmd.Stdout and cmd.Stderr in separate goroutines until they are closed, for example like you did with cmd.Stdin (but in the other direction, of course). Otherwise there's a risk of deadlock: the process is blocked waiting to write to stdout/stderr, and your program is blocked waiting for the process to finish.
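A minimal sketch of that pumping approach (the command name is a placeholder and error handling is trimmed):
cmd := exec.Command("some-command")
stdoutPipe, _ := cmd.StdoutPipe()
stderrPipe, _ := cmd.StderrPipe()
_ = cmd.Start()

var stdoutBuf, stderrBuf bytes.Buffer
var wg sync.WaitGroup
wg.Add(2)
go func() { defer wg.Done(); io.Copy(&stdoutBuf, stdoutPipe) }() // pump stdout
go func() { defer wg.Done(); io.Copy(&stderrBuf, stderrPipe) }() // pump stderr
wg.Wait() // finish reading both pipes before Wait closes them
_ = cmd.Wait()

stdout, stderr := stdoutBuf.String(), stderrBuf.String()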
Or, like @JimB said, just assign string buffers to cmd.Stdout and cmd.Stderr; they will be filled as the process runs:
func invoke(cmd *exec.Cmd) (stdout string, stderr string, err error) {
	stdoutbuf, stderrbuf := new(strings.Builder), new(strings.Builder)
	cmd.Stdout = stdoutbuf
	cmd.Stderr = stderrbuf
	err = cmd.Start()
	if err != nil {
		return
	}
	err = cmd.Wait()
	return stdoutbuf.String(), stderrbuf.String(), err
}
Live demo:
https://play.golang.org/p/hakSVNbqirB
I used the advice of @rusty; the race is still there, though. The racing line (runner.go:264) is:
append(normalizeEncoding(stdoutbuf.String()), normalizeEncoding(stderrbuf.String()), false)
Solution: create your own Writer wrapper:
type lockedWriter struct {
	sync.RWMutex
	buf []byte
	w   io.Writer
}

func (w *lockedWriter) Write(b []byte) (n int, err error) {
	w.Lock()
	defer w.Unlock()
	w.buf = append(w.buf, b...)
	return w.w.Write(b)
}

func (w *lockedWriter) String() string {
	w.RLock()
	defer w.RUnlock()
	return string(w.buf)
}
Usage:
stdoutbuf := &lockedWriter{w: new(strings.Builder)}
stderrbuf := &lockedWriter{w: new(strings.Builder)}
cmd.Stdout = stdoutbuf
cmd.Stderr = stderrbuf
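For completeness, a sketch of how the wrapper might slot into the earlier invoke function (same shape as above, only the buffer type changes):
func invoke(cmd *exec.Cmd) (stdout string, stderr string, err error) {
	stdoutbuf := &lockedWriter{w: new(strings.Builder)}
	stderrbuf := &lockedWriter{w: new(strings.Builder)}
	cmd.Stdout = stdoutbuf
	cmd.Stderr = stderrbuf
	if err = cmd.Start(); err != nil {
		return
	}
	err = cmd.Wait()
	// String() takes the read lock, so calls from other goroutines
	// no longer race with writes coming from the child process.
	return stdoutbuf.String(), stderrbuf.String(), err
}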

Golang os/exec flushing stdin without closing it

I would like to manage a process in Go with the os/exec package. I would like to start it and be able to read its output and write to its input several times.
The process I launch in the code below, menu.py, is just a Python script that echoes whatever it reads on its input.
func ReadOutput(rc io.ReadCloser) (string, error) {
	x, err := ioutil.ReadAll(rc)
	s := string(x)
	return s, err
}

func main() {
	cmd := exec.Command("python", "menu.py")
	stdout, err := cmd.StdoutPipe()
	Check(err)
	stdin, err := cmd.StdinPipe()
	Check(err)
	err = cmd.Start()
	Check(err)
	go func() {
		defer stdin.Close() // If I don't close the stdin pipe, the python code will never take what I write into it
		io.WriteString(stdin, "blub")
	}()
	s, err := ReadOutput(stdout)
	if err != nil {
		Log("Process is finished ..")
	}
	Log(s)
	// STDIN IS CLOSED, I CAN'T RETRY!
}
And the simple code of menu.py:
while 1 == 1:
    name = raw_input("")
    print "Hello, %s. \n" % name
The Go code works, but if I don't close the stdin pipe after writing to it, the Python code never takes what is in it. That is fine if I only want to send one thing to the input at a time, but what if I want to send something again a few seconds later? The pipe is closed! What should I do? I suppose the question could be "How do I flush a pipe from the WriteCloser interface?"
I think the primary problem here is that the Python process doesn't work the way you might expect. Here's a bash script, echo.sh, that does the same thing:
#!/bin/bash
while read INPUT
do echo "Hello, $INPUT."
done
Calling this script from a modified version of your code doesn't have the same issue with needing to close stdin:
func ReadOutput(output chan string, rc io.ReadCloser) {
	r := bufio.NewReader(rc)
	for {
		x, _ := r.ReadString('\n')
		output <- string(x)
	}
}

func main() {
	cmd := exec.Command("bash", "echo.sh")
	stdout, err := cmd.StdoutPipe()
	Check(err)
	stdin, err := cmd.StdinPipe()
	Check(err)
	err = cmd.Start()
	Check(err)
	go func() {
		io.WriteString(stdin, "blab\n")
		io.WriteString(stdin, "blob\n")
		io.WriteString(stdin, "booo\n")
	}()
	output := make(chan string)
	defer close(output)
	go ReadOutput(output, stdout)
	for o := range output {
		Log(o)
	}
}
The Go code needed a few minor changes: the ReadOutput function had to be modified so it doesn't block, since ioutil.ReadAll would have waited for an EOF before returning.
Digging a little deeper, it looks like the real problem is the behaviour of raw_input - it doesn't flush stdout as expected. You can pass the -u flag to python to get the desired behaviour:
cmd := exec.Command("python", "-u", "menu.py")
or update your python code to use sys.stdin.readline() instead of raw_input() (see this related bug report: https://bugs.python.org/issue526382).
Even though there is some problem with your Python script, the main problem is the Go pipe. A trick to solve it is to use two pipes, as follows:
// parentprocess.go
package main

import (
	"bufio"
	"io"
	"log"
	"os/exec"
)

func request(r *bufio.Reader, w io.Writer, str string) string {
	w.Write([]byte(str))
	w.Write([]byte("\n"))
	str, err := r.ReadString('\n')
	if err != nil {
		panic(err)
	}
	return str[:len(str)-1]
}

func main() {
	cmd := exec.Command("bash", "menu.sh")
	inr, inw := io.Pipe()
	outr, outw := io.Pipe()
	cmd.Stdin = inr
	cmd.Stdout = outw
	if err := cmd.Start(); err != nil {
		panic(err)
	}
	go cmd.Wait()
	reader := bufio.NewReader(outr)
	log.Printf(request(reader, inw, "Tom"))
	log.Printf(request(reader, inw, "Rose"))
}
The subprocess code has the same logic as your Python code:
#!/usr/bin/env bash
# menu.sh
while true; do
    read -r name
    echo "Hello, $name."
done
If you want to use your Python code, you should make some changes:
import sys

while 1 == 1:
    try:
        name = raw_input("")
        print "Hello, %s. \n" % name
        sys.stdout.flush()  # there's a stdout buffer
    except:
        pass  # make sure this process won't die when it comes across EOF
For reference, the documentation of StdinPipe explains why the pipe behaves this way:
// StdinPipe returns a pipe that will be connected to the command's
// standard input when the command starts.
// The pipe will be closed automatically after Wait sees the command exit.
// A caller need only call Close to force the pipe to close sooner.
// For example, if the command being run will not exit until standard input
// is closed, the caller must close the pipe.
func (c *Cmd) StdinPipe() (io.WriteCloser, error)

How to process stderr in go?

I have an app called "myapp". That app simply writes to stderr.
The important bit is that I want to capture what is written to stderr and process it in real time. How would I go about doing that?
I tried the code below:
cmd := exec.Command("myapp") // this app prints lines to stderr
stderr, err := cmd.StderrPipe()
if err != nil {
log.Fatal(err)
}
if err := cmd.Start(); err != nil {
log.Fatal(err)
}
if b, err := ioutil.ReadAll(stderr); err == nil {
log.Println(string(b))
}
if err := cmd.Wait(); err != nil {
log.Fatal(err)
}
The code doesn't print out anything. I suspect it's because ioutil.ReadAll() is not the proper func to call, since it waits for EOF. How else would I read from the stderr pipe?
You can replace the executed command with anything that outputs to stdout or stderr, like tail -f mylogfile. The point is, I want to process the lines as they are written.
StderrPipe returns a ReadCloser. You can use that to create a bufio.Scanner and then read lines one by one:
sc := bufio.NewScanner(stderr)
for sc.Scan() {
	fmt.Printf("Line: %s\n", sc.Text())
}
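Putting that into the question's code, the scanner loop would typically run in its own goroutine, and you finish reading before calling cmd.Wait (a sketch, reusing the hypothetical "myapp" command from the question):
cmd := exec.Command("myapp")
stderr, err := cmd.StderrPipe()
if err != nil {
	log.Fatal(err)
}
if err := cmd.Start(); err != nil {
	log.Fatal(err)
}
done := make(chan struct{})
go func() {
	defer close(done)
	sc := bufio.NewScanner(stderr)
	for sc.Scan() {
		// each line is available here as soon as the child writes it
		fmt.Printf("Line: %s\n", sc.Text())
	}
}()
<-done // drain stderr before Wait closes the pipe
if err := cmd.Wait(); err != nil {
	log.Fatal(err)
}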
Create a type that implements io.Writer and set that as the command's stderr writer.
type Processor struct{}

func (Processor) Write(b []byte) (int, error) {
	// intercept data here
	return os.Stdout.Write(b)
}

func main() {
	cmd := exec.Command("mycommand")
	cmd.Stderr = Processor{}
	_ = cmd.Run()
}
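Note that Write may be called with arbitrary chunks rather than whole lines, so if you need line-by-line processing you still have to buffer inside Write (or use the bufio.Scanner approach above).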

Communicating with program process using pipes

I want to be able to fully communicate with some programs after spawning them from a Go program. What I already have is spawning a process and talking through pipes based on the last line read from stdout:
package main

import (
	"fmt"
	"io"
	"log"
	"os/exec"
	"strings"
)

var stdinPipe io.WriteCloser
var stdoutPipe io.ReadCloser
var err error

func main() {
	cmd := &exec.Cmd{
		Path: "/Users/seba/Projects/go/src/bootstrap/in",
		Args: []string{"program"},
	}
	stdinPipe, err = cmd.StdinPipe()
	if err != nil {
		log.Fatal(err)
	}
	stdoutPipe, err = cmd.StdoutPipe()
	if err != nil {
		log.Fatal(err)
	}
	err = cmd.Start()
	if err != nil {
		log.Fatal(err)
	}
	var stdoutLines []string
	go stdoutManage(stdoutLines, stdoutController)
	cmd.Wait()
}

// TODO: improve as in io.Copy
func stdoutManage(lines []string, manager func(string)) {
	buf := make([]byte, 32*1024)
	for {
		nr, err := stdoutPipe.Read(buf)
		if nr > 0 {
			thelines := strings.Split(string(buf[:nr]), "\n")
			for _, l := range thelines {
				manager(l)
				lines = append(lines, l)
			}
		}
		if err != nil {
			break
		}
	}
}
However, this approach has problems with programs that clear the terminal output and with programs that somehow buffer their stdin or don't use stdin at all (I don't know if that's possible).
So the question: is there a portable way of talking to programs (it can be a non-Go solution)?
Problems like this are usually down to the C library, which changes its default buffering mode depending on exactly what stdin/stdout/stderr are.
If stdout is a terminal, buffering is automatically set to line-buffered; otherwise it is fully buffered.
This is relevant to you because when you run the programs through a pipe, they aren't connected to a terminal, so their output is fully buffered, which messes up this sort of interactive use.
To fix it, you need to use a pseudo-tty, which pretends to be a terminal but acts just like a pipe. Here is a library implementing the pty interface; I haven't actually tried it, but it looks like it does the right thing!
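For example, a minimal sketch using the github.com/creack/pty package (my assumption for a pty library; not necessarily the one the answer links to):
package main

import (
	"io"
	"os"
	"os/exec"

	"github.com/creack/pty" // assumed third-party pty package
)

func main() {
	cmd := exec.Command("program") // placeholder command
	// pty.Start runs the command with stdin/stdout/stderr attached to a
	// pseudo-terminal, so the C library keeps line buffering on the child side.
	f, err := pty.Start(cmd)
	if err != nil {
		panic(err)
	}
	defer f.Close()
	// f is both ends: write to it to send input, read from it to get output.
	io.Copy(os.Stdout, f)
	cmd.Wait()
}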
