How to handle stdin/stdout with dlv - go

I am using Delve to debug and am having issues figuring out the best way to handle stdin/stdout.
The first problem is that I cannot read from the console. I have a function that takes user input from the console:
func readConsole() string {
    reader := bufio.NewReader(os.Stdin)
    entry, err := reader.ReadString('\n')
    if err != nil {
        tlog.Fatal(fmt.Errorf("readConsole(): Error reading console input. %v", err))
    }
    entry = strings.Replace(entry, "\n", "", -1)
    return entry
}
The following "bad file descriptor" error is returned by ReadString():
F0208 21:03:56.574021 429026 configurator.go:81] readConsole(): Error reading console input. read /dev/stdin: bad file descriptor
The second problem is that fmt.Printf() works when I just run the app, but when I am stepping through the source code fmt.Printf() does not display anything.
I understand that dlv is competing with my program for console input and output, but I am not sure how to manage the competing requirements.
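One common workaround (my suggestion; it is not part of the original question) is to start Delve headless in one terminal, so the debugged program keeps that terminal's stdin/stdout, and drive the debugger from a second terminal:

dlv debug --headless --listen=127.0.0.1:2345

then, from another terminal:

dlv connect 127.0.0.1:2345

The program's console reads and writes then happen in the first terminal, while stepping and breakpoints happen in the second.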

Related

How to close or clean a stdout pipe?

I have a program which makes an ssh connection to a new (every time the program is executed) gcp instance to retrieve information. The problem is that sometimes I got this error and I don't know why:
2019/08/22 12:30:37 ssh: Stdout already set
My code (error handling omitted):
results := "/home/example.txt"
client, err := ssh.Dial("tcp", addrIP+":22", clientConfig)
session, err := client.NewSession()
defer session.Close()
data, err := session.Output("cat " + results)
if err != nil {
    log.Print("Fails when new output")
    log.Fatal(err)
}
The error occurs during the Output call.
Calling session.Output sets the Stdout of the session to a buffer, runs the provided command, and returns the contents of the buffer.
If the Stdout of the session is already set (for example, if you call session.Output multiple times), a "Stdout already set" error is returned.
Also note that a Session accepts only one call to Run, Start, Shell, Output or CombinedOutput, so running several commands means creating a new session per command. If you want to keep control of the output yourself, set Stdout to a buffer you maintain and use the session.Run() method instead of session.Output, as sketched below.
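A minimal sketch along those lines, assuming the client from the question (the runCommand helper is my own, for illustration): each command gets a fresh session with a caller-maintained buffer.

package sshcmd

import (
    "bytes"

    "golang.org/x/crypto/ssh"
)

// runCommand executes a single command in its own session and returns
// whatever the command wrote to standard output. Each command needs a
// fresh session because a Session accepts only one Run/Output call.
func runCommand(client *ssh.Client, cmd string) ([]byte, error) {
    session, err := client.NewSession()
    if err != nil {
        return nil, err
    }
    defer session.Close()

    var buf bytes.Buffer
    session.Stdout = &buf // set Stdout ourselves instead of letting Output do it
    if err := session.Run(cmd); err != nil {
        return nil, err
    }
    return buf.Bytes(), nil
}

With the client from the question, data, err := runCommand(client, "cat "+results) can then be called once per command.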

How to log fmt.Printf in custom file

I have this function which I use to log:
func formattedLog(prefix, m string, color int) {
    fmt.Printf("\033[%dm%s", color, DateTimeFormat)
    fmt.Printf("▶ %s: %s\033[%dm\n", prefix, m, int(Black))
}
I want to save my log output in some file:
f, err := os.OpenFile("../../../go-logs.txt", os.O_WRONLY|os.O_CREATE|os.O_APPEND, 0666)
if err != nil {
    log.Fatal("error opening logs file", err)
}
defer f.Close()

// set output of logs to f
log.SetOutput(f)
log.Println("This is a test log entry") // <==== this logs to the file
but when I call my function, which uses fmt.Printf, nothing is written to go-logs.txt:
formattedErr("ERR", msg, err.Error(), int(Red))
Is there any way to set the output for fmt.Printf as well?
fmt.Printf() documents that it writes to the standard output:
Printf formats according to a format specifier and writes to standard output.
So there is no fmt.SetOutput() to redirect that to your file.
But note that the standard output is a variable in the os package:
Stdin, Stdout, and Stderr are open Files pointing to the standard input, standard output, and standard error file descriptors.
Note that the Go runtime writes to standard error for panics and crashes; closing Stderr may cause those messages to go elsewhere, perhaps to a file opened later.
var (
    Stdin  = NewFile(uintptr(syscall.Stdin), "/dev/stdin")
    Stdout = NewFile(uintptr(syscall.Stdout), "/dev/stdout")
    Stderr = NewFile(uintptr(syscall.Stderr), "/dev/stderr")
)
And you are allowed to assign your own *os.File to os.Stdout. It is not a good idea, however, to use the same os.File both as a logger's output and as os.Stdout: access to its File.Write() method would not be synchronized between the fmt package and the logger.
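For completeness, a short sketch of the redirection itself (keeping the caveat above in mind; the file name is just an example):

package main

import (
    "fmt"
    "log"
    "os"
)

func main() {
    f, err := os.Create("go-logs.txt")
    if err != nil {
        log.Fatal(err)
    }
    defer f.Close()

    os.Stdout = f // fmt.Printf and friends now write to the file
    fmt.Println("this line ends up in go-logs.txt")
}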
Best would be to use a log.Logger everywhere, with its output set properly, so log messages are serialized.
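A minimal runnable sketch of that approach, rewriting the question's formattedLog to write through a log.Logger (the ANSI color values stand in for the question's Red and Black constants):

package main

import (
    "log"
    "os"
)

// ANSI color codes standing in for the question's Red/Black constants.
const (
    red   = 31
    black = 30
)

// formattedLog writes through a log.Logger instead of fmt.Printf, so the
// output goes wherever the logger points and writes are serialized.
func formattedLog(l *log.Logger, prefix, m string, color int) {
    l.Printf("\033[%dm▶ %s: %s\033[%dm", color, prefix, m, black)
}

func main() {
    f, err := os.OpenFile("go-logs.txt", os.O_WRONLY|os.O_CREATE|os.O_APPEND, 0666)
    if err != nil {
        log.Fatal("error opening logs file: ", err)
    }
    defer f.Close()

    formattedLog(log.New(f, "", log.LstdFlags), "ERR", "something went wrong", red)
}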

Error when trying to parse trace using go tool trace

I was following the tutorial from JustForFunc episode 22.
I added these two lines at the start of main() in main.go:

trace.Start(os.Stdout)
defer trace.Stop()

I built the binary using go build -o appName,
timed it with time ./appName > m.trace,
and finally tried to open the trace with go tool trace m.trace,
but got the following error:
2017/11/10 19:15:38 Parsing trace...
failed to parse trace: unknown event type 50 at offset 0x16
A little more background on my code (Go 1.9, Linux): it is a server for GET requests built with gin-gonic. I added an extra line of code, time.AfterFunc(20*time.Second, func() { closeServer() }), to close my server after 20 seconds so I could make a few requests to it and then stop the server, exiting the program.
I found a solution to my problem.
I followed this tutorial https://making.pusher.com/go-tool-trace/.
I added this code to main:

f, err := os.Create("trace.out")
if err != nil {
    panic(err)
}
defer f.Close()

err = trace.Start(f)
if err != nil {
    panic(err)
}
defer trace.Stop()

// Your program here
And it seems to be working fine. I still don't know for certain what caused the original problem, but a likely explanation is that other output written to standard output (gin-gonic logs there by default) got interleaved with the trace data when both were redirected into m.trace, corrupting the trace. Writing the trace to its own file avoids that.

How to resume reading after EOF in named pipe

I'm writing a program which opens a named pipe for reading, and then processes any lines written to this pipe:
err = syscall.Mkfifo("/tmp/myfifo", 0666)
if err != nil {
    panic(err)
}

pipe, err := os.OpenFile("/tmp/myfifo", os.O_RDONLY, os.ModeNamedPipe)
if err != nil {
    panic(err)
}

reader := bufio.NewReader(pipe)
scanner := bufio.NewScanner(reader)
for scanner.Scan() {
    line := scanner.Text()
    process(line)
}
This works fine as long as the writing process does not restart or otherwise send an EOF. When that happens, the loop terminates (as expected per the Scanner documentation).
However, I want to keep the pipe open to accept further writes. I could just reinitialize the scanner of course, but I believe this would create a race condition where the scanner might not be ready while a new process has begun writing to the pipe.
Are there any other options? Do I need to work directly with the File type instead?
From the bufio GoDoc:
Scan ... returns false when the scan stops, either by reaching the end of the input or an error.
So you could possibly leave the file open and read until EOF, then trigger scanner.Scan() again when the file has changed or at a regular interval (e.g. in a goroutine), and make sure the pipe variable doesn't go out of scope so you can reference it again.
If I understand your concern about a race condition correctly, it wouldn't be an issue here (unless write and read operations must be synchronized). Note also that re-initializing the scanner does not re-read old data: a named pipe has no seekable position, so a fresh Scanner simply continues with whatever is written next.
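A runnable sketch of that approach, assembled from the question's code (the 100 ms retry interval and the stand-in process function are my own choices): after each EOF a fresh Scanner is started on the still-open pipe, picking up whatever the next writer sends.

package main

import (
    "bufio"
    "fmt"
    "os"
    "syscall"
    "time"
)

// process stands in for the question's line handler.
func process(line string) {
    fmt.Println("got:", line)
}

func main() {
    if err := syscall.Mkfifo("/tmp/myfifo", 0666); err != nil && !os.IsExist(err) {
        panic(err)
    }
    pipe, err := os.OpenFile("/tmp/myfifo", os.O_RDONLY, os.ModeNamedPipe)
    if err != nil {
        panic(err)
    }
    defer pipe.Close()

    for {
        scanner := bufio.NewScanner(pipe)
        for scanner.Scan() {
            process(scanner.Text())
        }
        if err := scanner.Err(); err != nil {
            panic(err)
        }
        // EOF: the writer closed its end. Wait briefly, then scan again;
        // the pipe stays open so a new writer can attach at any time.
        time.Sleep(100 * time.Millisecond)
    }
}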

Streaming commands output progress

I'm writing a service that has to stream the output of an executed command both to the parent and to a log. With a long-running process the problem is that cmd.StdoutPipe gives me the final (string) result instead of ongoing output.
Is it possible to get partial output of what is going on, like in a shell?
func main() {
    cmd := exec.Command("sh", "-c", "some long running task")
    stdout, _ := cmd.StdoutPipe()
    cmd.Start()

    scanner := bufio.NewScanner(stdout)
    for scanner.Scan() {
        m := scanner.Text()
        fmt.Println(m)
        log.Printf(m)
    }
    cmd.Wait()
}
P.S. Just to pass the output through, this would do:

cmd.Stdout = os.Stdout

But in my case it is not enough.
The code you posted works (with a reasonable command executed).
Here is a simple "some long running task" written in Go for you to call and test your code:
package main

import (
    "fmt"
    "time"
)

func main() {
    fmt.Println("Child started.")
    time.Sleep(time.Second * 2)
    fmt.Println("Tick...")
    time.Sleep(time.Second * 2)
    fmt.Println("Child ended.")
}
Compile it and call it as your command. You will see the different lines appear immediately as written by the child process, "streamed".
Reasons why it may not work for you
The Scanner returned by bufio.NewScanner() reads whole lines and only returns something if a newline character is encountered (as defined by the bufio.ScanLines() function).
If the command you execute doesn't print newline characters, its output won't be returned immediately (only when a newline character is printed, the internal buffer fills up, or the process ends).
Possible workarounds
If you have no guarantee that the child process prints newline characters but you still want to stream the output, you can't read whole lines. One solution is to read by words, or even read by characters (runes). You can achieve this by setting a different split function using the Scanner.Split() method:
scanner := bufio.NewScanner(stdout)
scanner.Split(bufio.ScanRunes)
The bufio.ScanRunes function reads the input by runes so Scanner.Scan() will return whenever a new rune is available.
Or reading manually without a Scanner (in this example byte-by-byte):
oneByte := make([]byte, 1)
for {
    _, err := stdout.Read(oneByte)
    if err != nil {
        break
    }
    fmt.Printf("%c", oneByte[0])
}
Note that the above code would read runes that span multiple bytes in UTF-8 incorrectly. To read multi-byte UTF-8 runes, we need a bigger buffer:
oneRune := make([]byte, utf8.UTFMax)
for {
    count, err := stdout.Read(oneRune)
    if err != nil {
        break
    }
    fmt.Printf("%s", oneRune[:count])
}
Things to keep in mind
Processes have default buffers for standard output and for standard error (usually the size of a few KB). If a process writes to the standard output or standard error, it goes into the respective buffer. If this buffer gets full, further writes will block (in the child process). If you don't read the standard output and standard error of a child process, your child process may hang if the buffer is full.
So it is recommended to always read both the standard output and standard error of a child process. Even if you know the command doesn't normally write to its standard error, if some error occurs it will probably start dumping error messages there.
Edit: As Dave C mentions, by default the standard output and standard error streams of the child process are discarded and will not cause a block/hang if not read. Still, by not reading the error stream you might miss a thing or two from the process.
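For illustration, a sketch (mine, not from the answer) that drains standard error concurrently while streaming standard output, and only calls Wait once both pipes have been fully read:

package main

import (
    "bufio"
    "fmt"
    "log"
    "os/exec"
)

func main() {
    cmd := exec.Command("sh", "-c", "echo out; echo err 1>&2")
    stdout, _ := cmd.StdoutPipe()
    stderr, _ := cmd.StderrPipe()
    if err := cmd.Start(); err != nil {
        log.Fatal(err)
    }

    // Drain stderr concurrently so error output is neither missed nor blocking.
    done := make(chan struct{})
    go func() {
        defer close(done)
        sc := bufio.NewScanner(stderr)
        for sc.Scan() {
            log.Printf("stderr: %s", sc.Text())
        }
    }()

    sc := bufio.NewScanner(stdout)
    for sc.Scan() {
        fmt.Println(sc.Text())
    }

    <-done // all pipe reads must complete before Wait
    cmd.Wait()
}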
I found good examples of how to implement streaming progress output in this article by Krzysztof Kowalczyk.
