Go stdin/stdout communication

I want to execute a Go binary, which I will specify in a YAML config file, and send it a struct as bytes. How could I do this?
I thought that I could use stdin and stdout for this, but I can't figure it out.
Yaml config:
subscribers:
  temp:
    topics:
      - pi/+/temp
    action: ./temp/tempBinary
This is my code:
client.Subscribe(NewTopic(a), func(c *Client, msg Message) {
    cmd := exec.Command(v.Action)
    // I actually want to send [msg] to it so it can be used there
    cmd.Stdin = bytes.NewReader(msg.Bytes())
    if err := cmd.Start(); err != nil {
        c.Logger.Infof("Error while executing action: %v", err)
    } else {
        c.Logger.Info("Executed command")
    }
    // I want to handle responses from the called binary
    var out bytes.Buffer
    cmd.Stdout = &out
    c.Logger.Infof("Response: %v", out)
})
I can't figure out how exactly I could do this.

There are good examples of what you need at https://golang.org/pkg/os/exec/#example_Cmd_StdinPipe, https://golang.org/pkg/os/exec/#example_Cmd_StdoutPipe and https://golang.org/pkg/io/ioutil/#example_ReadAll
A decent start would be something like:
stdin, err := cmd.StdinPipe()
if err != nil {
    log.Fatal(err)
}
stdout, err := cmd.StdoutPipe()
if err != nil {
    log.Fatal(err)
}
// remember to cmd.Start() before writing/reading, and cmd.Wait() after ReadAll
io.WriteString(stdin, msg.String())
stdin.Close() // close stdin so the child sees EOF; a deferred close would run too late to unblock ReadAll
b, err := ioutil.ReadAll(stdout)
if err != nil {
    log.Fatal(err)
}
c.Logger.Infof("Response %s", b)
But this solution doesn't even begin to handle edge cases such as pipes being closed early etc.
This video does a good job of talking through stuff like this:
https://www.youtube.com/watch?v=LHZ2CAZE6Gs&feature=youtu.be&list=PL6
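Putting the pieces above together, a minimal self-contained sketch might look like the following. It uses cat as a stand-in for the configured ./temp/tempBinary and io.ReadAll (Go 1.16+) in place of ioutil.ReadAll; the function name runWithStdin is made up for illustration:

```go
package main

import (
	"fmt"
	"io"
	"log"
	"os/exec"
)

// runWithStdin starts the command, writes input to its stdin, closes the
// pipe so the child sees EOF, and returns everything written to stdout.
func runWithStdin(name string, input []byte, args ...string) ([]byte, error) {
	cmd := exec.Command(name, args...)
	stdin, err := cmd.StdinPipe()
	if err != nil {
		return nil, err
	}
	stdout, err := cmd.StdoutPipe()
	if err != nil {
		return nil, err
	}
	if err := cmd.Start(); err != nil {
		return nil, err
	}
	if _, err := stdin.Write(input); err != nil {
		return nil, err
	}
	// Without this Close the child never sees EOF and ReadAll can block forever.
	stdin.Close()
	out, err := io.ReadAll(stdout)
	if err != nil {
		return nil, err
	}
	if err := cmd.Wait(); err != nil {
		return nil, err
	}
	return out, nil
}

func main() {
	// In the subscriber callback this would be runWithStdin(v.Action, msg.Bytes()).
	out, err := runWithStdin("cat", []byte("hello"))
	if err != nil {
		log.Fatal(err)
	}
	fmt.Printf("Response: %s\n", out)
}
```

On the binary's side you would read os.Stdin with io.ReadAll and decode the struct bytes, e.g. with encoding/gob or encoding/json.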

Related

Reading os.OpenFile in Golang while still being written?

I have code that is writing to a logfile while executing a system command. E.g.
logfile, err := os.OpenFile(THIS_LOG_FILE, os.O_APPEND|os.O_WRONLY|os.O_CREATE, 0600)
if err != nil {
    return err
}
cmd.Stderr = logfile
cmd.Stdout = logfile
go func() {
    err := cmd.Run()
    if err != nil {
        // WANT TO LOG ERROR HERE
    }
}()
At the "// WANT TO LOG" line, I'd like to output the content to the standard logger, in addition to the previously assigned logfile destination. Is there a way to capture this in memory? Or should I just write everything to an in-memory buffer and flush at the end?
To clarify: by capturing the output of the command in memory, I can parse it and take action in the running program (handling errors, etc.). When I only write it to the log file, that information is lost.
My issue is that, theoretically, I could read that back in from the file I just wrote, but that seems wasteful (and prone to failure if the command failed).
If I understand correctly, you want to write the content of stdout/stderr to a file while executing a shell command, and also keep it in memory.
Since the pipes returned for stdout and stderr implement the io.ReadCloser interface, you can merge them with io.MultiReader and perform an io.Copy from source to destination.
The following snippet implements the pipeline:
package main

import (
    "bytes"
    "io"
    "log"
    "os"
    "os/exec"
)

func main() {
    // prepare the command
    cmd := exec.Command("your-shell-command.sh")
    // get the stdout and stderr streams
    erc, err := cmd.StderrPipe()
    if err != nil {
        log.Fatalln("Failed to get stderr reader: ", err)
    }
    orc, err := cmd.StdoutPipe()
    if err != nil {
        log.Fatalln("Failed to get stdout reader: ", err)
    }
    // combine the stdout and stderr ReadClosers
    rc := io.MultiReader(erc, orc)
    // prepare the writer
    f, err := os.Create("output.log")
    if err != nil {
        log.Fatalln("Failed to create file")
    }
    defer f.Close()
    // Start runs the command asynchronously; it does not wait for it to finish
    if err := cmd.Start(); err != nil {
        log.Println("Failed to start the command")
    }
    // tee the combined stream into an in-memory buffer while copying it to the file
    var buf bytes.Buffer
    tr := io.TeeReader(rc, &buf)
    if _, err := io.Copy(f, tr); err != nil {
        log.Fatalf("Failed to stream to file: %s", err)
    }
    if err := cmd.Wait(); err != nil {
        log.Println("Failed to wait for the command to finish: ", err)
    }
}

How to capture the bytes of stdin

The goal: I want to capture all the bytes of cmd.Stdin and process them with this rot13 function: https://play.golang.org/p/VX2pwaIqhmT
The story: I'm coding a small tool which will be cross-compiled for both Windows and Linux, so I'm trying to make it as simple as possible. This tool connects to the server, from which I can execute commands on the client.
Since I had to do the same thing for cmd.Stdout, I used this:
.......
conn, err := net.Dial(nObj.Type, nObj.TCPIndirizzo)
......
cmd := exec.Command("/bin/sh", "-i") // please keep in mind that this is an ***interactive***
//***shell***, and not just a simple command
cmd.Stdin = conn
cmdStdout, err := cmd.StdoutPipe() // works fine
if err != nil {
fmt.Fprintf(os.Stderr, "error creating shell stdout pipe: %s\n", err)
}
cmd.Stderr = conn
err = cmd.Start()
if err != nil {
fmt.Fprintf(os.Stderr, "error starting shell: %s\n", err)
}
.....
err = OBFprocessStream(cmdStdout, conn) // works fine
....
Where the OBFprocessStream function is based on this one: https://play.golang.org/p/j_TKZWuhGaK. Everything works fine here.
So, I tried to replicate the same thing for cmd.Stdin:
.......
conn, err := net.Dial(nObj.Type, nObj.TCPIndirizzo)
......
cmd := exec.Command("/bin/sh", "-i")
cmdStdin, err := cmd.StdinPipe()
if err != nil {
fmt.Fprintf(os.Stderr, "error creating shell stdin pipe: %s\n", err)
}
cmdStdout, err := cmd.StdoutPipe()
if err != nil {
fmt.Fprintf(os.Stderr, "error creating shell stdout pipe: %s\n", err)
}
cmd.Stderr = conn
err = cmd.Start()
if err != nil {
fmt.Fprintf(os.Stderr, "error starting shell: %s\n", err)
}
.....
err = INOBFprocessStream(cmdStdin, conn)
....
.....
err = OBFprocessStream(cmdStdout, conn)
....
But cmdStdin is an io.WriteCloser, and I don't really know what to do with it to capture the bytes.
Can you please help me?
So it seems what you actually want is to read the data from conn, filter it with ROT13, and then pass it to cmd.Stdin (which accepts an io.Reader).
And your rot13Reader already implements io.Reader:
type rot13Reader struct {
    r io.Reader
}

func (r13 *rot13Reader) Read(b []byte) (int, error) {
    n, err := r13.r.Read(b)
    for i := 0; i < n; i++ { // i < n, not i <= n, or you'd index past the bytes read
        b[i] = rot13(b[i])
    }
    return n, err
}
So a quick solution can be to construct a small filter chain out of it like so:
cmd.Stdin = &rot13Reader{conn}

script command execution hung forever in go program

func Run() error {
    log.Info("In Run Command")
    cmd := exec.Command("bash", "/opt/AlterKafkaTopic.sh")
    stdout, err := cmd.StdoutPipe()
    if err != nil {
        return err
    }
    if err = cmd.Start(); err != nil {
        return err
    }
    f, err := os.Create(filepath.Join("/opt/log/", "execution.log"))
    if err != nil {
        return err
    }
    if _, err := io.Copy(f, stdout); err != nil {
        return err
    }
    if err := cmd.Wait(); err != nil {
        return err
    }
    return f.Close()
}
I am trying to execute a bash script from Go code. The script changes some Kafka topic properties. But the execution hangs at io.Copy(f, stdout) and does not continue after it.
This program is running on an RHEL 7.2 server.
Could someone suggest where I am going wrong?
From the docs:
Wait will close the pipe after seeing the command exit.
In other words, io.Copy only returns once the pipe is closed, and Wait (which closes the pipe) is never reached because it comes after the Copy. Either run the Copy in a goroutine, or simply assign f to cmd.Stdout:
f, err := os.Create(filepath.Join("/opt/log/", "execution.log"))
// TODO: Handle error
defer f.Close()
cmd := exec.Command("bash", "/opt/AlterKafkaTopic.sh")
cmd.Stdout = f
err = cmd.Run()

Is there a good way to cancel a blocking read?

I've got a command I have to run via the OS, let's call it 'login', that is interactive, so I have to read from stdin and pass what I read to the command's stdin pipe for it to work correctly. The only problem is that the goroutine blocks on a read from stdin, and I haven't found a way to cancel a Reader in Go so that it doesn't hang on the blocking call. For example, from the user's perspective, after the command appears to have completed, you can still write to stdin once more (then the goroutine will move past the blocking read and exit).
Ideally I would like to avoid having to parse output from the command's StdoutPipe as that makes my code frail and prone to error if the strings of the login command were to change.
loginCmd := exec.Command("login")
stdin, err := loginCmd.StdinPipe()
if err != nil {
    return err
}
out, err := loginCmd.StdoutPipe()
if err != nil {
    return err
}
if err := loginCmd.Start(); err != nil {
    return err
}
ctx, cancel := context.WithCancel(context.TODO())
var done sync.WaitGroup
done.Add(1)
ready := make(chan bool, 1)
defer cancel()
go func(ctx context.Context) {
    reader := bufio.NewReader(os.Stdin)
    for {
        select {
        case <-ctx.Done():
            done.Done()
            return
        default:
            // blocks on this line; if a close could unblock the read,
            // it would exit normally via the ctx.Done() case
            line, err := reader.ReadString('\n')
            if err != nil {
                fmt.Println("Error: ", err.Error())
            }
            stdin.Write([]byte(line))
        }
    }
}(ctx)
var bytesRead = 4096
output := make([]byte, bytesRead)
reader := bufio.NewReader(out)
for err == nil {
    bytesRead, err = reader.Read(output)
    if err != nil && err != io.EOF {
        return err
    }
    fmt.Printf("%s", output[:bytesRead])
}
if err := loginCmd.Wait(); err != nil {
    return err
}
cancel()
done.Wait()

golang scp file using crypto/ssh

I'm trying to download a remote file over SSH.
The following approach works fine in a shell:
ssh hostname "tar cz /opt/local/folder" > folder.tar.gz
However, the same approach in Go gives a difference in the output artifact size. For example, the same folder produces a 179B gz artifact with pure shell and a 178B one with the Go script.
I assume that something has been missed from the io.Reader, or the session got closed early. I kindly ask you guys to help.
Here is the example of my script:
func executeCmd(cmd, hostname string, config *ssh.ClientConfig, path string) error {
    conn, _ := ssh.Dial("tcp", hostname+":22", config)
    session, err := conn.NewSession()
    if err != nil {
        panic("Failed to create session: " + err.Error())
    }
    r, _ := session.StdoutPipe()
    scanner := bufio.NewScanner(r)
    go func() {
        defer session.Close()
        name := fmt.Sprintf("%s/backup_folder_%v.tar.gz", path, time.Now().Unix())
        file, err := os.OpenFile(name, os.O_APPEND|os.O_WRONLY|os.O_CREATE, 0644)
        if err != nil {
            panic(err)
        }
        defer file.Close()
        for scanner.Scan() {
            fmt.Println(scanner.Bytes())
            if err := scanner.Err(); err != nil {
                fmt.Println(err)
            }
            if _, err = file.Write(scanner.Bytes()); err != nil {
                log.Fatal(err)
            }
        }
    }()
    if err := session.Run(cmd); err != nil {
        fmt.Println(err.Error())
        panic("Failed to run: " + err.Error())
    }
    return nil
}
Thanks!
bufio.Scanner is for newline-delimited text. According to the documentation, the scanner removes the newline characters, stripping every byte 10 ('\n') out of your binary file.
You don't need a goroutine to do the copy, because you can use session.Start to start the process asynchronously.
You probably don't need to use bufio either. You should be using io.Copy to copy the file, which already has an internal buffer, on top of any buffering done in the SSH client itself. If an additional buffer is needed for performance, wrap the session output in a bufio.Reader.
Finally, you return an error value, so use it rather than panicking on regular error conditions.
conn, err := ssh.Dial("tcp", hostname+":22", config)
if err != nil {
    return err
}
session, err := conn.NewSession()
if err != nil {
    return err
}
defer session.Close()
r, err := session.StdoutPipe()
if err != nil {
    return err
}
name := fmt.Sprintf("%s/backup_folder_%v.tar.gz", path, time.Now().Unix())
file, err := os.OpenFile(name, os.O_APPEND|os.O_WRONLY|os.O_CREATE, 0644)
if err != nil {
    return err
}
defer file.Close()
if err := session.Start(cmd); err != nil {
    return err
}
if _, err := io.Copy(file, r); err != nil {
    return err
}
if err := session.Wait(); err != nil {
    return err
}
return nil
You can try doing something like this:
r, _ := session.StdoutPipe()
reader := bufio.NewReader(r)
go func() {
    defer session.Close()
    // open file etc
    // 10 is the number of bytes you'd like to copy in one write operation
    p := make([]byte, 10)
    for {
        n, err := reader.Read(p)
        if err == io.EOF {
            break
        }
        if err != nil {
            log.Fatal("err", err)
        }
        if _, err = file.Write(p[:n]); err != nil {
            log.Fatal(err)
        }
    }
}()
Make sure your goroutines are synchronized properly so the output is completely written to the file.
