I have written two simple applications in Go. The first one, fetch.go, prints the HTML content of the pages whose URLs are passed as command-line arguments. The second one, findlinks1.go, finds all href links in the parsed HTML tree. Here are the snippets:
func main() {
	for _, url := range os.Args[1:] {
		resp, err := http.Get(url)
		if err != nil {
			fmt.Fprintf(os.Stderr, "fetch: %v\n", err)
			os.Exit(1)
		}
		b, err := ioutil.ReadAll(resp.Body)
		resp.Body.Close()
		if err != nil {
			fmt.Fprintf(os.Stderr, "fetch: reading %s: %v\n", url, err)
			os.Exit(1)
		}
		fmt.Printf("%s", b)
	}
}
func main() {
	doc, err := html.Parse(os.Stdin)
	if err != nil {
		fmt.Fprintf(os.Stderr, "findlinks1: %v\n", err)
		os.Exit(1)
	}
	for _, link := range visit(nil, doc) {
		fmt.Println(link)
	}
	// it doesn't matter what the visit function does
}
I want to pipe the output of the first program into the input of the second one in the PowerShell terminal of the GoLand IDE, but I can't get it to work.
I have tried running these commands in the terminal:
./fetch.go https://golang.org | ./findlinks1.go
I got an error:
InvalidOperation: Cannot run a document in the middle of a pipeline
go run fetch.go https://golang.org | go run findlinks1.go
I didn't get an error, but nothing happened after running this command.
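For what it's worth, the first error is PowerShell refusing to run ./fetch.go because a .go file is a document, not an executable, which is exactly what "Cannot run a document in the middle of a pipeline" means. One way around it (a sketch, assuming both files build on their own) is to compile the two programs first and pipe the resulting binaries:

go build -o fetch.exe fetch.go
go build -o findlinks1.exe findlinks1.go
./fetch.exe https://golang.org | ./findlinks1.exe

With separate executables the pipe behaves the same way it would in cmd or bash. If the second program still prints nothing, run each command on its own first to confirm that it compiles and produces output.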
I have implemented a CLI in Go that displays the status of Kubernetes cells. The command is cellery ps:
func ps() error {
	cmd := exec.Command("kubectl", "get", "cells")
	stdoutReader, _ := cmd.StdoutPipe()
	stdoutScanner := bufio.NewScanner(stdoutReader)
	go func() {
		for stdoutScanner.Scan() {
			fmt.Println(stdoutScanner.Text())
		}
	}()
	stderrReader, _ := cmd.StderrPipe()
	stderrScanner := bufio.NewScanner(stderrReader)
	go func() {
		for stderrScanner.Scan() {
			fmt.Println(stderrScanner.Text())
			if stderrScanner.Text() == "No resources found." {
				os.Exit(0)
			}
		}
	}()
	err := cmd.Start()
	if err != nil {
		fmt.Printf("Error in executing cell ps: %v \n", err)
		os.Exit(1)
	}
	err = cmd.Wait()
	if err != nil {
		fmt.Printf("\x1b[31;1m Cell ps finished with error: \x1b[0m %v \n", err)
		os.Exit(1)
	}
	return nil
}
However, cells need time to get into the ready state after they are deployed, so I need to add a wait flag that keeps updating the CLI output.
The command would be cellery ps -w. However, the Kubernetes API has not implemented this yet, so I will have to come up with the command myself.
Basically, what you want is to listen for the event of a cell becoming ready.
You can register for events in the cluster and act upon them. A good example can be found here.
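If registering for cluster events is more than you need here, a rougher alternative is to lean on kubectl's own --watch flag and keep scanning its output until the resource looks ready. This is only a sketch: cells is a custom resource, and the "Ready" text below is an assumption about its status column, not something kubectl guarantees.

package main

import (
	"bufio"
	"fmt"
	"os/exec"
	"strings"
)

// psWait streams "kubectl get cells --watch" and returns once a line
// reports the (assumed) Ready state.
func psWait() error {
	cmd := exec.Command("kubectl", "get", "cells", "--watch")
	stdout, err := cmd.StdoutPipe()
	if err != nil {
		return err
	}
	if err := cmd.Start(); err != nil {
		return err
	}
	scanner := bufio.NewScanner(stdout)
	for scanner.Scan() {
		line := scanner.Text()
		fmt.Println(line)
		// Stop once the status column (assumed) reports Ready.
		if strings.Contains(line, "Ready") {
			break
		}
	}
	// kubectl --watch never exits on its own, so stop it once we are done.
	_ = cmd.Process.Kill()
	_ = cmd.Wait()
	return scanner.Err()
}

func main() {
	if err := psWait(); err != nil {
		fmt.Println(err)
	}
}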
I wonder if it is possible to copy a running .exe file to another folder. I am trying to do this with the usual copy approach in Go, like this:
func copy(src, dst string) error {
	in, err := os.Open(src)
	if err != nil {
		return err
	}
	defer in.Close()
	out, err := os.Create(dst)
	if err != nil {
		return err
	}
	defer out.Close()
	_, err = io.Copy(out, in)
	if err != nil {
		return err
	}
	return out.Close()
}
...
copyErr := copy(os.Args[0], "D:"+"\\"+"whs.exe")
if copyErr != nil {
	log.Panicf("copy -> %v", copyErr)
}
The file is copied with the same size, but I can't open it correctly. I only see a brief cmd flash: after several milliseconds the window closes and I can't even see any errors.
I tried writing the errors to a log file, but it stays empty.
f, err := os.OpenFile("debug.log", os.O_RDWR|os.O_CREATE|os.O_APPEND, 0777)
if err != nil {
	log.Panicf("setLogOutput -> %v", err)
}
defer f.Close()
log.SetOutput(f)
If I open the original (not copied) .exe file, everything works correctly.
I've reduced my program to a single main function; the result was the same.
func main() {
	log.Println("Starting...")
	copyErr := copy(os.Args[0], "F:"+"\\"+"whs.exe")
	if copyErr != nil {
		log.Panicf("copy -> %v", copyErr)
	}
	os.Stdin.Read([]byte{0})
}
I have found the error:
The process cannot access the file because it is being used by another process.
I was trying to copy the .exe file onto its own path. The fixed version skips the copy when the destination already exists:
func copy(src, dst string) error {
	// Only copy if the destination does not exist yet; this keeps the copied
	// exe from trying to overwrite itself when it is the one running.
	if _, err := os.Stat(dst); os.IsNotExist(err) {
		in, err := os.Open(src)
		if err != nil {
			return err
		}
		defer in.Close()
		out, err := os.Create(dst)
		if err != nil {
			return err
		}
		defer out.Close()
		_, err = io.Copy(out, in)
		if err != nil {
			return err
		}
	}
	return nil
}
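A hedged variation on the same idea: instead of skipping whenever the destination exists, skip only when source and destination really are the same file, using os.SameFile from the standard library. The name copyIfDifferent is just a placeholder for this sketch.

func copyIfDifferent(src, dst string) error {
	srcInfo, err := os.Stat(src)
	if err != nil {
		return err
	}
	// If the destination exists and is the very same file as the source
	// (the running exe copying onto its own path), there is nothing to do.
	if dstInfo, err := os.Stat(dst); err == nil && os.SameFile(srcInfo, dstInfo) {
		return nil
	}
	in, err := os.Open(src)
	if err != nil {
		return err
	}
	defer in.Close()
	out, err := os.Create(dst)
	if err != nil {
		return err
	}
	defer out.Close()
	if _, err := io.Copy(out, in); err != nil {
		return err
	}
	return out.Close()
}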
I have the following code, which works. I need to execute commands in a chain, where each command has to finish before the next one is executed.
I do it with Wait and an ugly if/else chain, and if I need to chain more commands it only gets uglier... is there a better way to write this in Go?
cmd, buf := exec.CommandContext("npm", dir+"/"+path, "install")
//Wait
if err := cmd.Wait(); err != nil {
	log.Printf("executing npm install returned error: %v", err)
} else {
	log.Println(buf.String())
	gulpCmd, gulpBuf := exec.CommandContext("gulp", pdir+"/"+n.path)
	//Wait
	if err := gulpCmd.Wait(); err != nil {
		log.Printf("error: %v", err)
	} else {
		log.Println(gulpBuf.String())
		pruneCmd, pruneBuf := exec.CommandContext("npm", pdir+"/"+n.path, "prune", "--production")
		//Wait
		if err := pruneCmd.Wait(); err != nil {
			log.Printf("error: %v", err)
		} else {
			log.Println(pruneBuf.String())
		}
	}
}
update:
If I run this simple program, it works and I get the message:
added 563 packages in 19.218s
This is the code:
cmd := exec.Command("npm", "install")
cmd.Dir = filepath.Join(pdir, n.path)
cmdOutput := &bytes.Buffer{}
cmd.Stdout = cmdOutput
err := cmd.Run()
if err != nil {
os.Stderr.WriteString(err.Error())
}
fmt.Print(string(cmdOutput.Bytes()))
But if I try it like the following, I get an error and it is not able to execute even the first command (npm install). Any idea?
cmdParams := [][]string{
	{"npm", filepath.Join(dir, path), "install"},
	{"gulp", filepath.Join(pdir, n.path)},
	{"npm", filepath.Join(pdir, n.path), "prune", "--production"},
}
for _, cmdParam := range cmdParams {
	out, err := exec.Command(cmdParam[0], cmdParam[1:]...).Output()
	if err != nil {
		log.Printf("error running %s: %v\n", cmdParam[0], err)
		return
	}
	log.Println(string(out))
}
The error I get is: error running npm: exit status 1
update 2
The commands should run one after another: only when the first one finishes should gulp run, and so on. I also need to print the output of each command:
1. npm install
2. gulp
3. npm prune
List your commands in a slice, and use a simple loop to execute them all sequentially. Also use filepath.Join() to build folder paths.
I'm not sure what package you're using to run the commands, but with os/exec we can simplify the execution in the loop body further. For example, Command.Output() runs the command and returns its standard output:
cmdParams := [][]string{
	{filepath.Join(dir, path), "npm", "install"},
	{filepath.Join(pdir, n.path), "gulp"},
	{filepath.Join(pdir, n.path), "npm", "prune", "--production"},
}
for _, cp := range cmdParams {
	log.Printf("Starting %s in folder %s...", cp[1:], cp[0])
	cmd := exec.Command(cp[1], cp[2:]...)
	cmd.Dir = cp[0]
	// Wait for it to finish, get its output:
	out, err := cmd.Output()
	if err != nil {
		log.Printf("Error running %s: %v\n", cp[1:], err)
		return
	}
	log.Printf("Finished %s, output: %s", cp[1:], out)
}
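As for the "exit status 1" you are seeing: Output() only captures standard output, but when the command fails the returned error is usually an *exec.ExitError whose Stderr field holds what the command wrote to standard error. A small sketch of the error handling inside the loop above that surfaces npm's actual complaint (a common one being that there is no package.json in the chosen folder):

out, err := cmd.Output()
if err != nil {
	if exitErr, ok := err.(*exec.ExitError); ok {
		// Output() captures stderr into ExitError.Stderr when the
		// command exits with a non-zero status.
		log.Printf("Error running %s: %v\nstderr: %s", cp[1:], err, exitErr.Stderr)
	} else {
		log.Printf("Error running %s: %v", cp[1:], err)
	}
	return
}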
To avoid ugly if-else you can write code like this:
err := someFunction1()
if err != nil {
	return err
}
err = someFunction2()
if err != nil {
	return err
}
err = someFunction3()
if err != nil {
	return err
}
but then you will have (IMHO) ugly multiple return statements.
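One way to keep a single return and still avoid the if/else ladder is to put the steps in a slice of functions and loop over them, letting the first error short-circuit the chain. The function names below are just the placeholders from the snippet above:

steps := []func() error{
	someFunction1,
	someFunction2,
	someFunction3,
}
for _, step := range steps {
	// Stop at the first step that fails.
	if err := step(); err != nil {
		return err
	}
}
return nil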
I'm trying to download a remote file over ssh.
The following approach works fine from a shell:
ssh hostname "tar cz /opt/local/folder" > folder.tar.gz
However, the same approach in Go produces an artifact of a slightly different size. For example, for the same folder the pure shell command produces a 179 B gz file, while the Go script produces 178 B.
I assume that something is being lost from the io.Reader, or that the session is closed too early. I'd appreciate any help.
Here is my script:
func executeCmd(cmd, hostname string, config *ssh.ClientConfig, path string) error {
	conn, _ := ssh.Dial("tcp", hostname+":22", config)
	session, err := conn.NewSession()
	if err != nil {
		panic("Failed to create session: " + err.Error())
	}
	r, _ := session.StdoutPipe()
	scanner := bufio.NewScanner(r)
	go func() {
		defer session.Close()
		name := fmt.Sprintf("%s/backup_folder_%v.tar.gz", path, time.Now().Unix())
		file, err := os.OpenFile(name, os.O_APPEND|os.O_WRONLY|os.O_CREATE, 0644)
		if err != nil {
			panic(err)
		}
		defer file.Close()
		for scanner.Scan() {
			fmt.Println(scanner.Bytes())
			if err := scanner.Err(); err != nil {
				fmt.Println(err)
			}
			if _, err = file.Write(scanner.Bytes()); err != nil {
				log.Fatal(err)
			}
		}
	}()
	if err := session.Run(cmd); err != nil {
		fmt.Println(err.Error())
		panic("Failed to run: " + err.Error())
	}
	return nil
}
Thanks!
bufio.Scanner is for newline-delimited text. According to the documentation, the scanner removes the newline characters, so every byte with value 10 gets stripped out of your binary stream.
You don't need a goroutine to do the copy, because you can use session.Start to start the process asynchronously.
You probably don't need bufio either. You should be using io.Copy to copy the file; it already has an internal buffer on top of any buffering done in the ssh client itself. If an additional buffer is needed for performance, wrap the session output in a bufio.Reader.
Finally, you return an error value, so use it rather than panicking on regular error conditions.
conn, err := ssh.Dial("tcp", hostname+":22", config)
if err != nil {
	return err
}
session, err := conn.NewSession()
if err != nil {
	return err
}
defer session.Close()
r, err := session.StdoutPipe()
if err != nil {
	return err
}
name := fmt.Sprintf("%s/backup_folder_%v.tar.gz", path, time.Now().Unix())
file, err := os.OpenFile(name, os.O_APPEND|os.O_WRONLY|os.O_CREATE, 0644)
if err != nil {
	return err
}
defer file.Close()
if err := session.Start(cmd); err != nil {
	return err
}
if _, err := io.Copy(file, r); err != nil {
	return err
}
if err := session.Wait(); err != nil {
	return err
}
return nil
You can try doing something like this:
r, _ := session.StdoutPipe()
reader := bufio.NewReader(r)
go func() {
	defer session.Close()
	// open file etc
	// 10 is the number of bytes you'd like to copy in one write operation
	p := make([]byte, 10)
	for {
		n, err := reader.Read(p)
		if err == io.EOF {
			break
		}
		if err != nil {
			log.Fatal("err", err)
		}
		if _, err = file.Write(p[:n]); err != nil {
			log.Fatal(err)
		}
	}
}()
Make sure your goroutines are synchronized properly so the output is completely written to the file.
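For example, a minimal sketch of that synchronization with a done channel, assuming the same session, reader and file as in the snippet above, so the function does not return before the copying goroutine has finished:

done := make(chan struct{})
go func() {
	defer close(done)
	defer session.Close()
	// ... read from reader and write to file as shown above ...
}()
if err := session.Run(cmd); err != nil {
	return err
}
// Block until the copying goroutine has written everything to the file.
<-done
return nil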
I'm trying to set up a kind of print service that a website can communicate with and send printable documents to (PDF, HTML, Excel). I decided on Go.
I created the simple program below. On some PCs (Windows 7) it works; on others (Windows 8) it doesn't work right. When it fails, the job is visible in the print queue for less than a second and then disappears. The code doesn't output any errors, and I can't find anything in the Windows event log.
I also tried a RawPrinter example in C++ that I found online, but it shows the same behavior.
Does anyone know what I'm doing wrong? :(
package main

import (
	"fmt"

	"code.google.com/p/brainman/printer"
)

func main() {
	defaultPrinterName, _ := printer.Default()
	fmt.Println(defaultPrinterName)
	p, err := printer.Open(defaultPrinterName)
	if err != nil {
		fmt.Printf("Open failed: %v\n", err)
	}
	defer p.Close()
	err = p.StartDocument("my document", "RAW")
	if err != nil {
		fmt.Printf("StartDocument failed: %v\n", err)
	}
	defer p.EndDocument()
	err = p.StartPage()
	if err != nil {
		fmt.Printf("StartPage failed: %v\n", err)
	}
	str := "testing 123"
	mySlice := []byte(str)
	_, err = p.Write(mySlice)
	if err != nil {
		fmt.Printf("Write failed: %v\n", err)
	}
	err = p.EndPage()
	if err != nil {
		fmt.Printf("EndPage failed: %v\n", err)
	}
}
You're using the datatype "RAW"; it should be "XPS_PASS".
Windows 8 (and Server 2012) uses XPS-based drivers, so you can't use the RAW datatype.
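In the program above that would just mean changing the StartDocument call, roughly like this (a sketch; ideally you would pick the datatype based on the installed driver rather than hard-coding it):

// Use "XPS_PASS" instead of "RAW" for the XPS-based drivers
// used by Windows 8 and Server 2012.
err = p.StartDocument("my document", "XPS_PASS")
if err != nil {
	fmt.Printf("StartDocument failed: %v\n", err)
}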
Check out these articles:
http://support.microsoft.com/kb/2779300
http://msdn.microsoft.com/en-us/library/windows/desktop/ff686812%28v=vs.85%29.aspx