Golang bufio from websocket breaking after first read

I am trying to stream JSON text from a websocket. However, after an initial read I noticed that the stream seems to break/disconnect. This is from a Pleroma server (think: Mastodon). I am using the golang.org/x/net/websocket library.
package main

import (
    "bufio"
    "fmt"
    "log"

    "golang.org/x/net/websocket"
)

func main() {
    origin := "https://poa.st/"
    url := "wss://poa.st/api/v1/streaming/?stream=public"
    ws, err := websocket.Dial(url, "", origin)
    if err != nil {
        log.Fatal(err)
    }
    s := bufio.NewScanner(ws)
    for s.Scan() {
        line := s.Text()
        fmt.Println(line)
    }
}
After the initial JSON text response, the for-loop exits. I would expect the server to send a new message every few seconds.
What might be causing this? I am willing to switch to the Gorilla websocket library if I can use it with bufio.
Thanks!

Although the x/net/websocket connection has a Read method with the same signature as the Read method in io.Reader, the connection does not work like an io.Reader, and it will not behave as you expect when wrapped with a bufio.Scanner.
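If you prefer to stay with golang.org/x/net/websocket, its message-oriented codec reads one complete WebSocket message per call instead of treating the connection as a byte stream. A minimal sketch (untested, reusing the ws connection from the question):
for {
    // websocket.Message.Receive returns one whole message; with a
    // *string target it yields the text payload of a text frame.
    var msg string
    if err := websocket.Message.Receive(ws, &msg); err != nil {
        log.Fatal(err)
    }
    fmt.Println(msg)
}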
The poa.st endpoint sends a stream of messages where each message is a JSON document. Use the following code to read the messages using the Gorilla package:
url := "wss://poa.st/api/v1/streaming/?stream=public"
ws, _, err := websocket.DefaultDialer.Dial(url, nil)
if err != nil {
log.Fatal(err)
}
defer ws.Close()
for {
_, p, err := ws.ReadMessage()
if err != nil {
log.Fatal(err)
}
// p is a []byte containing the JSON document.
fmt.Printf("%s\n", p)
}
The Gorilla package has a helper method for decoding JSON messages. Here's an example of how to use that method.
url := "wss://poa.st/api/v1/streaming/?stream=public"
ws, _, err := websocket.DefaultDialer.Dial(url, nil)
if err != nil {
log.Fatal(err)
}
defer ws.Close()
for {
// The JSON documents are objects containing two fields,
// the event type and the payload. The payload is a JSON
// document itself.
var e struct {
Event string
Payload string
}
err := ws.ReadJSON(&e)
if err != nil {
log.Fatal(err)
}
// TODO: decode e.Payload based on e.Event
}
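To fill in the TODO above, e.Payload is itself a JSON document that can be unmarshalled once you know the event type. A rough sketch using encoding/json; the Status struct here is a hypothetical, minimal stand-in for the real Mastodon/Pleroma status object, and "update" is assumed to be the event name for new public statuses:
// Hypothetical minimal shape of the payload; extend as needed.
type Status struct {
    ID      string `json:"id"`
    Content string `json:"content"`
}

switch e.Event {
case "update":
    var s Status
    if err := json.Unmarshal([]byte(e.Payload), &s); err != nil {
        log.Fatal(err)
    }
    fmt.Println("new status:", s.ID)
default:
    // Ignore other event types for now.
}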

Related

How to use a self-describing message for protobuf

One of the use cases I'm working on with protocol buffers is deserializing the Protocol Buffers Kafka messages which I receive at the consumer end (using the sarama library and Go).
The way I'm currently doing it: I defined a sample pixel.proto file as shown below.
syntax = "proto3";
package saramaprotobuf;
message Pixel {
// Session identifier stuff
string session_id = 2;
}
I'm sending the message through a sarama producer (by marshalling it) and receiving it with a sarama consumer (unmarshalling the message by referencing the compiled pixel.proto). The code is below.
import (
    "github.com/Shopify/sarama"
    "github.com/golang/protobuf/proto"
    "log"
    "os"
    "os/signal"
    "protobuftest/example"
    "syscall"
    "time"
)

func main() {
    topic := "test_topic"
    brokerList := []string{"localhost:9092"}

    producer, err := newSyncProducer(brokerList)
    if err != nil {
        log.Fatalln("Failed to start Sarama producer:", err)
    }

    go func() {
        ticker := time.NewTicker(time.Second)
        for {
            select {
            case t := <-ticker.C:
                elliot := &example.Pixel{
                    SessionId: t.String(),
                }
                pixelToSend := elliot
                pixelToSendBytes, err := proto.Marshal(pixelToSend)
                if err != nil {
                    log.Fatalln("Failed to marshal example:", err)
                }

                msg := &sarama.ProducerMessage{
                    Topic: topic,
                    Value: sarama.ByteEncoder(pixelToSendBytes),
                }
                producer.SendMessage(msg)
                log.Printf("Pixel sent: %s", pixelToSend)
            }
        }
    }()

    signals := make(chan os.Signal, 1)
    signal.Notify(signals, syscall.SIGHUP, syscall.SIGINT, syscall.SIGTERM)

    partitionConsumer, err := newPartitionConsumer(brokerList, topic)
    if err != nil {
        log.Fatalln("Failed to create Sarama partition consumer:", err)
    }

    log.Println("Waiting for messages...")

    for {
        select {
        case msg := <-partitionConsumer.Messages():
            receivedPixel := &example.Pixel{}
            err := proto.Unmarshal(msg.Value, receivedPixel)
            if err != nil {
                log.Fatalln("Failed to unmarshal example:", err)
            }
            log.Printf("Pixel received: %s", receivedPixel)
        case <-signals:
            log.Print("Received termination signal. Exiting.")
            return
        }
    }
}

func newSyncProducer(brokerList []string) (sarama.SyncProducer, error) {
    config := sarama.NewConfig()
    config.Producer.RequiredAcks = sarama.WaitForAll
    config.Producer.Retry.Max = 5
    config.Producer.Return.Successes = true
    // TODO configure producer

    producer, err := sarama.NewSyncProducer(brokerList, config)
    if err != nil {
        return nil, err
    }
    return producer, nil
}

func newPartitionConsumer(brokerList []string, topic string) (sarama.PartitionConsumer, error) {
    conf := sarama.NewConfig()
    // TODO configure consumer
    consumer, err := sarama.NewConsumer(brokerList, conf)
    if err != nil {
        return nil, err
    }

    partitionConsumer, err := consumer.ConsumePartition(topic, 0, sarama.OffsetOldest)
    if err != nil {
        return nil, err
    }
    return partitionConsumer, err
}
As you can see in the code, I have imported the generated .proto code and reference it in the main function in order to send and receive the message. The problem is that this solution is not generic: I will receive messages of different .proto types at the consumer end.
How can I make it generic? I know there is something called a self-describing message (dynamic message) in protobuf. I referred to this link: https://developers.google.com/protocol-buffers/docs/techniques?csw=1#self-description. But it doesn't have any explanation of how to embed this as part of pixel.proto (the example I have used) so that at the consumer end I can directly deserialize it to the required type.
You would define a generic container message type that includes a DescriptorSet field and an Any field.
When sending, you build an instance of that generic message type, setting the Any field to an instance of your Pixel message and setting the DescriptorSet field to the DescriptorSet of the Pixel type.
That allows the receiver of such a message to parse the Any contents using the DescriptorSet you are attaching. In practical terms, this is sending a piece of the proto definition together with the message, so receivers don't need pre-shared proto definitions or generated code.
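In .proto terms, such a container might look like the sketch below, modelled on the self-description example from the linked documentation (the message and field names here are illustrative, not mandated by protobuf):
syntax = "proto3";
package saramaprotobuf;

import "google/protobuf/any.proto";
import "google/protobuf/descriptor.proto";

message SelfDescribingMessage {
  // FileDescriptorSet describing the type of the payload and its dependencies.
  google.protobuf.FileDescriptorSet descriptor_set = 1;

  // The payload itself, packed as an Any (type URL + serialized bytes).
  google.protobuf.Any message = 2;
}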
Having said that, I'm not sure this is what you really want, because if you are planning to share proto definitions or generated code with clients anyway, then simply using a oneof field in a container type would be much simpler.
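For comparison, a sketch of that simpler oneof-based container (again, the names are illustrative). The consumer unmarshals the envelope and switches on which oneof field is set, but it still needs the shared generated code:
message Envelope {
  oneof payload {
    Pixel pixel = 1;
    // Add further message types here as new schemas are introduced.
  }
}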

Go-Gin read request body many times

I am trying to restore the context with its data after performing validation on that data. I need the data to keep moving, as I need it later on in the next function.
I am new to golang and the code below is as far as I could get. Any help and a better approach are much appreciated.
Thanks in advance.
The validation middleware:
func SignupValidator(c *gin.Context) {
    // Read the Body content
    // var bodyBytes []byte
    // if c.Request.Body != nil {
    //     bodyBytes, _ = ioutil.ReadAll(c.Request.Body)
    // }
    var user entity.User
    if err := c.ShouldBindJSON(&user); err != nil {
        validate := validator.New()
        if err := validate.Struct(&user); err != nil {
            c.JSON(http.StatusBadRequest, gin.H{
                "error": err.Error(),
            })
            c.Abort()
            return
        }
        // c.Request.Body = ioutil.NopCloser(bytes.NewBuffer(bodyBytes))
    }

    // Read the Body content
    var bodyBytes []byte
    if c.Request.Body != nil {
        bodyBytes, _ = ioutil.ReadAll(c.Request.Body)
    }
    fmt.Println(string(bodyBytes)) // this is empty
    c.Next()
}
The route:
auth.POST("login", gin.Logger(), validations.SignupValidator, func(ctx *gin.Context) {
ctx.JSON(200, videoController.Save(ctx))
})
You can try this.
ByteBody, _ := ioutil.ReadAll(c.Request.Body)
c.Request.Body = ioutil.NopCloser(bytes.NewBuffer(ByteBody))
You can then use ByteBody however you want, without side effects on c.Request.Body.
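Applied to the middleware from the question, that idea could look roughly like the sketch below (assumptions: the entity.User type from the question, validation via plain json.Unmarshal instead of ShouldBindJSON, and gin's AbortWithStatusJSON for error responses):
func SignupValidator(c *gin.Context) {
    // Read the whole body once...
    bodyBytes, err := ioutil.ReadAll(c.Request.Body)
    if err != nil {
        c.AbortWithStatusJSON(http.StatusBadRequest, gin.H{"error": err.Error()})
        return
    }
    // ...and put a fresh reader back so the next handler can read it again.
    c.Request.Body = ioutil.NopCloser(bytes.NewBuffer(bodyBytes))

    // Validate against the copy instead of consuming the request body.
    var user entity.User
    if err := json.Unmarshal(bodyBytes, &user); err != nil {
        c.AbortWithStatusJSON(http.StatusBadRequest, gin.H{"error": err.Error()})
        return
    }
    c.Next()
}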
Here is an example of reading the body twice with ShouldBindBodyWith:
package main

import (
    "log"
    "net/http"

    "github.com/gin-gonic/gin"
    "github.com/gin-gonic/gin/binding"
)

type ParamsOne struct {
    Username string `json:"username"`
}

type ParamsTwo struct {
    Username string `json:"username"`
}

func main() {
    r := gin.New()
    r.POST("/", func(c *gin.Context) {
        var f ParamsOne
        // Read once
        if err := c.ShouldBindBodyWith(&f, binding.JSON); err != nil {
            log.Printf("%+v", err)
        }
        log.Printf("%+v", f)

        var ff ParamsTwo
        if err := c.ShouldBindBodyWith(&ff, binding.JSON); err != nil {
            log.Printf("%+v", err)
        }
        log.Printf("%+v", ff)
        c.IndentedJSON(http.StatusOK, f)
    })
    r.Run(":4000")
}
Output:
$ ./example
[GIN-debug] [WARNING] Running in "debug" mode. Switch to "release" mode in production.
- using env: export GIN_MODE=release
- using code: gin.SetMode(gin.ReleaseMode)
[GIN-debug] POST / --> main.main.func1 (1 handlers)
[GIN-debug] Listening and serving HTTP on :4000
2020/07/05 10:47:03 {Username:somename}
2020/07/05 10:47:03 {Username:somename}
As #Philidor has shown, ShouldBindBodyWith should do the trick. In my case, though, I decided to go with something similar to #spehlivan, for two reasons:
ShouldBindBodyWith requires that subsequent binds also use ShouldBindBodyWith, which means I would need to change all my previous code that uses c.Bind.
You need to explicitly tell ShouldBindBodyWith which binding type you are using (JSON, Form, ProtoBuf, etc.); other binds like c.Bind detect it automatically.
This is what it looks like:
var input models.SomeInput

bodyCopy := new(bytes.Buffer)
// Read the whole body
_, err := io.Copy(bodyCopy, c.Request.Body)
if err != nil {
    log.Println(err)
    c.JSON(http.StatusBadRequest, gin.H{"error": "Error reading API token"})
    c.Abort()
    return
}
bodyData := bodyCopy.Bytes()

// Replace the body with a reader that reads from the buffer
c.Request.Body = ioutil.NopCloser(bytes.NewReader(bodyData))
err = c.Bind(&input)

// Some code here...

// Replace the body with a reader that reads from the buffer
c.Request.Body = ioutil.NopCloser(bytes.NewReader(bodyData))
Note that I replaced c.Request.Body twice: once for the bind in this snippet, and again at the end for the next bind elsewhere in my code (this snippet is from a middleware; the next bind is called from the controller).
In my case I needed to do this because the API token is sent in the request body, which I don't recommend; it should be sent in a request header.

golang unzip Response.Body

I wrote a little web crawler and know that the response is a zip file.
In my limited experience with Go programming, I only know how to unzip an existing file.
Can I unzip the Response.Body in memory, without saving it to disk first?
Updated answer for handling a zip file response body in memory.
Note: ensure you have sufficient memory for handling the zip file.
package main

import (
    "archive/zip"
    "bytes"
    "fmt"
    "io/ioutil"
    "log"
    "net/http"
)

func main() {
    resp, err := http.Get("zip file url")
    if err != nil {
        log.Fatal(err)
    }
    defer resp.Body.Close()

    body, err := ioutil.ReadAll(resp.Body)
    if err != nil {
        log.Fatal(err)
    }

    zipReader, err := zip.NewReader(bytes.NewReader(body), int64(len(body)))
    if err != nil {
        log.Fatal(err)
    }

    // Read all the files from zip archive
    for _, zipFile := range zipReader.File {
        fmt.Println("Reading file:", zipFile.Name)
        unzippedFileBytes, err := readZipFile(zipFile)
        if err != nil {
            log.Println(err)
            continue
        }
        _ = unzippedFileBytes // this is unzipped file bytes
    }
}

func readZipFile(zf *zip.File) ([]byte, error) {
    f, err := zf.Open()
    if err != nil {
        return nil, err
    }
    defer f.Close()
    return ioutil.ReadAll(f)
}
By default, the Go HTTP client handles gzip responses automatically, so you can do a typical read and close of the response body.
However, there is a catch.
// Reference https://github.com/golang/go/blob/master/src/net/http/transport.go
//
// DisableCompression, if true, prevents the Transport from
// requesting compression with an "Accept-Encoding: gzip"
// request header when the Request contains no existing
// Accept-Encoding value. If the Transport requests gzip on
// its own and gets a gzipped response, it's transparently
// decoded in the Response.Body. However, if the user
// explicitly requested gzip it is not automatically
// uncompressed.
DisableCompression bool
What this means is: if you add an Accept-Encoding: gzip header to the request manually, then you have to handle the gzip response body yourself.
For example:
reader, err := gzip.NewReader(resp.Body)
if err != nil {
    log.Fatal(err)
}
defer reader.Close()

body, err := ioutil.ReadAll(reader)
if err != nil {
    log.Fatal(err)
}
fmt.Println(string(body))
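Putting that together, here is a rough end-to-end sketch of the manual case (the URL is a placeholder; imports needed: compress/gzip, fmt, io, io/ioutil, log, net/http):
// Explicitly asking for gzip disables the transparent decompression,
// so the body has to be unwrapped by hand.
req, err := http.NewRequest("GET", "https://example.com/data", nil)
if err != nil {
    log.Fatal(err)
}
req.Header.Set("Accept-Encoding", "gzip")

resp, err := http.DefaultClient.Do(req)
if err != nil {
    log.Fatal(err)
}
defer resp.Body.Close()

var r io.Reader = resp.Body
if resp.Header.Get("Content-Encoding") == "gzip" {
    gz, err := gzip.NewReader(resp.Body)
    if err != nil {
        log.Fatal(err)
    }
    defer gz.Close()
    r = gz
}

body, err := ioutil.ReadAll(r)
if err != nil {
    log.Fatal(err)
}
fmt.Println(string(body))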

Read response content from gorilla toolkit Client.get

I'm using the Gorilla Toolkit for Go to request a web resource (GET) and I want to process the response body, but I don't know how to access it. Here is my main.go:
package main

import (
    "log"

    "github.com/gorilla/http"
)

func main() {
    url := "http://ubuntu.com"
    status, h, r, err := http.DefaultClient.Get(url, nil)
    if err != nil {
        log.Fatal(err)
    }
    if r != nil {
        defer r.Close()
    }
    log.Printf("Status: %v", status)
    log.Printf("Headers: %v", h)

    var p []byte
    _, err = r.Read(p)
    if err != nil {
        log.Fatal(err)
    }
    log.Printf("MSG: %v", p)
}
Gorilla's response body is of type io.ReadCloser and I can't wrap my head around how to access it. Any help is appreciated.
Use ioutil.ReadAll to read the entire response body as a []byte:
status, h, r, err := http.DefaultClient.Get(url, nil)
if err != nil {
    log.Fatal(err)
}

var p []byte
if r != nil {
    p, err = ioutil.ReadAll(r)
    r.Close()
    if err != nil {
        log.Fatal(err)
    }
}
I suggest that you use the net/http client instead of the Gorilla client. There are more examples of how to use the net/http client, and it is better maintained.
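For reference, roughly the same request with the standard net/http client (a sketch; imports: io/ioutil, log, net/http):
resp, err := http.Get("http://ubuntu.com")
if err != nil {
    log.Fatal(err)
}
defer resp.Body.Close()

body, err := ioutil.ReadAll(resp.Body)
if err != nil {
    log.Fatal(err)
}
log.Printf("Status: %v", resp.Status)
log.Printf("MSG: %s", body)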

Using protobuf with golang and handling []byte HTTP response body

I am using the Golang protobuf package and am trying to write some tests to ensure my API works properly.
I construct an object on the server side with a generated .pb.go file, and return it with:
data, err := proto.Marshal(p)
fmt.Fprint(w, data)
And in my test I do
func TestGetProduct(t *testing.T) {
    log.Println("Starting server")
    go startAPITestServer()
    time.Sleep(0 * time.Second)
    log.Println("Server started")

    //rq, err := http.NewRequest("GET", "localhost:8181/product/1", nil)
    client := &http.Client{}

    log.Println("Starting Request")
    resp, err := client.Get("http://localhost:8181/product/1")
    log.Println("Finished Request")
    if err != nil {
        t.Log(err)
    }
    defer resp.Body.Close()

    log.Println("Reading Request")
    data, err := ioutil.ReadAll(resp.Body)
    log.Println("Reading finished")
    if err != nil {
        t.Log(err)
    }
    log.Println("HTTP Resp", data)

    p := &Product{}
    proto.UnmarshalText(string(data), p)
    proto.Unmarshal(data, p2)
}
The problem is that the HTTP request works and the server sends the []byte, but when I do ioutil.ReadAll on the response I get the textual representation of the []byte rather than the original binary data.
For example, the response body is the literal text
[12 3 2 14 41]
which ioutil.ReadAll reads as a string and not as the original []byte.
The problem was: I tried to write binary data to the output stream with fmt.Fprint, missing the important fact that the fmt package formats its input into a human-readable representation (i.e., strings).
The correct way of writing binary data to your HTTP response is to use the ResponseWriter directly, like this:
k, err := w.Write(data)
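For instance, the server-side handler could be written roughly like the sketch below (the handler name is made up; Product is the asker's generated message type):
func productHandler(w http.ResponseWriter, r *http.Request) {
    p := &Product{} // populate fields as needed
    data, err := proto.Marshal(p)
    if err != nil {
        http.Error(w, err.Error(), http.StatusInternalServerError)
        return
    }
    // Write the raw protobuf bytes directly; do not route them through fmt.
    w.Header().Set("Content-Type", "application/octet-stream")
    if _, err := w.Write(data); err != nil {
        log.Println(err)
    }
}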
