How to view mgo's bson.Raw in "pretty" text - go

I want to be able to output raw bson data I am getting from golang's mgo library to the console for debugging purposes, but have been unable to find out how to accomplish this.
With JSON I do it like this:
formattedData, err := json.MarshalIndent(rawData, "", " ")
if err != nil {
    log.Print(err)
}
fmt.Printf("%s", formattedData)
Is there an equivalent way to do this with BSON?

BSON is a binary format; the raw value is just a slice of bytes. It is not human-readable on its own, since the format stores field lengths and other bookkeeping and packs the data very compactly. It is already encoded, so there is no need to marshal it again.
You can print it as it is, but it will not be readable.
See the bson spec here: http://bsonspec.org/#/specification
If you want to see all contents of the bson, you could Unmarshal it into a map:
m := map[string]interface{}{}
if err := rawData.Unmarshal(&m); err != nil {
    log.Print(err)
}
fmt.Printf("%+v\n", m)

Related

Invalid unmarshalling of proto

I am trying to decode proto data. But the proto is not properly decoded.
This is how I am doing:
decodedStr, err := base64.StdEncoding.DecodeString(request.Body)
if err != nil {
    panic("malformed input")
}
data := &tracepb.ExportTraceServiceRequest{}
if err := proto.Unmarshal(decodedStr, data); err != nil {
    log.Fatalln("Failed to parse:", err)
}
log.Printf("Response - %v", data)
The output is like this:
Response - resource_spans:{resource:{attributes:{key:"service.name" value:{string_value:"node_app"}} attributes:{key:"telemetry.sdk.language" value:{string_value:"nodejs"}} attributes:{key:"telemetry.sdk.name" value:{string_value:"opentelemetry"}} attributes:{key:"telemetry.sdk.version" value:{string_value:"1.8.0"}} attributes:{key:"process.pid" value:{int_value:1}} attributes:{key:"process.executable.name" value:{string_value:"node"}} attributes:{key:"process.command" value:{string_value:"/usr/app/index.js"}} attributes:{key:"process.command_line" value:{string_value:"/usr/local/bin/node /usr/app/index.js"}} attributes:{key:"process.runtime.version" value:{string_value:"18.13.0"}} attributes:{key:"process.runtime.name" value:{string_value:"nodejs"}} attributes:{key:"process.runtime.description" value:{string_value:"Node.js"}}} scope_spans:{scope:{name:"#opentelemetry/instrumentation-express" version:"0.32.0"} spans:{trace_id:"\xb5\x81\x91\x8b\x02\x9a/\xf1\x08\x06\xaf~\xea\x9fQ\xc0" span_id:"T\x06\x89m\x1ex\xf9A" parent_span_id:"?\xbc\x18`O\xa5\xb8\xe1" name:"middleware - query" kind:SPAN_KIND_INTERNAL start_time_unix_nano:1673434036590614272 end_time_unix_nano:1673434036590671104 attributes:{key:"http.route" value:{string_value:"/"}} attributes:{key:"express.name" value:{string_value:"query"}} attributes:{key:"express.type" value:{string_value:"middleware"}} status:{}} spans:{trace_id:"\xb5\x81\x91\x8b\x02\x9a/\xf1\x08\x06\xaf~\xea\x9fQ\xc0" span_id:"\xd5c\xf7>\xf6Cxz" parent_span_id:"?\xbc\x18`O\xa5\xb8\xe1" name:"middleware - expressInit" kind:SPAN_KIND_INTERNAL start_time_unix_nano:1673434036590760704
Not sure why traceId is shown like this:
spans:{trace_id:"\xb5\x81\x91\x8b\x02\x9a/\xf1\x08\x06\xaf~\xea\x9fQ\xc0"
I am new to Go. Any help would be greatly appreciated.
The trace_id field appears to contain the ID as binary data instead of hex. The generated proto String method renders that binary data as a string, so the non-printable bytes are displayed as ASCII escape sequences.
If you want to display the data in another format (e.g. hex), you will need to implement your own function to render the proto.
I used the encoding/hex package's hex.EncodeToString() function to convert the bytes to hex.
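As a sketch of that last step, hex.EncodeToString from the standard encoding/hex package converts a byte slice to a hex string. This assumes span is one of the decoded *tracepb.Span values (TraceId and SpanId are []byte fields in the generated code):

// Convert the raw []byte IDs to hex strings for readable logging.
traceID := hex.EncodeToString(span.TraceId)
spanID := hex.EncodeToString(span.SpanId)
log.Printf("trace_id=%s span_id=%s", traceID, spanID)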

Retrieval after serialization to disk using gob

I have been learning about databases and wanted to implement one as well for learning purposes and not for production. I have a defined schema:
type Row struct {
    ID       int32
    Username string
    Email    string
}
Currently I am able to encode structs of this type to a file in an append-only manner.
// Just to show I use a file for the encoding; it has missing details.
func NewEncoder(db *DB) *gob.Encoder {
    return gob.NewEncoder(db.File)
}

func SerializeRow(r Row, encoder *gob.Encoder, db *DB) {
    err := encoder.Encode(r)
    if err != nil {
        log.Println("encode error:", err)
    }
}
Now, it's relatively easy to mimic a "select" statement by simply decoding the entire file with the decoder's Decode method:
func DeserializeRow(decoder *gob.Decoder, db *DB) {
    var row Row
    db.File.Seek(0, 0)
    err := decoder.Decode(&row)
    for err == nil {
        fmt.Printf("%d %s %s\n", row.ID, row.Username, row.Email)
        err = decoder.Decode(&row)
    }
    if err != nil && err != io.EOF {
        log.Println("decode error:", err)
    }
}
My current issue is, I want to be able to retrieve specific rows based on ID. I know SQLite uses 4KB paging, in the sense that serialized rows occupy a "page" (i.e. 4KB) until a page can't hold them anymore, then another is created. How do I mimic such behaviour using gob in the most simplistic and idiomatic way?
Seen: I have seen this and this
A Gob stream may contain type definitions and decoding instructions, so you can't seek a Gob stream. You can only read it from the beginning up to the point you find what you need.
A Gob stream is completely unsuitable for a database storage format in which you need to skip elements.
You could create a new encoder and serialize each record separately, in which case you could skip elements (by maintaining a file index storing which record starts at which position), but it would be terribly inefficient and redundant (as described in the linked answer, the speed and storage cost amortizes as you write more values of the same type, and always creating new encoders loses this gain).
A much better approach would be to not use encoding/gob for this, but rather define your own format. To efficiently support searches (select), you have to build some kind of index on the searchable columns / fields, else you still need to perform a full table scan.
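A rough sketch of one such custom format, under these assumptions: each record is length-prefixed JSON appended to the file, and an in-memory map from ID to file offset serves as the index (Row is the struct from the question; imports assumed: encoding/binary, encoding/json, fmt, io, os). This is illustrative, not a production storage engine:

// AppendRow writes one length-prefixed record and remembers its offset.
func AppendRow(f *os.File, index map[int32]int64, r Row) error {
    off, err := f.Seek(0, io.SeekEnd) // append at the end, remember where
    if err != nil {
        return err
    }
    body, err := json.Marshal(r)
    if err != nil {
        return err
    }
    var length [4]byte
    binary.BigEndian.PutUint32(length[:], uint32(len(body)))
    if _, err := f.Write(length[:]); err != nil {
        return err
    }
    if _, err := f.Write(body); err != nil {
        return err
    }
    index[r.ID] = off
    return nil
}

// ReadRow seeks straight to the record's offset and decodes only that record.
func ReadRow(f *os.File, index map[int32]int64, id int32) (Row, error) {
    var r Row
    off, ok := index[id]
    if !ok {
        return r, fmt.Errorf("id %d not found", id)
    }
    if _, err := f.Seek(off, io.SeekStart); err != nil {
        return r, err
    }
    var length [4]byte
    if _, err := io.ReadFull(f, length[:]); err != nil {
        return r, err
    }
    body := make([]byte, binary.BigEndian.Uint32(length[:]))
    if _, err := io.ReadFull(f, body); err != nil {
        return r, err
    }
    return r, json.Unmarshal(body, &r)
}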

Get Json from url and parse into a multidimensional array without struct [duplicate]

How do I parse JSON from a URL into a multidimensional array without a struct?
Is this even possible to do in Go?
I've seen a lot of different answers on Stack Overflow and other sites, but not one without a struct.
Of course this is possible in Go, however without using structs it can become quite cumbersome.
If you already know the depth of the resulting array, you can define your slice using the empty interface (e.g. for a depth of 3):
var decoded [][][]interface{}
If you don't know the depth, use a normal []interface{} instead and combine it with type assertions.
After that, a normal json.Unmarshal call will produce the desired result:
err := json.Unmarshal(respData, &decoded)
if err != nil {
    panic(err)
}
Example link
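For the unknown-depth case mentioned above, you decode into a plain interface{} and walk it with a type switch. A minimal, self-contained sketch (the nested arrays of numbers are made up for illustration; imports assumed: encoding/json, fmt):

var decoded interface{}
if err := json.Unmarshal([]byte(`[[1,2],[3,[4,5]]]`), &decoded); err != nil {
    panic(err)
}

// walk prints every leaf value, recursing into nested arrays.
var walk func(v interface{})
walk = func(v interface{}) {
    switch t := v.(type) {
    case []interface{}:
        for _, e := range t {
            walk(e)
        }
    default:
        fmt.Println(t)
    }
}
walk(decoded)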
Now let’s look at decoding JSON data into Go values. Here’s an example for a generic data structure.
byt := []byte(`{"num":6.13,"strs":["a","b"]}`)
We need to provide a variable where the JSON package can put the decoded data. This map[string]interface{} will hold a map of strings to arbitrary data types.
var dat map[string]interface{}
Here’s the actual decoding, and a check for associated errors.
if err := json.Unmarshal(byt, &dat); err != nil {
    panic(err)
}
fmt.Println(dat)
In order to use the values in the decoded map, we’ll need to convert them to their appropriate type. For example here we convert the value in num to the expected float64 type.
num := dat["num"].(float64)
fmt.Println(num)
Accessing nested data requires a series of conversions.
strs := dat["strs"].([]interface{})
str1 := strs[0].(string)
fmt.Println(str1)
See: https://gobyexample.com/json
You can work with any type by using a type assertion: data.(YourType).

How to output json format errors in Golang rest, specifically referencing the bad fields

I have the following requirement: return errors from a REST API in the following format:
Error format
422
{
  "name-of-field": [
    "can't be blank",
    "is too silly"
  ]
}
My code looks like this:
var PostFeedback = func(w http.ResponseWriter, r *http.Request) {
    params := mux.Vars(r)
    surveyId := params["id"]
    feedback := &models.Feedback{}
    err := json.NewDecoder(r.Body).Decode(feedback)
    if err != nil {
        jsonError := fmt.Sprintf(`{
"%s": [
"%s"
]
}`, "errors", err)
        log.Printf("invalid input format, %v", jsonError)
        resp := map[string]interface{}{"error": jsonError}
        u.Respond(w, resp)
        return
    }
Questions:
How do I get the names of the offending fields?
How do I satisfy the requirement best?
The encoding/json package doesn't provide validation for "blank" or "silly" values. It will return an error only if the data in the body is not valid JSON, or if the field types in the JSON do not, according to the package's spec, match the field types of the structure into which you're trying to decode it.
The 1st type of error would be json.SyntaxError. If you get this, it is not always possible to satisfy your requirements: there may be no actual fields you could use in your response, or the fields and their values may be perfectly valid JSON while the cause of the error lies elsewhere (see example).
In cases where the data holds actual JSON fields but has, for example, non-JSON values, you could use the Offset field of the SyntaxError type to find the closest preceding field in the data stream. Using strings.LastIndex you can implement a naive solution that looks backwards for the field.
data := []byte(`{"foobar": i'm not json}`)
err := json.Unmarshal(data, &T{})
se, ok := err.(*json.SyntaxError)
if !ok {
    panic(err)
}

field := string(data[:se.Offset])
if i := strings.LastIndex(field, `":`); i >= 0 {
    field = field[:i]
    if j := strings.LastIndex(field, `"`); j >= 0 {
        field = field[j+1:]
    }
}
fmt.Println(field) // outputs foobar
Playground link
NOTE: As you can see, for you to be able to look for the field, you need to have access to the data, but when you're using json.NewDecoder and passing it the request's body directly, without first storing its contents somewhere, you'll lose access to that data once the decoder's Decode method is done. This is because the body is a stream of bytes wrapped in an io.ReadCloser that does not support "rewinding", i.e. you cannot re-read bytes that the decoder already read. To avoid this you can use ioutil.ReadAll to read the full contents of the body and then json.Unmarshal to do the decoding.
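A rough sketch of that approach, keeping the body bytes around so the Offset trick above still works (feedback and models.Feedback are from the question's code; ioutil.ReadAll is io.ReadAll in newer Go; error handling is trimmed):

body, err := ioutil.ReadAll(r.Body)
if err != nil {
    // handle the read error
}
feedback := &models.Feedback{}
if err := json.Unmarshal(body, feedback); err != nil {
    if se, ok := err.(*json.SyntaxError); ok {
        // body is still in memory, so se.Offset can be used to look
        // backwards for the closest preceding field name as shown above
        _ = body[:se.Offset]
    }
}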
The 2nd type of error would be the json.UnmarshalTypeError. If you look at the documentation of the error type and its fields you'll know that all you need to do is to type assert the returned value and you're done. Example
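A small sketch of that assertion; UnmarshalTypeError's Field, Type and Value describe which struct field the offending JSON value was being decoded into (the "rating" example in the comment is made up):

if err := json.Unmarshal(body, feedback); err != nil {
    if ute, ok := err.(*json.UnmarshalTypeError); ok {
        // e.g. "rating: expected int, got string"
        log.Printf("%s: expected %s, got %s", ute.Field, ute.Type, ute.Value)
    }
}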
Validation against "blank" and "silly" values would be done after the json has been successfully decoded into your structure. How you do that is up to you. For example you could use a 3rd party package that's designed for validating structs, or you can implement an in-house solution yourself, etc. I actually don't have an opinion on which one of them is the "best" so I can't help you with that.
What I can say is that the most basic approach would be to simply look at each field of the structure and check if its value is valid or not according to the requirements for that field.
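As a minimal sketch of that basic approach (the Comment field on models.Feedback is an assumption invented for the example), you can collect per-field messages into the map shape from the requirement and return it with a 422:

errs := map[string][]string{}
if feedback.Comment == "" { // hypothetical field; adjust to the real struct
    errs["comment"] = append(errs["comment"], "can't be blank")
}
if len(errs) > 0 {
    w.Header().Set("Content-Type", "application/json")
    w.WriteHeader(http.StatusUnprocessableEntity) // 422
    json.NewEncoder(w).Encode(errs)
    return
}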

Arbitrary JSON data structure in go

I'm building an http api and every one of my handlers returns JSON data, so I built a wrapper function that handles the JSON marshalling and http response (I've included the relevant section from the wrapper as well as one of the sample handlers below).
What is the best way to pass arbitrarily nested structs (the structs also contain an arbitrary number and types of fields)? Right now I've settled on a map with string keys and interface{} values. This works, but is this the most idiomatic Go way to do it?
result := make(map[string]interface{})
customerList(httpRequest, &result)
j, err := json.Marshal(result)
if err != nil {
    log.Println(err)
    errs := `{"error": "json.Marshal failed"}`
    w.Write([]byte(errs))
    return
}
w.Write(j)
func customerList(req *http.Request, result *map[string]interface{}) {
    data, err := database.RecentFiftyCustomers()
    if err != nil {
        (*result)["error"] = stringifyErr(err, "customerList()")
        return
    }
    (*result)["customers"] = data // data is a slice of arbitrarily nested structs
}
If you do not know in advance what types, what structure and which nesting you get, there is no option but to decode it into something generic like map[string]interface{}. So nothing "idiomatic" or "non-idiomatic" here.
(Personally I'd try to somehow fix the structs and not have "arbitrary" nestings, and combinations.)
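As an illustration of that preference (the type and field names here are invented for the example, not taken from the question), a fixed response struct keeps the JSON shape explicit and lets the compiler check it:

// Illustrative only: a typed response instead of map[string]interface{}.
type customerListResponse struct {
    Customers []Customer `json:"customers,omitempty"`
    Error     string     `json:"error,omitempty"`
}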
