Why are unexported struct fields purposely not marshalled in the json package - go

In the json package, if you want to use tags to marshal and unmarshal structs, the fields must be exported. I always thought this was a limitation of the reflect package, but it seems to be explicitly rejected here.
I would like to know what I am missing, or the design choices behind this, as it normally creates one of the following issues.
Exporting more fields than I want, allowing users to change values without going through potential getters and setters.
A lot of extra code if you choose to solve the above problem by making a second, identical but unexported struct just for tag usage.
Here is a playground of code being able to read unexported tags.
Edit: Better playground thanks to #mkopriva

Related

Gorm relationship and issues

I was creating my first-ever REST API in golang with fiber and gorm. I wanted to create a model that had a slice of strings, but gorm does not allow me to do so. So the next thing I tried was to use a map, hoping that it would be easily converted to JSON and saved to my postgres instance. But again, gorm does not support maps. So I created another struct into which I put all the data in a not-so-elegant way: I made a single string field for each possible string I can save, and then embedded this struct into the other. But now the compiler complains that I have to save a primary key into it, and not the raw JSON given from the request. I am a bit overwhelmed by now.
If someone knows a way that I can save all the data I need in a way that respects my requirements (slice of strings, easy to parse when I read from the database), and finish this CRUD app, I would really be thankful for that. Thank you a lot.

What's the best way to validate an untagged Go struct

Whenever I create a Go struct, I use struct tags so that I can easily validate it using go-playground/validator - for example:
type DbBackedUser struct {
    Name sql.NullString `validate:"required"`
    Age  sql.NullInt64  `validate:"required"`
}
I'm currently integrating with a third party. They provide a Go SDK, but the structs in the SDK don't contain validate tags, and they are unable/unwilling to add them.
Because of my team's data integrity requirements, all of the data pulled in through this third-party SDK has to be validated before we can use it.
The problem is that there doesn't seem to be a way to validate a struct using the Validator package without defining validation tags on the struct.
Our current proposed solution is to map all of the data from the SDK's structs into our own structs (with validation tags defined on them), perform the validation on those, and then, if it passes, use the data directly from the SDK's structs (we need to pass the data around our system using the structs defined by the SDK, so the tagged structs would be used only for validation). The problem here is that we would then have to maintain structs that exist only for validation; the mapping between structs also has both time and memory costs.
Is there a better way of handling struct validation in this scenario?
An alternative technique is to use validating constructors: NewFoobar(raw []byte) (Foobar, error)
But I see that go-playground/validator supports your case with "struct-level validation":
https://github.com/go-playground/validator/blob/master/_examples/struct-level/main.go

Golang gRPC database serialization key format defined on struct

I want to use the Go structs that are generated by the gRPC compiler directly for database transactions, but the problem is that only the json serialization tag is set by gRPC.
Is there a way to either set additional serialization keys (like shown below), or is there another Go-specific way to tell the database driver (sqlx on top of database/sql) that the json key format should be used?
Some example - The gRPC compiler creates the following struct:
type HelloWorld struct {
    TraceId string `protobuf:"bytes,1,opt,name=trace_id,json=traceId,proto3" json:"trace_id,omitempty"`
    ...
}
What I would like to have:
type HelloWorld struct {
    TraceId string `db:"trace_id" protobuf:"bytes,1,opt,name=trace_id,json=traceId,proto3" json:"trace_id,omitempty"`
    ...
}
A temporary workaround would be to write SQL queries that use aliases (traceid instead of trace_id in this example), but it doesn't feel consistent and adds a lot of complexity.
I think that currently there is no built-in way of doing this. However, you might be interested in following this thread: https://github.com/golang/protobuf/issues/52
Other than that, I think you can just create yet another struct for database access and make the mapping explicit, which might be more readable.

Get a value of an interface using reflect

I'm trying to get a value from an interface (it should be a pointer to another struct).
Currently I have to know the concrete type of the pointer to access its data, and that bothers me. That's why I would like to use the reflect package to do it better.
I can display the type using reflect.TypeOf(event.Event).String(), but I don't understand how to change this line in the example without specifying something like .(*marathon.EventDeploymentStepSuccess)

Get names of structs that implement an interface or inherit a struct

Is it possible to get a slice of strings that represent the names of all types that implement an interface or inherit from a specific struct in a specific package using reflection?
After some research in the reflect package's docs, I don't think it's possible. That's not the way reflection works in Go: since the interface mechanism is not declarative (but duck-typed instead), there is no such list of types.
That said, you may have more luck using the ast package to parse your project, get the list of types, and check whether or not they implement an interface, then write some code to give you the said slice. That would add a step to compilation, but could work like a charm.
AFAIK, you can't do this with reflect, since packages are kinda out of reflect's scope.
You can do this the same way godoc's static analysis works. That is, using code.google.com/p/go.tools/go/types to parse the package's source code and get the type info.
The go oracle can do this. https://godoc.org/code.google.com/p/go.tools/oracle
Here is the relevant section of the user manual.
