Serialize to JSON dynamic structure - go

All the examples of working with JSON show how to serialize simple or user-defined types (like a struct) to JSON.
But my case is different: a) I don't know the fields of my type/object in advance, and b) every object will have a different set of fields.
Here is my case in pseudocode:
while `select * from item` do
    while `select fieldname, fieldvalue from fields where fields.itemid = item.id` do
        ...
For each entity in my database I get field names and field values. In the result I need to get something like this:
{
    "item.field1": value,
    ...
    "item.fieldN": value,
    "custom_fields": {
        "fields.field1": value,
        ...
        "fields.fieldK": value
    }
}
What is the best way to do this in Go? Are there any useful libraries or functions in the standard library?
Update: The source of the data is a database. As the result I need to get the JSON as a string so I can POST it to an external web service. So the program just reads data from the database and makes POST requests to a REST service.

What exactly is your target type supposed to be? It can't be a struct since you do not know the fields beforehand.
The only fitting type, to me, seems to be a map of type map[string]interface{}; with it, any nested structure can be built:
a := map[string]interface{}{
    "item.field1": "val1",
    "item.field2": "val2",
    "item.fieldN": "valN",
    "custom_fields": map[string]interface{}{
        "fields.field1": "cval1",
        "fields.field2": "cval2",
    },
}
b, err := json.Marshal(a)
See playground sample here.
Filling this structure from the database, as you hinted at, has to be custom code; the encoding/json package only handles the final marshalling (a sketch follows the note below).
Note: custom_fields can also be of another type, depending on the type of the value column in the database. If the value column is a string, use map[string]string.
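A minimal sketch of that custom code, assuming database/sql and encoding/json are imported; the table and column names come from the pseudocode above (item, fields, fieldname, fieldvalue), while the item.name column and the buildItemJSON helper are made up for the example:

// buildItemJSON assembles one JSON document for a single item row.
func buildItemJSON(db *sql.DB, itemID int64) ([]byte, error) {
    doc := map[string]interface{}{}

    // Fixed columns of the item row (only "name" shown here).
    var name string
    if err := db.QueryRow("SELECT name FROM item WHERE id = ?", itemID).Scan(&name); err != nil {
        return nil, err
    }
    doc["item.name"] = name

    // Dynamic fields go into the nested "custom_fields" object.
    custom := map[string]interface{}{}
    rows, err := db.Query("SELECT fieldname, fieldvalue FROM fields WHERE itemid = ?", itemID)
    if err != nil {
        return nil, err
    }
    defer rows.Close()
    for rows.Next() {
        var fieldName, fieldValue string
        if err := rows.Scan(&fieldName, &fieldValue); err != nil {
            return nil, err
        }
        custom["fields."+fieldName] = fieldValue
    }
    if err := rows.Err(); err != nil {
        return nil, err
    }
    doc["custom_fields"] = custom

    return json.Marshal(doc)
}

The returned bytes can then be sent to the external service with something like http.Post(url, "application/json", bytes.NewReader(b)).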

Related

Implementing filters in ORM golang

I am working on an API which takes in parameters for filters (as given below):
/api/endpoint?filter_key_1=filter_value_1&...
I've previously worked with Spring, where the Criteria API allows dynamically building SQL queries without much hassle. In Go I'm using gorm for the ORM operations. Is there any way to build the queries with optional parameters without writing redundant code?
For example:
If the request sent is:
/api/endpoint?fk_1=fv_1&fk_2=fv_2&fk_3=fv_3
Query generated should be :
select * from table where fk_1 = fv_1 AND fk_2 = fv_2 AND fk_3 = fv_3
but in case of :
/api/endpoint?fk_1=fv_1
Query generated should be:
select * from table where fk_1 = fv_1
Currently my approach is to check if each variable is present and build the query as a string :
query := "select * from table where "
if fk_1 != "" {
    query += "fk_1 = fv_1"
}
// ... and so on
but this seems very awkward and error prone
Any help will be appreciated! Thanks
EDIT
Building on @bjornaer's answer, what helped me was to get the map[string][]string into a form that I can pass to gorm, i.e. map[string]interface{}.
This thread will help with that.
Now there's no longer a need for redundant checks or string operations in the filters.
So it seems to me your question has 2 parts: a) you need to retrieve your query values from your URL, and b) insert them into your db query.
I don't see how you are handling your requests, so let's assume you use the http package: from req.URL you get the URL object, and calling its Query() method yields a map[string][]string of your query parameters. With those in a variable URLQuery, let's pause and look at how you query with gorm:
db, err := gorm.Open(sqlite.Open("gorm.db"), &gorm.Config{
    QueryFields: true,
})
Here I open a SQLite database; then you can pass a variable reference to be filled by your query, for example:
result := db.Where(map[string]interface{}{"name": "jinzhu", "age": 20}).Find(&users)
Now, from the example above, substitute your own variable. Note that URLQuery is a map[string][]string, so it first has to be converted into the map[string]interface{} form that gorm accepts (a sketch of that conversion follows):
result := db.Where(filters).Find(&users)
You can find more in the gorm docs.
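A minimal sketch of that conversion, assuming single-valued query parameters; URLQuery, db and users are the variables from the snippets above, and filters is just an illustrative name:

// Convert the map[string][]string returned by req.URL.Query() into the
// map[string]interface{} that gorm's Where accepts.
filters := make(map[string]interface{}, len(URLQuery))
for key, values := range URLQuery {
    if len(values) > 0 {
        filters[key] = values[0] // take the first value of each parameter
    }
}

// Generates: SELECT * FROM table WHERE fk_1 = 'fv_1' AND fk_2 = 'fv_2' ...
result := db.Where(filters).Find(&users)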

How to read from Datastore using an Ancestor query and latest golang libraries

I want to read all entities from a Datastore kind (around 6 entities/records).
I have a Datastore that is keyed on a weird type that I am trying to understand. I can't find any uniqueness in the key to perform a query on.
The table looks like this:
[Screenshot: GCP Datastore entities representing the data I want to read into my Go app]
When I click on a record, it looks like this:
[Screenshot: the key literal exposed, used from here on out to try to get the records in the Go app]
I can perform an ancestor query in the console like this:
[Screenshot: the GCP Datastore queried using an ancestor query]
Great! So now I want to retrieve this data from my Go app. But how?
I see a lot of solutions online about using q.Get(...) // where q is a *Query struct
None of these solutions work for me because they import google.golang.org/appengine/datastore. I understand that this is legacy and deprecated, so I want a solution that imports cloud.google.com/go/datastore.
I tried something along these lines but didn't have much luck:
[Screenshot: first attempt, using GetAll and a query]
I tried this next:
[Screenshot: second attempt, using an ancestor query... not ready yet]
Lastly I tried to get a single record directly:
[Screenshot: third attempt, getting the record directly]
In all cases, my err is not nil and the dts value that should be populated by the Datastore query is nil.
Any guidance to help me understand how to query on this key type? Am I missing something fundamental with the way this table is keyed and queried?
Thank you
It seems you are just missing your Namespace
// Merchant struct
type MerchantDetails struct {
    MEID   string
    LinkTo *datastore.Key
    Title  string
}

// Struct slice to store the results in
var tokens []MerchantDetails

// Ancestor key to filter by
parentKey := datastore.NameKey("A1_1113", "activate", nil)
parentKey.Namespace = "Devs1"

// The call using the new datastore client. Basically query.Run(), but datastore.GetAll()
keys, err := helpers.DatastoreClient.GetAll(
    helpers.Ctx,
    datastore.NewQuery("A1_1112").Ancestor(parentKey).Namespace("Devs1"),
    &tokens,
)
if err != nil {
    return "", err
}

// Print all name/id from the found values
fmt.Printf("keys: %v", keys)

How to query in GraphQL with no fixed input object contents?

I want a query that returns a string, for which I have to provide an input object; however, the input data can have multiple optional keys and can grow quite large as well. So I wanted something like Object as a data type when defining the schema.
# example of supposed schema
input SampleInput {
    id: ID
    data: Object # such a thing does not exist; wanted something like this
}

type Query {
    myquery(input: SampleInput!): String
}
Here the data input can be quite large, so I do not want to define a type for it. Is there a way around this?

golang gorp insert multiple records

Using gorp, how can one insert multiple records efficiently? That is, instead of inserting one at a time, is there a batch insert?
type User struct {
    Name  string
    Email string
    Phone string
}

var users []User
users = buildUsers()

dbMap.Insert(users...) // this fails compilation
// I am forced to loop over users and insert one user at a time. Error handling omitted for brevity.
Is there a better mechanism with gorp? Driver is MySQL.
As I found out from another resource, the reason this doesn't work is that interface{} and User{} do not have the same layout in memory, therefore their slices aren't compatible types. The suggested solution was to convert []User into []interface{} in a for loop, as shown here: https://golang.org/doc/faq#convert_slice_of_interface
There is still one caveat: you need to use pointers for the DbMap.Insert() function.
Here's how I solved it:
s := make([]interface{}, len(users))
for i := range users {
    s[i] = &users[i] // address of the slice element, not of the loop variable
}
err := dbMap.Insert(s...)
Note that the pointer (&users[i]) is important, otherwise Insert will complain about non-pointers; taking the address of the loop variable instead would make every element point at the same value.
It doesn't look like gorp has a wrapper for either raw SQL or multi-value inserts (which are always SQL-dialect dependent).
Are you worried about speed or transactions? If not, I would just do the inserts in a for loop (see the sketch below).
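A minimal sketch of that loop, wrapped in a gorp transaction so a failed insert doesn't leave a partial batch; dbMap and users are the variables from the snippets above:

trans, err := dbMap.Begin()
if err != nil {
    return err
}
for i := range users {
    if err := trans.Insert(&users[i]); err != nil {
        trans.Rollback() // discard the whole batch on the first failure
        return err
    }
}
return trans.Commit()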

How to save struct based type with a map property into mongodb

I want to use MongoDB as session storage and save a struct-based data type into MongoDB.
The struct type looks like:
type Session struct {
    Id   string
    Data map[string]interface{}
}
Then I create a reference to the Session struct type and put some data into its properties, like:
type Authen struct {
    Name, Email string
}

a := &Authen{Name: "Foo", Email: "foo@example.com"}
s := &Session{}
s.Id = "555555"
s.Data = map[string]interface{}{} // the map must be initialized before use
s.Data["logged"] = a
How do I save the session data s into MongoDB, and how do I query the data back and load it into a reference again?
I think the problem occurs with the Data property of type map[string]interface{}.
As the driver for MongoDB I would use mgo.
There's nothing special to be done for inserts. Just insert that session value into the database as usual, and the map type will be properly inserted:
err := collection.Insert(&session)
Assuming the structure described, this will insert the following document into the database:
{id: "555555", data: {logged: {name: "foo", email: "foo@example.com"}}}
You cannot easily query it back like that, though, because the map[string]interface{} does not give the bson package a good hint about what the value type is (it will end up as a map instead of an Authen). To work around this, you'd need to implement the bson.Setter interface in the type used by the Data field (a sketch follows).
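A minimal sketch of such a setter, assuming the gopkg.in/mgo.v2/bson package and that the only key stored under Data is "logged" holding an Authen; the SessionData type is illustrative, not part of the original answer:

// SessionData replaces map[string]interface{} as the type of the Data field.
type SessionData map[string]interface{}

// SetBSON implements bson.Setter, so mgo decodes the "logged" entry into an
// Authen value instead of a generic map.
func (d *SessionData) SetBSON(raw bson.Raw) error {
    var decoded struct {
        Logged Authen `bson:"logged"`
    }
    if err := raw.Unmarshal(&decoded); err != nil {
        return err
    }
    *d = SessionData{"logged": decoded.Logged}
    return nil
}

The Session struct would then declare Data SessionData, and a query like collection.Find(bson.M{"id": "555555"}).One(&session) would fill it with a concrete Authen value.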