parsing query values to substruct using pgx - go

Prefacing this by saying I'm very new to Go and pgx. Is there a way to easily parse substruct values from a query?
So I do a Join query like:
rows, err := conn.Query(context.Background(),
	`SELECT
		rdr.id AS r_Id,
		rdr.field1 AS r_Field1,
		rdr.field2 AS r_Field2,
		contact.firstname AS c_FirstName,
		contact.lastname AS c_LastName,
		contact.id AS c_Id
	FROM rdr
	INNER JOIN contact ON rdr.contact = contact.id`)
Then my structs are
type Rdr struct {
	Id      int32
	Field1  int32
	Contact Contact
	Field2  time.Time
}
type Contact struct {
	Id        int
	FirstName string
	LastName  string
}
How can I parse the data returned in the rows into my Rdr struct? I would need to parse the correct fields into the Contact, ideally at the same time, but I don't see how to do that. I can't see how Scan would do it. We were looking at FieldDescriptions and Values, but it gets very messy. Is there an easy/common way to do this? I imagine it's a very common issue.

Related

Group by column and scan into custom struct

I have the following model. Each instance is part of a group which is only defined as the string GroupName here because the actual group is defined in a different service using a different database.
type Instance struct {
	gorm.Model
	UserID    uint
	Name      string `gorm:"index:idx_name_and_group,unique"`
	GroupName string `gorm:"index:idx_name_and_group,unique"`
	StackName string
	DeployLog string `gorm:"type:text"`
	Preset    bool
	PresetID  uint
}
I'd like to scan the above model into the following struct, thus grouping instances by their group name.
type GroupWithInstances struct {
	Name      string
	Instances []*model.Instance
}
I've been trying my luck with the following gorm code
var result []GroupWithInstances
err := r.db.
	Model(&model.Instance{}).
	Where("group_name IN ?", names).
	Where("preset = ?", presets).
	Group("group_name").
	Scan(&result).Error
indent, _ := json.MarshalIndent(result, "", "  ")
log.Println(string(indent))
But I'm getting the following error
ERROR: column "instances.id" must appear in the GROUP BY clause or be used in an aggregate function (SQLSTATE 42803)
I'm not sure how to deal with that since I don't want to group by instances but rather their groups.
The error indicates that your RDBMS is enforcing full GROUP BY semantics: you cannot select a field that isn't in the GROUP BY clause or used in an aggregate function (SUM, AVG, ...). There are two solutions:
Disable full GROUP BY mode (possible in MySQL, for example).
Modify the query.
Even if we go with Solution 1, gorm will throw another error about the relationship between GroupWithInstances and Instance.
So I think we should review the feature and go with Solution 2: select only the instances you need and do the grouping in application code.
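One way to do the grouping in application code: fetch the matching instances with a plain query (no GROUP BY) and bucket them by GroupName in Go. This is a sketch; the Instance struct is trimmed to the two relevant fields so the example is self-contained.

```go
package main

import "fmt"

// Trimmed-down versions of the models, just enough to show the grouping.
type Instance struct {
	Name      string
	GroupName string
}

type GroupWithInstances struct {
	Name      string
	Instances []*Instance
}

// groupByName buckets instances by GroupName, preserving first-seen order.
func groupByName(instances []*Instance) []GroupWithInstances {
	byName := map[string][]*Instance{}
	var order []string
	for _, in := range instances {
		if _, seen := byName[in.GroupName]; !seen {
			order = append(order, in.GroupName)
		}
		byName[in.GroupName] = append(byName[in.GroupName], in)
	}
	out := make([]GroupWithInstances, 0, len(order))
	for _, name := range order {
		out = append(out, GroupWithInstances{Name: name, Instances: byName[name]})
	}
	return out
}

func main() {
	// In real code these would come from something like:
	// db.Where("group_name IN ?", names).Find(&instances)
	instances := []*Instance{
		{Name: "a", GroupName: "g1"},
		{Name: "b", GroupName: "g2"},
		{Name: "c", GroupName: "g1"},
	}
	for _, g := range groupByName(instances) {
		fmt.Println(g.Name, len(g.Instances))
	}
}
```

This avoids the GROUP BY restriction entirely, since the database only does the filtering and the grouping happens in memory.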

insert rows with sqlx, is it possible to avoid listing the fields explicitly?

I'm using sqlx library like the following:
type MyData struct {
	Field1 string `db:"field1"`
	Field2 string `db:"field2"`
}
...
res, err := db.Exec("INSERT INTO XXX (field1, field2) VALUES (?, ?)", data.XX, data.XX)
Is it possible to avoid specifying "field1, field2" explicitly, and let sqlx create the query and binding automatically, e.g
db.Exec2("table_name", data)
I didn't find a relevant method to do that.
No. The package sqlx is just a wrapper around the DB driver. It is not an ORM. It expects user-supplied, valid SQL query strings. You may construct the query strings programmatically, and design some method that builds the column names from struct tags using the reflect package, but why reinvent the wheel? The advantage of using a lower-level package such as sqlx is exactly that you maintain your own queries in exchange for not introducing an ORM framework into your codebase. With an ORM it is the other way around.
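If you do want to build the column list from struct tags yourself, a small reflect-based helper is enough. A stdlib-only sketch (the table name and the `?` placeholder style are assumptions; Postgres would need `$1, $2, ...`):

```go
package main

import (
	"fmt"
	"reflect"
	"strings"
)

type MyData struct {
	Field1 string `db:"field1"`
	Field2 string `db:"field2"`
}

// insertQuery builds an INSERT statement and its args from `db` struct tags.
// v must be a struct value; fields without a db tag are skipped.
func insertQuery(table string, v any) (string, []any) {
	t := reflect.TypeOf(v)
	val := reflect.ValueOf(v)
	var cols, marks []string
	var args []any
	for i := 0; i < t.NumField(); i++ {
		tag := t.Field(i).Tag.Get("db")
		if tag == "" || tag == "-" {
			continue
		}
		cols = append(cols, tag)
		marks = append(marks, "?")
		args = append(args, val.Field(i).Interface())
	}
	q := fmt.Sprintf("INSERT INTO %s (%s) VALUES (%s)",
		table, strings.Join(cols, ", "), strings.Join(marks, ", "))
	return q, args
}

func main() {
	q, args := insertQuery("my_table", MyData{Field1: "foo", Field2: "bar"})
	fmt.Println(q, args)
}
```

The resulting query and args can then be passed straight to db.Exec.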
Gorm instead has a set of conventions that do what you want out of the box. You wouldn't even have to specify the db tags:
GORM prefer convention over configuration, by default, GORM uses ID as primary key, pluralize struct name to snake_cases as table name, snake_case as column name, and uses CreatedAt, UpdatedAt to track creating/updating time
type MyData struct {
	Field1 string `json:"field1"`
	Field2 string `json:"field2"`
}
data := MyData{
	Field1: "foo",
	Field2: "bar",
}
result := db.Create(&data)
// results in something like this:
// INSERT INTO my_datas (field1, field2) VALUES ('foo', 'bar')

"Creation At" time in GORM Customise Join table

I am trying to customize a many2many table join. I have two tables from which I want to take the IDs, and I want another field that will tell me when the entry in the join table was made. The IDs are coming through fine, but created_at is not updating and shows NULL instead of the time.
// this is the table join struct which I want to make
type UserChallenges struct {
	gorm.JoinTableHandler
	CreatedAt   time.Time
	UserID      int
	ChallengeID int
}
// hook before create
func (UserChallenges) BeforeCreate(db *gorm.DB) error {
	db.SetJoinTableHandler(&User{}, "ChallengeId", &UserChallenges{})
	return nil
}
This is not giving any error on the build. Please tell me what I am missing so that I can get the creation time field in this.
PS - The documentation of GORM on gorm.io is still showing SetupJoinTable method but it is deprecated in the newer version. There is a SetJoinTableHandler but there is no documentation available for it anywhere.
The thing to get about using a Join Table model is that if you want to access fields inside the model, you must query it explicitly.
That is, using db.Model(&User{ID: 1}).Association("Challenges").Find(&challenges) or db.Preload("Challenges").Find(&users), etc. will just give you collections of the associated struct, and in those there is no place to put the extra fields!
For that you would do:
joins := []UserChallenges{}
db.Where("user_id = ?", user.ID).Find(&joins)
// now joins contains all the records in the join table pertaining to user.ID,
// you can access joins[i].CreatedAt for example.
If you wanted also to retrieve the Challenges with that, you could modify your join struct to integrate the BelongsTo relation that it has with Challenge and preload it:
type UserChallenges struct {
	UserID      int `gorm:"primaryKey"`
	ChallengeID int `gorm:"primaryKey"`
	Challenge   Challenge
	CreatedAt   time.Time
}
joins := []UserChallenges{}
db.Where("user_id = ?", user.ID).Joins("Challenge").Find(&joins)
// now joins[i].Challenge is populated

Decode spanner row fields to nested structs golang

I have two structs
type Row struct {
	ID      string
	Status  string
	Details Details
}
type Details struct {
	SessionID string
	Location  string
	Project   string
}
And I get data like this
SELECT a.ID, a.Status, b.SessionID, b.Location, b.Project FROM table1 AS a LEFT JOIN table2 AS b ON a.ID = b.SessionID
Now, to get all the selected info into a row, I need to change the struct and add the fields directly instead of as a struct (Details), which essentially duplicates fields between Row and Details (which I need for other purposes as well).
type Row struct {
	ID        string
	Status    string
	SessionID string
	Location  string
	Project   string
}
r := Row{}
spanner.row.ToStruct(&r) // this works
but is there a simpler way to get the data without having to duplicate the fields in the struct or specify each field in spanner.row.Column? I read that row.ToStruct does not support a destination struct that has a struct as a field, since that's not a supported column type, but what's a better workaround?
Thanks!
I have not worked with the Cloud Spanner client directly, but from the Go language point of view, how about embedding the structs?
i.e.:
type Row struct {
	ID     string
	Status string
	Details
}
type Details struct {
	SessionID string
	Location  string
	Project   string
}
r := Row{}
spanner.row.ToStruct(&r)

Jmoiron SQLX Golang common interface

I am new to Go and am using the jmoiron/sqlx package for querying a Postgres database (select queries). The way I am doing it is to build an SQL string and call the Select(dest interface{}, query string, args ...interface{}) method. While it works well, the problem is that I am generating my SQL string dynamically, and as such the destination structure needs to be different for each response.
For ex : - One query can be
Select a,b,c,d from table A ;
the other can be
Select x,y,z from Table B;
From what I understand, there should be two different structs defined for the Select method to work, i.e.:
type Resp1 struct {
	A string
	B string
	C string
	D string
}
And,
type Resp2 struct {
	X string
	Y string
	Z string
}
And then invoke Select as db.Select(&resp1, query, args) and db.Select(&resp2, query, args).
I am wondering if it's possible for me to define a common struct, say:
type Resp3 struct {
	A string
	B string
	C string
	D string
	X string
	Y string
	Z string
}
and, based on my select query, populate only the matching columns (i.e., only a, b, c, d for the first and x, y, z for the second).
I tried searching but couldn't get any leads.
I could not get an answer to this here, and since I needed it, I dug into it myself and found an efficient way to solve it.
To solve this, define all your string values as sql.NullString, integers as sql.NullInt64, floats as sql.NullFloat64, etc.
So say your table has columns a, b, c, d, e, f, and for some response you only have to display a, b, d, and for some other d, e, and so on. Instead of creating different structures and mapping them in the db.Select(...) statement, just define your struct as follows:
type Resp3 struct {
	A sql.NullString  `json:"a,omitempty"` // the key can be whatever you wish
	B sql.NullString  `json:"b,omitempty"`
	C sql.NullString  `json:"c,omitempty"`
	D sql.NullInt64   `json:"d,omitempty"`
	E sql.NullFloat64 `json:"e,omitempty"`
}
Remember that sql.NullString is marshalled to JSON as an object with an additional Valid boolean key. You can follow the approach in "How can I work with sql NULL values and JSON in Golang in a good way?" to fix that.
Hope this is helpful to someone!
Usually your struct should represent all the fields of the SQL table, not only the fields you are fetching in the SELECT, so you can just do SELECT * FROM ... and deserialize the response from the db into your struct Resp3.
