Gorm WHERE clause not working on Preload data - go

The query shown after the struct definitions below works, because its WHERE clause only uses values from the ResourceUsage struct. But I would love to be able to do something like this, where the WHERE clause uses a value from the ResourceMetadata struct:
db.Preload("ResourceMetadata").Where("resource_type = ?", resourceType).Where("timestamp BETWEEN ? AND ?", start, end).Limit(10).Find(&resourceUsage)
It throws this error:
[2019-12-03 11:06:12] Error 1054: Unknown column 'resource_type' in 'where clause'
Code:
// ResourceUsage describes the storage model for resource-usage
type ResourceUsage struct {
    ID               int64             `json:"-"`
    DetailsID        int64             `json:"details_id"`
    ResourceMetadata *ResourceMetadata `gorm:"foreignkey:DetailsID;association_foreignkey:ID"`
    MeasuredType     string            `json:"measured_type"`
    Quantity         float64           `json:"quantity"`
    Timestamp        int64             `json:"timestamp"`
}
// ResourceMetadata describes the storage model for resource metadata
type ResourceMetadata struct {
    ID                  int64  `json:"-"`
    ResourceUUID        string `json:"resource_uuid"`
    ResourceName        string `json:"resource_name"`
    ResourceDisplayName string `json:"resource_display_name"`
    ResourceType        string `json:"resource_type"`
}
db.Preload("ResourceMetadata").Where("timestamp BETWEEN ? AND ?", start, end).Limit(10).Find(&resourceUsage)

Can you use a JOIN? Preload runs a separate query for the association, so the main query on resource_usages has no resource_type column to filter on; a join brings resource_metadata into that main query:
resourceUsages := []ResourceUsage{}
if err := db.Joins("JOIN resource_metadata ON resource_metadata.id = details_id").
    Where("resource_metadata.resource_type = ? AND timestamp BETWEEN ? AND ?", resourceType, start, end).
    Preload("ResourceMetadata").Limit(10).
    Find(&resourceUsages).Error; err != nil {
    // ... handle error ...
}
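If you have since moved to GORM v2, a joined eager load can do this in one query instead of JOIN + Preload. This is only a sketch under that assumption: it also assumes the association tags are updated to v2 form (e.g. gorm:"foreignKey:DetailsID;references:ID"); the joined table is aliased with the association name, and the backtick quoting below is MySQL-style.
resourceUsages := []ResourceUsage{}
err := db.Joins("ResourceMetadata").
    Where("`ResourceMetadata`.resource_type = ?", resourceType).
    Where("timestamp BETWEEN ? AND ?", start, end).
    Limit(10).
    Find(&resourceUsages).Error
Here Joins("ResourceMetadata") performs the LEFT JOIN and populates the association in the same query, so a separate Preload is not needed.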

Related

Unable to scan type of uuid.UUID into UUID in sqlmock Golang

I have the following test function.
func (s *Suite) Test_delete() {
    var (
        id       = uuid.New()
        name     = "test"
        quantity = 2
    )
    s.mock.ExpectQuery(regexp.QuoteMeta(
        `SELECT * FROM "items" WHERE code = $1 AND "items"."deleted_at" IS NULL`,
    )).
        WithArgs(id).
        WillReturnRows(sqlmock.NewRows([]string{"code", "name", "quantity"}).
            AddRow(id, name, quantity))
    err := s.repository.DeleteItem(id.String())
    require.NoError(s.T(), err)
}
The problem is scanning the id variable into the row for the code column. In my Item struct, I have Code defined as follows.
type Item struct {
    Code      uuid.UUID      `gorm:"type:uuid;primaryKey" json:"code"`
    Name      string         `json:"name" validate:"required"`
    Quantity  int            `json:"qty" validate:"required,min=0"`
    DeletedAt gorm.DeletedAt `json:"-" gorm:"index"`
}
func (i *Item) BeforeCreate(tx *gorm.DB) (err error) {
    if i.Code == uuid.Nil {
        i.Code = uuid.New()
    }
    return
}
Now the problem is that when I run the test function, it is somehow unable to scan uuid.UUID into UUID, even though they are both the same type. Here is the exact error message:
sql: Scan error on column index 0, name "code": Scan: unable to scan type uuid.UUID into UUID
Could someone help me on this part?
I assume there is a type casting error. Changing
WillReturnRows(sqlmock.NewRows([]string{"code", "name", "quantity"}).AddRow(id, name, quantity))
to
WillReturnRows(sqlmock.NewRows([]string{"code", "name", "quantity"}).AddRow(id.String(), name, quantity))
may help.
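In context, that change amounts to returning the UUID as a string in the mocked row, which google/uuid's Scan can handle. A sketch of the adjusted expectation (untested):
s.mock.ExpectQuery(regexp.QuoteMeta(
    `SELECT * FROM "items" WHERE code = $1 AND "items"."deleted_at" IS NULL`,
)).
    WithArgs(id).
    WillReturnRows(sqlmock.NewRows([]string{"code", "name", "quantity"}).
        // A string (or []byte) value can be scanned into a uuid.UUID; a raw uuid.UUID cannot.
        AddRow(id.String(), name, quantity))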

Query JSONB columns with GORM for operator in

I have a table like below
INSERT INTO switches VALUES ('33', 60, jsonb_build_object('IDs',jsonb_build_array('11', '2'),'ID', '33', 'Name', 'switch1'));
I have a struct/model
type switches struct {
    ID         string `gorm:"primary_key; unique; json:id"`
    Generation uint32 `gorm:"type:integer; column:generation;" json:"generation"`
    // Additional gorm type jsonb set, if not present causes field to be bytea
    SwitchAttrs []byte `sql:"type:jsonb; not null; column:Switch_attrs;" gorm:"type:jsonb; json:switchAttrs"`
}
I can query in postgres
SELECT * FROM switches WHERE switch_attrs->'IDs' ? '2'
 id | generation | data
----+------------+----------------------
 33 |         60 | {"IDs": ["33","2"] }
How do I construct a query on the jsonb column for a particular key (or keys)? I was not able to find any documentation on querying with model objects that explains how to use the "in" operator. I understand it's possible with a raw query, but I wanted to see how it can be done using the model object. I am trying queries like the ones below, but they fail :(
db = db.Where("Switch_attrs->'IDs' ?", "2").Find(&switches)
OR
db = db.Where("Switch_attrs->'IDs' ?", []string{"2"}).Find(&switches)
OR
db = db.Where("Switch_attrs->'IDs' IN ?", []string{"2"}).Find(&switches)
OR
db = db.Where("Switch_attrs->>'IDs' ?", "2").Find(&switches) . Note that i am querying as switch_attrs->>'IDs' . i expect the o/p as text and hence passing the value as "2".
For the last query I keep getting this error:
"Severity": "ERROR", "Code": "42601", "Message": "syntax error at or near "$1""
Redefine your model as below:
type switches struct {
    ID         string `gorm:"primary_key; unique; json:id"`
    Generation uint32 `gorm:"type:integer; column:generation;" json:"generation"`
    // Additional gorm type jsonb set, if not present causes field to be bytea
    SwitchAttrs SwitchAttr `sql:"type:jsonb; not null; column:Switch_attrs;" gorm:"type:jsonb; json:switchAttrs"`
}
Add this for SwitchAttr (the nested field must be exported so encoding/json can marshal and unmarshal it):
type SwitchAttr struct {
    Data struct {
        ID         *int     `json:"id"`
        Generation *string  `json:"generation"`
        IDs        []string `json:"IDs"`
    } `json:"data"`
}
// Value implements driver.Valuer so the struct is stored in the JSONB column.
func (a SwitchAttr) Value() (driver.Value, error) {
    return json.Marshal(a)
}
// Scan implements sql.Scanner and decodes the JSON-encoded value into the struct fields.
func (a *SwitchAttr) Scan(value interface{}) error {
    b, ok := value.([]byte)
    if !ok {
        return errors.New("type assertion to []byte failed")
    }
    return json.Unmarshal(b, a)
}
For the query, use the jsonb containment operator (a raw string keeps the embedded double quotes intact):
db = db.Where(`switch_attrs @> '{"data": {"IDs": <pass_your_array_of_string>}}'::jsonb`).Find(&switches)

"doesn't have relation" from go-pg

I am trying to pull data for CountyEntity, which is related to CityEntity:
type CityEntity struct {
    tableName struct{} `pg:"city,alias:ci,discard_unknown_columns"`
    Id        string   `pg:"id"`
    Name      string   `pg:"name"`
}
type CountyEntity struct {
    tableName struct{} `pg:"county,alias:co,discard_unknown_columns"`
    Id                   int64       `pg:"id"`
    Name                 string      `pg:"name"`
    DefaultTargetXDockId int64       `pg:"defaulttargetxdockid"`
    MapsPreference       string      `pg:"mapspreference"`
    EmptyDistrictAllowed bool        `pg:"empty_district_allowed"`
    CityId               int64       `pg:"cityid"`
    City                 *CityEntity `pg:"fk:cityid"`
}
My query is:
db, _ := repository.pgRepository.CreateDBConnection(.....)
var counties []model.CountyEntity
err := db.Model(&counties).
    Join("inner join test.city ci on ci.id = co.cityid").
    Relation("City").
    Where("co.cityid = ?", cityId).
    Select()
return counties, err
It throws:
model=CountyEntity does not have relation="City"
But the relation between the city and county tables actually exists in the database.
Db Relation Image
I have tried different approaches but couldn't solve it. Does anybody have an idea of the root cause and possible solutions?
Pretty old and I'm sure you've figured it out by now, but for anyone else who stumbles across this, your model should look like this:
type CountyEntity struct {
    tableName struct{} `pg:"county,alias:co,discard_unknown_columns"`
    Id                   int64       `pg:"id"`
    Name                 string      `pg:"name"`
    DefaultTargetXDockId int64       `pg:"defaulttargetxdockid"`
    MapsPreference       string      `pg:"mapspreference"`
    EmptyDistrictAllowed bool        `pg:"empty_district_allowed"`
    CityId               int64       `pg:"cityid"`
    City                 *CityEntity `pg:"rel:has-one,fk:cityid"`
}
Notice the "rel:has-one" added to the city column. You can find more information about all of these here: https://pg.uptrace.dev/models/

How to convert sqlx query results to an array of structs?

I am trying to query all the rows from a Postgres table (no WHERE condition) and map them to a slice of structs with the help of sqlx's db.Query, passing the destination through the variadic args ...interface{}.
But the code pasted below never works. Instead of iterating and scanning the results one by one, is it possible to get the following code to work?
Inputs are much appreciated. Thank you.
type CustomData struct {
    ID           string `db:"id" json:",omitempty"`
    Name         string `db:"name" json:",omitempty"`
    Description  string `db:"description" json:",omitempty"`
    SourceID     string `db:"sourceid" json:",omitempty"`
    StatusID     string `db:"statusid" json:",omitempty"`
    StatusReason string `db:"statusreason" json:",omitempty"`
    CreateTime   string `db:"createtime" json:",omitempty"`
    UpdateTime   string `db:"updatetime" json:",omitempty"`
}
var myData []CustomData
*sqlx.DB.Query("SELECT id as ID, name as Name, description as Description, sourceid as SourceID, statusid as StatusID, statusreason as StatusReason, createtime as CreateTime, updatetime as UpdateTime FROM myschema.my_table", &myData)
// tried with the following statement but it didn't work either
// *sqlx.DB.Query("SELECT * FROM myschema.my_table", &myData)
for _, data := range myData {
    fmt.Println("--", data)
}
Expected results:
--- CustomData{1,x,x,x,x}
--- CustomData{2,x,x,x,x}
Actual:
Nothing..
You don't need to rename the fields in the query, since you're defining the actual DB field names in the struct tags.
If you want to scan directly into a slice of CustomData and you are using sqlx, you should use the sqlx-specific Select method rather than the generic SQL Query. Here is a slightly modified example from the illustrated guide to sqlx (https://jmoiron.github.io/sqlx/#getAndSelect):
pp := []Place{}
err = db.Select(&pp, "SELECT * FROM place")
So in your case:
var myData []CustomData
err = db.Select(&myData, "SELECT * FROM myschema.my_table")
If you do want to iterate the rows yourself (e.g. via sqlx's Queryx), use StructScan to scan each row into the struct:
for rows.Next() {
    s := CustomData{}
    if err := rows.StructScan(&s); err != nil {
        return err
    }
    fmt.Println(s)
}
And you can always use an ORM library such as GORM if you like a code-first approach, or sqlboiler if you prefer a DB-first approach.
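For completeness, a minimal sketch of the Select approach inside a function (it assumes a *sqlx.DB obtained elsewhere, e.g. via sqlx.Connect, and the CustomData struct from the question):
func loadAll(db *sqlx.DB) ([]CustomData, error) {
    var myData []CustomData
    // Select runs the query and scans every returned row into the slice using the db struct tags.
    if err := db.Select(&myData, "SELECT * FROM myschema.my_table"); err != nil {
        return nil, err
    }
    return myData, nil
}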

Golang: sqlx StructScan mapping db column to struct

I have my model structs like the below:
type Detail struct {
    Product
    Stocks
}
type Product struct {
    Name string `db:"name"`
    Id   int    `db:"id"`
}
type Stocks struct {
    Name  string  `db:"name"`
    Price float64 `db:"price"`
    Type  string  `db:"type"`
}
I have a query to join the above tables like the below:
query, args, err := sqlx.In("select p.name, s.price from Product p, Stocks s where p.name = s.name and type IN (?)", typecodes)
query = s.Cmd.Db.Rebind(query)
var rows *sqlx.Rows
rows, err = s.Cmd.Db.Queryx(query, args...)
for rows.Next() {
    var p model.Detail
    err = rows.StructScan(&p)
}
I would like to know: when I do rows.StructScan(&p), will the Product struct's Name field be populated, or will there be ambiguity because Stocks also has a Name field?
Currently I am not getting any result for the above, but when I comment out the Name field in the Stocks struct, I do get the data.
Let me know what I am missing here.
For ambiguous fields, you're best annotating them with a prefix of their struct name, e.g. product_name and stock_name, and then aliasing them appropriately in your SQL statement.
I.e.
type Detail struct {
    Product
    Stocks
}
type Product struct {
    Name string `db:"product_name"`
    Id   int    `db:"id"`
}
type Stocks struct {
    Name  string  `db:"stock_name"`
    Price float64 `db:"price"`
    Type  string  `db:"type"`
}
And in your SQL:
SELECT p.name AS product_name, s.name AS stock_name, ... FROM Product p, Stocks s WHERE ...
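Putting it together, the query code would then select with those aliases so StructScan can tell the two Name fields apart. This is a sketch based on the snippets above; the IN binding and the s.Cmd.Db handle are as in the question:
query, args, err := sqlx.In(
    `SELECT p.name AS product_name, s.name AS stock_name, s.price, s.type
     FROM Product p, Stocks s
     WHERE p.name = s.name AND s.type IN (?)`, typecodes)
if err != nil {
    return err
}
query = s.Cmd.Db.Rebind(query)
rows, err := s.Cmd.Db.Queryx(query, args...)
if err != nil {
    return err
}
defer rows.Close()
for rows.Next() {
    var d model.Detail
    if err := rows.StructScan(&d); err != nil {
        return err
    }
    // d.Product.Name and d.Stocks.Name are now populated independently.
}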
