gorp PreUpdate method updates unintended columns - Go

See the code below:
type User struct {
    Id         int64  `db:"id" json:"id"`
    Name       string `db:"name" json:"name"`
    DateCreate int64  `db:"date_create"`
    DateUpdate int64  `db:"date_update"`
}

func (u *User) PreInsert(s gorp.SqlExecutor) error {
    u.DateCreate = time.Now().UnixNano()
    u.DateUpdate = u.DateCreate
    return nil
}

func (u *User) PreUpdate(s gorp.SqlExecutor) error {
    u.DateUpdate = time.Now().UnixNano()
    return nil
}
I executed an INSERT:
user := model.User{
    Name: "John",
}
err := dbMap.Insert(&user)
Result of the INSERT (no problem):
1,John,1444918337049394761,1444918337049394761
Then I executed an UPDATE:
user := model.User{
    Id:   1,
    Name: "John",
}
_, err := dbMap.Update(&user)
Result of the UPDATE:
1,John,0,1444918337049394900
The DateCreate column was overwritten.
Of course, the values I expected are:
1,John,1444918337049394761,1444918337049394900

That's because when initializing your user struct for the update, you did not explicitly set user.DateCreate, which effectively sets it to 0.
gorp cannot guess which fields you mean to update and which you don't, so it updates them all.
To do what you want, you have to choose between these trade-offs:
Get the struct from the ORM first, so the field is already set to the right value. That costs one additional SELECT query (see the sketch after this list).
Execute a custom UPDATE query, losing the convenience of the ORM.
You could also have another struct type without this field, but that seems messy; I do not recommend it.
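A minimal sketch of the first option, assuming the dbMap and model.User from the question (the id value 1 is just an example):

obj, err := dbMap.Get(model.User{}, 1) // extra SELECT; obj is nil if no row matches
if err != nil {
    // handle the error
}
user := obj.(*model.User)   // gorp returns an interface{} wrapping *model.User
user.Name = "John"          // change only the fields you actually want to change
_, err = dbMap.Update(user) // DateCreate keeps its stored value; PreUpdate still refreshes DateUpdate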

Related

How to handle INSERT ON CONFLICT in a has-many association in GORM Create, ERROR (SQLSTATE 23505)

I have two models as follows:
type OHLCV struct {
    gorm.Model
    Interval         string        `gorm:"uniqueIndex:idx_ohlcv"`
    Pair             string        `gorm:"uniqueIndex:idx_ohlcv"`
    OpenTime         time.Time     `gorm:"uniqueIndex:idx_ohlcv"`
    CloseTime        time.Time     `gorm:"uniqueIndex:idx_ohlcv"`
    Open             float64       `json:"open"`
    High             float64       `json:"high"`
    Low              float64       `json:"low"`
    Close            float64       `json:"close"`
    Volume           float64       `json:"volume"`
    QuoteAssetVolume float64       `json:"quoteAssetVolume"`
    NumberOfTrades   float64       `json:"numberOfTrades"`
    Calculations     []Calculation `gorm:"foreignKey:OhlcvRefer"`
}
and
type Calculation struct {
    gorm.Model
    OhlcvRefer uint   `gorm:"uniqueIndex:idx_calculation"`
    Key        string `gorm:"uniqueIndex:idx_calculation"`
    Config     string `gorm:"uniqueIndex:idx_calculation"`
    Value      float64
}
As you can see, both tables have unique indexes to prevent inserting duplicate data, and the first table's foreign key is part of the second table's unique index. The problem is: how can I handle ON CONFLICT DO NOTHING behavior for both tables with a single GORM Create statement?
Before adding the Calculation association I was able to handle conflicts with:
err = db.Clauses(clause.OnConflict{
    DoNothing: true,
    Columns:   []clause.Column{{Name: "interval"}, {Name: "pair"}, {Name: "open_time"}, {Name: "close_time"}},
}).Create(ohlcvs).Error
But now I get the following error:
ERROR: duplicate key value violates unique constraint "idx_calculation" (SQLSTATE 23505)
What I need is to DO NOTHING for the Calculation conflicts as well.
To achieve what you need, it should be enough to rely on the unique index constraints on the two structs. Let's see how you can implement it.
package main

import (
    "gorm.io/driver/postgres"
    "gorm.io/gorm"
)

type User struct {
    Id    int
    Name  string `gorm:"uniqueIndex:idx_name"`
    Posts []Post
}

type Post struct {
    Id     int
    Title  string `gorm:"uniqueIndex:idx_title"`
    UserId int
}

func main() {
    dsn := "host=localhost user=postgres password=postgres dbname=postgres port=5432 sslmode=disable"
    db, err := gorm.Open(postgres.Open(dsn), &gorm.Config{})
    if err != nil {
        panic(err)
    }
    db.AutoMigrate(&Post{})
    db.AutoMigrate(&User{})

    db.Create(&User{
        Name: "john doe",
        Posts: []Post{
            {Title: "first"},
            {Title: "second"}, // to generate an error, change this to "first"
        },
    })
}
This way, if you insert duplicate values, the database itself will block you. This holds both for the users table and for the posts table. IMO it's a very clean approach and you can be as flexible as you wish.
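For instance, a second insert that reuses a unique value is rejected by the database, and the error surfaces on the returned *gorm.DB. A small follow-up sketch (not part of the original example; fmt would need to be imported):

// "john doe" already exists, so idx_name is violated and nothing is inserted
// (GORM wraps Create in a transaction by default).
if err := db.Create(&User{
    Name:  "john doe",
    Posts: []Post{{Title: "third"}},
}).Error; err != nil {
    fmt.Println("insert rejected by unique index:", err)
}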
Let me know if this solves your issue or there is something else!

Doing an update to a database with partial data in Go

So I have the following function in an API:
type Product struct {
    Id          int32             `json:"id" db:"id"`
    CompanyId   int32             `json:"companyID" db:"company_id"`
    Name        *string           `json:"name" db:"name"`
    Description *string           `json:"description" db:"description"`
    ExternalId  *string           `json:"externalID" db:"external_id"`
    Tags        types.StringArray `json:"tags" db:"tags"`
    Enabled     bool              `json:"enabled" db:"enabled"`
    CreatedAt   time.Time         `json:"createdAt"`
    UpdatedAt   time.Time         `json:"updatedAt"`
}
func updateProduct(event UpdateProductEvent) (Response, error) {
    // TODO - determine how we want to log. fmt just does prints. The log package?
    fmt.Println("This is updateProduct")
    var id int64
    err := db.QueryRow(`
        UPDATE
            v2.products
        SET
            company_id = $2,
            name = $3,
            description = $4,
            external_id = $5,
            enabled = $6
        WHERE id = $1 RETURNING id`,
        Product.Id,
        Product.CompanyId,
        Product.Name,
        Product.Description,
        Product.ExternalId,
        Product.Enabled,
    ).Scan(&id)
    if err != nil {
        fmt.Println("Error updating product: " + err.Error())
        fmt.Println("Product: ")
        return Response{}, errors.New("Error inserting product")
    }
    return Response{
        Id: 8,
    }, nil
}
So a Product gets passed in, and then I need to update the entity in the database. The above works fine when a full product is passed in, but for this API that might not be the case; it could be only a partial one.
So I might only have two fields in the JSON passed to the API.
What I'm thinking is passing in an interface{} instead of the Product, so I don't get zero values for fields that weren't set, then looping through the interface{} to get the fields. For each field I would check whether it exists on the Product struct; if it does, I'd take the db tag from the field (which gives the appropriate column in the database), build the query string for the QueryRow call, and append all the values to an []interface{} to pass as the second param. Something like:
func updateProduct(event interface{}) (Response, error) {
    var values []interface{}
    // Loop through the event and get the field and value
    for field, i := range event.fields {
        if Product.Has(field.name) {
            // add db tag of Product field to query
            values = append(values, field.value)
        }
    }
    err := db.QueryRow(`
        UPDATE
            v2.products
        SET
            // fields from the loop above as needed
        WHERE id = $1 RETURNING id`,
        values...,
    ).Scan(&id)
    ...
}
I think I can get this, or something like it, to work, but it seems like a very roundabout way to do something very common; I just can't think of any other way to do it. So is there a more common/simpler approach that I'm missing? I can't find anything by Googling either.
EDIT: In my haste and attempt to simplify I left out an important part. This is for an AWS Lambda, so the object passed into updateProduct depends on what comes from the API Gateway. If I make it the UpdateProductEvent, it will parse the JSON into that object, in this case the Product. If I just make it an interface{}, it will basically just produce a map.
So if my request is:
{
    "product": {
        "name": "Test17",
        "externalID": "test17",
        "description": "",
        "companyID": 309
    }
}
With the first method (an explicit object type), it will create a Product, and fields not in the JSON will have the zero value for their type.
If I make it an interface{}, it will be a map with only those properties, and not the zero values for the rest of the fields in Product.
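A minimal sketch of the map-based approach described above, assuming the request body has already been decoded into a map[string]interface{}; the jsonToColumn whitelist and updatePartial function below are hypothetical helpers, not part of the original code:

package products

import (
    "database/sql"
    "fmt"
    "strings"
)

// jsonToColumn maps incoming JSON keys to known column names so that only
// whitelisted fields ever reach the query (hypothetical helper).
var jsonToColumn = map[string]string{
    "companyID":   "company_id",
    "name":        "name",
    "description": "description",
    "externalID":  "external_id",
    "enabled":     "enabled",
}

// updatePartial builds the SET clause only from the keys that were actually
// present in the request, so absent fields are left untouched.
func updatePartial(db *sql.DB, id int64, fields map[string]interface{}) (int64, error) {
    sets := make([]string, 0, len(fields))
    args := []interface{}{id} // $1 is reserved for the id
    for key, value := range fields {
        column, ok := jsonToColumn[key]
        if !ok {
            continue // ignore unknown keys instead of interpolating them
        }
        args = append(args, value)
        sets = append(sets, fmt.Sprintf("%s = $%d", column, len(args)))
    }
    if len(sets) == 0 {
        return id, nil // nothing to update
    }
    query := fmt.Sprintf("UPDATE v2.products SET %s WHERE id = $1 RETURNING id",
        strings.Join(sets, ", "))
    var returned int64
    err := db.QueryRow(query, args...).Scan(&returned)
    return returned, err
}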

Golang Pointer Understanding

I am learning Go and I need to understand something. I am getting a few errors. I have created a Product struct and attached a func to it. I also have a product list as a slice. I am following an example and was just trying to add different endpoints to it.
I have added my questions as comments in the code. Please explain. I need to return a single JSON object as a response to the user. Please guide me.
package data

import (
    "encoding/json"
    "fmt"
    "io"
    "time"
)

type Product struct {
    ID          int     `json:"id"`
    Name        string  `json:"name"`
    Description string  `json:"description"`
    Price       float32 `json:"price"`
    SKU         string  `json:"sku"`
    CreatedOn   string  `json:"-"`
    UpdatedOn   string  `json:"-"`
    DeletedOn   string  `json:"-"`
}

type Products []*Product

func (p *Products) ToJSON(w io.Writer) error {
    e := json.NewEncoder(w)
    return e.Encode(p)
}

func (p *Product) FromJSON(r io.Reader) error {
    d := json.NewDecoder(r)
    return d.Decode(p)
}

var ErrProductNotFound = fmt.Errorf("Product not found")

// this returns *Product & err. When I use it in the GetProduct handler func it gives an error
func GetProduct(id int) (*Product, error) {
    for _, p := range productList {
        if p.ID == id {
            fmt.Println(p)
            return p, nil
        }
    }
    return nil, ErrProductNotFound
}
// Why is the teacher doing it like this: []*Product{&Product{}, &Product{}}? What is the reason? Please explain.
var productList = []*Product{
    &Product{ // this gives a warning: redundant type from array, slice, or map composite literal. Need to understand why.
        ID:          1,
        Name:        "Latte",
        Description: "chai",
        Price:       2.45,
        SKU:         "abc123",
        CreatedOn:   time.Now().UTC().String(),
        UpdatedOn:   time.Now().UTC().String(),
    },
    &Product{
        ID:          2,
        Name:        "Tea",
        Description: "chai",
        Price:       1.45,
        SKU:         "abc1234",
        CreatedOn:   time.Now().UTC().String(),
        UpdatedOn:   time.Now().UTC().String(),
    },
}
package handlers

func (p *Product) GetProduct(rw http.ResponseWriter, r *http.Request) {
    vars := mux.Vars(r)
    id, _ := strconv.Atoi(vars["id"])
    p, errr := data.GetProduct(id) // cannot use data.GetProduct(id) (value of type *data.Product) as *Product value in assignment
    errr = p.ToJSON(rw)            // p.ToJSON undefined (type *Product has no field or method ToJSON)
    if errr != nil {
        http.Error(rw, "could not locate the product", http.StatusBadGateway)
    }
}
cannot use data.GetProduct(id) (value of type *data.Product) as *Product value in assignment
p.ToJSON undefined (type *Product has no field or method ToJSON)
The problem here is that inside the GetProduct handler the variable p already has a type (*handlers.Product) that is different from the one returned by the data.GetProduct function (*data.Product). What you can do is use a different name for the variable that will store the result of data.GetProduct.
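A minimal sketch of that fix, assuming the imports the handler already relies on (mux, strconv, net/http, the data package) plus encoding/json. Since ToJSON in the question is defined on the Products slice type rather than on *Product, the sketch encodes the single product with encoding/json directly:

func (p *Product) GetProduct(rw http.ResponseWriter, r *http.Request) {
    vars := mux.Vars(r)
    id, _ := strconv.Atoi(vars["id"])

    prod, err := data.GetProduct(id) // prod is a *data.Product, no clash with p
    if err != nil {
        http.Error(rw, "could not locate the product", http.StatusBadGateway)
        return
    }
    if err := json.NewEncoder(rw).Encode(prod); err != nil {
        http.Error(rw, "could not encode the product", http.StatusInternalServerError)
    }
}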
Why is the teacher doing it like this: []*Product{&Product{}, &Product{}}? What is the reason? Please explain.
In general, because that's one of the available methods for initializing a slice of structs, as per the language spec. Why did the teacher use this method specifically? Unless the teacher confided the reason to someone, no one would know; I certainly don't.
this gives a warning: redundant type from array, slice, or map composite literal. Need to understand why.
Because it is indeed redundant: as per the language spec, in a composite literal you can elide the types of the elements and keys.
For example a non-redundant version of the following composite literal:
[]*Product{&Product{}, &Product{}}
would look like this:
[]*Product{{}, {}}
and the result of these two expressions would be the same.

Creating structs programmatically at runtime - possible?

Is it possible in Go to create a struct type programmatically (i.e. not in the compiled source code)?
We have a particular use case where a type will be created via user-defined metadata (so the schema/types are not known in advance)
and will vary for every customer. We would then need to auto-generate REST services for those and persist them in a NoSQL backend.
We would also need to define different dynamic validators per field (e.g. mandatory, regex, max/min size, max/min value, a reference to another type instance, etc.)
I was wondering if something like this is possible in Go?
Edit 1:
For example
From the frontend, in JSON.
For customer 1:
{
    "id": 1,
    "patientid": 1,
    "name": "test1",
    "height": "160",
    "weight": "70",
    "temp": "98"
}
For customer 2:
{
    "id": 2,
    "patientid": 1,
    "name": "test1",
    "height": "160",
    "weight": "70"
}
For customer 3, maybe different new fields will be added.
Backend
// One customer needs to have these fields
type Vitalsigns struct {
    ID        int64  `datastore:"-"`
    PatientID int64  `json:"patientid,omitempty"`
    Name      string `json:"name,omitempty"`
    Height    string `json:"height,omitempty"`
    Weight    string `json:"weight,omitempty"`
    Temp      string `json:"temp,omitempty"`
}

// Another needs to have these fields
type Vitalsigns struct {
    ID        int64  `datastore:"-"`
    PatientID int64  `json:"patientid,omitempty"`
    Name      string `json:"name,omitempty"`
    Height    string `json:"height,omitempty"`
    Weight    string `json:"weight,omitempty"`
}
// CreateVitalsignsHandler creates vitals for a patient
func CreateVitalsignsHandler(w http.ResponseWriter, r *http.Request, ps httprouter.Params) {
    // Creating the Vitalsigns
    kinVitalsigns := &Vitalsigns{}
    ctx := appengine.NewContext(r)
    if err := json.NewDecoder(r.Body).Decode(kinVitalsigns); err != nil {
        RespondErr(w, r, http.StatusInternalServerError, err.Error())
        return
    }
    // Getting the namespace
    namespace := ps.ByName("namespace")
    ctx, err := appengine.Namespace(ctx, namespace)
    if err != nil {
        log.Infof(ctx, "Namespace error from CreateVitalsignsHandler")
        RespondErr(w, r, http.StatusInternalServerError, err.Error())
        return
    }
    // Getting the patientID
    pID, err := strconv.Atoi(ps.ByName("id"))
    if err != nil {
        log.Infof(ctx, "String to int64 conversion error from CreateVitalsignsHandler")
        RespondErr(w, r, http.StatusInternalServerError, err.Error())
        return
    }
    patientID := int64(pID)
    kinVitalsigns.PatientID = patientID
    // Generating the key
    vitalsignsKey := datastore.NewIncompleteKey(ctx, "Vitalsigns", nil)
    // Inserting the data into the datastore
    vk, err := datastore.Put(ctx, vitalsignsKey, kinVitalsigns)
    if err != nil {
        log.Infof(ctx, "Entity creation failed in CreateVitalsignsHandler")
        RespondErr(w, r, http.StatusInternalServerError, err.Error())
        return
    }
    kinVitalsigns.ID = vk.IntID()
    message := "Vitalsigns created successfully!!"
    Respond(w, r, http.StatusOK, SuccessResponse{kinVitalsigns.ID, 0, "", message})
    return
}
Edit: Your edit reveals you want to handle dynamic objects to put into / retrieve from Google Datastore. For this it is completely unnecessary to create struct types at runtime; you may just use a dynamic map as presented in this answer: How can I have dynamic properties in go on the google app engine datastore.
Original answer follows.
Note that if the types are known at compile time, the best / most efficient approach is to create the types and compile them, so everything will be "static". You may create the types manually, or you may use go generate to automate the process.
Also note that you may not necessarily need struct types to model dynamic objects; many times maps are sufficient (see the sketch below).
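A minimal sketch of the map approach, not part of the original answer, using the question's vital-signs JSON as an example:

package main

import (
    "encoding/json"
    "fmt"
)

func main() {
    // Per-customer payloads can carry different field sets.
    payload := []byte(`{"id":1,"patientid":1,"name":"test1","height":"160"}`)

    var vitals map[string]interface{}
    if err := json.Unmarshal(payload, &vitals); err != nil {
        panic(err)
    }
    // Fields simply exist or not; no struct definition is needed.
    fmt.Println(vitals["name"], vitals["temp"]) // "temp" is absent here -> <nil>
}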
If types are not known at compile time, and struct types are a must, read on.
Yes, it's possible to create "dynamic" struct types at runtime using Go's reflection, specifically with the reflect.StructOf() function.
Let's see a simple example, creating a struct type at runtime that has a Name string and an Age int field:
t := reflect.StructOf([]reflect.StructField{
    {
        Name: "Name",
        Type: reflect.TypeOf(""), // string
    },
    {
        Name: "Age",
        Type: reflect.TypeOf(0), // int
    },
})
fmt.Println(t)

v := reflect.New(t)
fmt.Printf("%+v\n", v)

v.Elem().FieldByName("Name").Set(reflect.ValueOf("Bob"))
v.Elem().FieldByName("Age").Set(reflect.ValueOf(12))
fmt.Printf("%+v\n", v)
This outputs (try it on the Go Playground):
struct { Name string; Age int }
&{Name: Age:0}
&{Name:Bob Age:12}
If you want to define validation rules, you may use a 3rd party lib for that, for example github.com/go-validator/validator. This package uses struct tags to specify validation rules, struct tags which you may also specify using reflection.
For example, if you want to specify that Name must be at least 3 characters and at most 40, may only contain letters of the English alphabet, and that the valid range for Age is 6..100 (both inclusive), this is what it would look like:
t := reflect.StructOf([]reflect.StructField{
    {
        Name: "Name",
        Type: reflect.TypeOf(""), // string
        Tag:  reflect.StructTag(`validate:"min=3,max=40,regexp=^[a-zA-Z]*$"`),
    },
    {
        Name: "Age",
        Type: reflect.TypeOf(0), // int
        Tag:  reflect.StructTag(`validate:"min=6,max=100"`),
    },
})
Printing this type would output (wrapped by me) (try it on the Go Playground):
struct { Name string "validate:\"min=3,max=40,regexp=^[a-zA-Z]*$\"";
Age int "validate:\"min=6,max=100\"" }
Once you create an instance of this struct, you can validate it using the validator.Validate() function, e.g.:
v := reflect.New(t)
v.Elem().FieldByName("Name").Set(reflect.ValueOf("Bob"))
v.Elem().FieldByName("Age").Set(reflect.ValueOf(12))

if errs := validator.Validate(v.Elem().Interface()); errs != nil {
    // values not valid, deal with errors here
}
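Since the question mentions auto-generating REST services, it may be worth noting that values of these runtime-built struct types work with encoding/json like any other value. A small sketch, assuming the t and v built just above (field names follow the Go field names unless you also add json tags via reflect.StructField.Tag):

data, err := json.Marshal(v.Interface())
if err != nil {
    // handle the error
}
fmt.Println(string(data)) // {"Name":"Bob","Age":12}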

Mapping struct to mysql table, and binding row to struct

This is my first script using go-sql-driver.
My MySQL table (PRODUCT) looks like:
id int
name varchar(255)
IsMatch tinyint(1)
created datetime
I want to simply load a row from a table, and bind it to a struct.
I have this so far:
package main

import (
    "database/sql"
    "fmt"

    _ "github.com/go-sql-driver/mysql"
)

type Product struct {
    Id      int64
    Name    string
    IsMatch ??????????
    Created ?????
}

func main() {
    fmt.Printf("hello, world!\n")
    db, err := sql.Open("mysql", "root:#/product_development")
    defer db.Close()
    err = db.Ping()
    if err != nil {
        panic(err.Error()) // proper error handling instead of panic in your app
    }
    rows, err := db.Query("SELECT * FROM products where id=1")
    if err != nil {
        panic(err.Error()) // proper error handling instead of panic in your app
    }
}
Now I need to know:
1. What datatype in Go do I use for tinyint and datetime?
2. How do I map the rows to a Product struct?
What datatype in Go do I use for tinyint and datetime?
For a hint as to the types that the database/sql package will be using, have a look at the documentation for database/sql.Scanner, which lists the Go types used within database/sql itself:
int64
float64
bool
[]byte
string
time.Time
nil - for NULL values
This would lead you to try int64 for IsMatch and time.Time for Created. I believe in reality you can use pretty much any sized int (maybe even bool, you'd have to check the source) for IsMatch because it can be stored "without loss of precision." The documentation for go-mysql-driver explains that you will need to add parseTime=true to your DSN in order for it to parse into a time.Time automatically or use NullTime.
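Putting that together, a hedged sketch of how the struct from the question could be typed (assuming parseTime=true is added to the DSN):

type Product struct {
    Id      int64
    Name    string
    IsMatch int64     // tinyint(1); a smaller int (or possibly bool) may also scan fine
    Created time.Time // datetime; requires parseTime=true in the DSN
}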
How do I map the rows to a Product struct?
It should be pretty straightforward, using Rows.Scan, like:
var products []*Product
for rows.Next() {
    p := new(Product)
    if err := rows.Scan(&p.Id, &p.Name, &p.IsMatch, &p.Created); err != nil { ... }
    products = append(products, p)
}
if err := rows.Err(); err != nil { ... }
This scans the columns into the fields of a struct and accumulates them into a slice. (Don't forget to Close the rows!)
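A fuller sketch under the same assumptions (a *sql.DB opened with parseTime=true in the DSN and the Product type above; loadProduct is just an illustrative helper), showing the deferred Close mentioned above:

func loadProduct(db *sql.DB, id int64) (*Product, error) {
    rows, err := db.Query("SELECT id, name, IsMatch, created FROM products WHERE id = ?", id)
    if err != nil {
        return nil, err
    }
    defer rows.Close() // always release the rows, even on early return

    if !rows.Next() {
        if err := rows.Err(); err != nil {
            return nil, err
        }
        return nil, sql.ErrNoRows
    }
    p := new(Product)
    if err := rows.Scan(&p.Id, &p.Name, &p.IsMatch, &p.Created); err != nil {
        return nil, err
    }
    return p, nil
}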
How do I map the rows to a Product struct?
You can use reflect to bind table rows in the DB to a struct and automatically match the values, without a long hard-coded SQL string that is easy to get wrong.
Here is a light demo: sqlmapper
