Dynamically generate struct fields from SQL QueryRow result - go

I want to retrieve all fields of the row and then render them to HTML. I know how to do that, and here is the code for a row with 3 fields:
type View struct {
	Id                 int
	Name_and_requisits string
	Reg_Date           string
}

func getViewById(id int) (*View, error) {
	var vie View
	row := db.QueryRow("select id, name_and_requisits, reg_date from book where id = ?;", id)
	err := row.Scan(&vie.Id, &vie.Name_and_requisits, &vie.Reg_Date)
	if err != nil {
		return nil, err
	}
	return &vie, nil
}
But in my table one row includes about 20 columns, and I need all of them with their names, but I don't want to create a nested hardcoded struct. My idea is to generate the struct fields dynamically from the column names and then use row.Scan on it. Any ideas? Maybe a map is better for this situation?
Thanks!

generate struct fields dynamically
https://golang.org/pkg/reflect/#StructOf
But please: Don't do it.
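Since the question also asks whether a map would work: instead of generating a struct, you can scan any row into a map keyed by column name, using the column list the driver reports. A minimal sketch with only database/sql; scanRowToMap is a hypothetical helper name, not part of any library:

import "database/sql"

// scanRowToMap reads the current row of rows into a map keyed by column name.
// rows.Columns() tells us how many targets Scan needs; each target is a
// *interface{} so the driver can store whatever type it likes.
func scanRowToMap(rows *sql.Rows) (map[string]interface{}, error) {
	cols, err := rows.Columns()
	if err != nil {
		return nil, err
	}
	values := make([]interface{}, len(cols))
	ptrs := make([]interface{}, len(cols))
	for i := range values {
		ptrs[i] = &values[i]
	}
	if err := rows.Scan(ptrs...); err != nil {
		return nil, err
	}
	m := make(map[string]interface{}, len(cols))
	for i, col := range cols {
		// Text columns usually arrive as []byte; convert them for rendering.
		if b, ok := values[i].([]byte); ok {
			m[col] = string(b)
		} else {
			m[col] = values[i]
		}
	}
	return m, nil
}

Call it from a rows.Next() loop after a plain db.Query, then hand the map to your HTML template.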

Related

How can I get all rows in gorm as a reflect type slice

I want to write a function that returns all rows in a table using a reflective type.
func (rt *tableRouter) select_table_DB(ctx context.Context, vars url.Values, tableType reflect.Type, name string) (reflect.Value, error) {
	db, err := db.Open(rt.dbcfg)
	if err != nil {
		return reflect.Value{}, err
	}
	defer db.Close()
	rows := reflect.MakeSlice(reflect.SliceOf(tableType), 0, 0)
	db.WithContext(ctx).Table(name).Find(&rows)
	return rows, err
}
This only returns {}
You must provide a concrete object (one that contains a set of exported fields) to receive data from the database, so that gorm knows how the table columns map to object members. A reflect.Value is not such an object; try reflect.Value.Interface(), which returns an interface whose underlying data is a concrete object. By the way, gorm/gen is a nice tool that automatically generates CRUD code for your model.
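A sketch of how the function body could be adjusted along those lines: build a pointer to a concrete slice with reflect.New and hand its Interface() to Find (this assumes db here is a *gorm.DB):

// Allocate a *[]T where T is tableType, so gorm sees a concrete,
// addressable slice instead of a reflect.Value.
slicePtr := reflect.New(reflect.SliceOf(tableType))
err := db.WithContext(ctx).Table(name).Find(slicePtr.Interface()).Error
// slicePtr.Elem() is the populated []T, wrapped as a reflect.Value.
return slicePtr.Elem(), err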

How to write Scan() for custom type in PostgreSQL column in Go

So I have a table schema that looks like the following:
CREATE TABLE artists (
	id SERIAL PRIMARY KEY,
	foo_user_id TEXT NOT NULL,
	created_at TIMESTAMP DEFAULT CURRENT_TIMESTAMP NOT NULL,
	time_span TIME_SPAN NOT NULL,
	items artist[] NOT NULL
);
In addition, I also have a custom type created in PostgreSQL, defined as follows:
CREATE TYPE artist AS (
	artist_name TEXT,
	foo_artist_id TEXT,
	artist_image_url TEXT,
	artist_rank int
);
I am trying to query all rows that have the "foo_user_id" equal to what I pass into the function. Here is the sample code.
func GetHistoricalTopArtists(foo_user_id string) ([]TopArtists, error) {
	// connect to DB, etc.
	// create prepared statement
	stmtStr := `SELECT * FROM artists WHERE foo_user_id=$1`
	// error check...
	// iterate through all rows to get an array of []TopArtists
	defer rows.Close()
	for rows.Next() {
		topArtist := new(TopArtists)
		err := rows.Scan(&topArtist.Id, &topArtist.FooUserId, &topArtist.CreatedAt, &topArtist.TimeSpan, &topArtist.Artists)
		if err != nil {
			log.Fatalf("Something went wrong %v", err)
		}
		topArtists = append(topArtists, *topArtist)
	}
}
To represent this data in Go, I created the following structs:
// Represents a row
type TopArtists struct {
	Id        int64    `json:"id" db:"id"`
	FooUserId string   `json:"foo_user_id" db:"foo_user_id"`
	CreatedAt string   `json:"created_at" db:"created_at"`
	TimeSpan  string   `json:"time_span" db:"time_span"`
	Artists   []Artist `json:"items" db:"items"`
}

// Represents the artist column
type Artist struct {
	ArtistName     string `json:"artist_name"`
	ArtistId       string `json:"foo_artist_id"`
	ArtistImageURL string `json:"artist_image_url"`
	ArtistRank     int    `json:"artist_rank"`
}
When I call the function that does the query (the one described above), I get the following error:
Scan error on column index 4, name "items": unsupported Scan, storing driver.Value type []uint8 into type *[]database.Artist.
I have a Value() function, but I am unsure how to implement a Scan() function for the array of the custom struct I have made.
Here is my Value() function. I have read documentation and similar posts on scanning arrays of primitive types (strings, ints, etc.), but I could not apply that logic to custom PostgreSQL types.
func (a Artist) Value() (driver.Value, error) {
	s := fmt.Sprintf("(%s, %s, %s, %d)",
		a.ArtistName,
		a.ArtistId,
		a.ArtistImageURL,
		a.ArtistRank)
	return []byte(s), nil
}
#mkopriva - ...You need to declare a slice type, e.g. type ArtistSlice []Artist, use that as the field's type, and implement the Value/Scan methods on that.
The custom composite type Artist created in PostgreSQL has a strict textual representation, such as
{(david,38,url,1),(david2,2,"url 2",2)}
so you have to implement the Value/Scan methods with a custom marshal/unmarshal algorithm. For example:
type Artists []Artist

func (a *Artists) Scan(value interface{}) error {
	// The driver may deliver the composite array as []byte or string,
	// e.g. {"(david,38,url,1)","(david2,2,\"url 2\",2)"}
	var source string
	switch v := value.(type) {
	case []byte:
		source = string(v)
	case string:
		source = v
	default:
		return errors.New("incompatible type")
	}
	var res Artists
	// Split the array into its composite elements, then strip the
	// surrounding braces, parentheses, and quotes from each element.
	artists := strings.Split(source, "\",\"")
	for _, artist := range artists {
		for _, old := range []string{"\\\"", "\"", "{", "}", "(", ")"} {
			artist = strings.ReplaceAll(artist, old, "")
		}
		artistRawData := strings.Split(artist, ",")
		// artist_rank is the fourth field of the composite type.
		rank, err := strconv.Atoi(artistRawData[3])
		if err != nil {
			return fmt.Errorf("parse ArtistRank from raw data (%s): %v", artist, err)
		}
		res = append(res, Artist{
			ArtistName:     artistRawData[0],
			ArtistId:       artistRawData[1],
			ArtistImageURL: artistRawData[2],
			ArtistRank:     rank,
		})
	}
	*a = res
	return nil
}
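mkopriva's comment also calls for a Value method on the slice type. A minimal sketch that emits the same textual format the Scan above expects; this is an assumption about the format, and it does not escape commas or quotes embedded in the values (needs "database/sql/driver", "fmt", and "strings"):

func (a Artists) Value() (driver.Value, error) {
	parts := make([]string, 0, len(a))
	for _, ar := range a {
		// Quote each composite element so the Scan above can split
		// elements on "," between the quoted groups.
		parts = append(parts, fmt.Sprintf("\"(%s,%s,%s,%d)\"",
			ar.ArtistName, ar.ArtistId, ar.ArtistImageURL, ar.ArtistRank))
	}
	return []byte("{" + strings.Join(parts, ",") + "}"), nil
}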

Gorm query returning only a single row

We're trying to use Gorm with MySQL 8, with much frustration.
I have the following tables (simplified for brevity here):
type StoragePool struct {
	gorm.Model
	PoolId  string   `json:"id" gorm:"column:poolid;size:40;unique;not null"`
	Volumes []Volume `json:"volumes" gorm:"foreignkey:StorageId;association_foreignkey:PoolId"`
}

type Volume struct {
	gorm.Model
	StorageId string `json:"storageid" gorm:"column:storageid;size:40"`
}
Data insertions seem to work fine. Both tables get populated and no constraints are violated.
A query that expects a single record seems to work fine:
poolRecord := &StoragePool{}
if err := tx.Where("poolid = ?", pool.PoolId).First(&StoragePool{}).Scan(poolRecord).Error; err != nil {
	return err
}
The following query, however, only returns a single row. When I run the same query as raw SQL outside of Go, it returns all 31 records I expect.
var poolVolumes []Volume
if err := tx.Where("storageid = ?", pool.PoolId).Find(&Volume{}).Scan(&poolVolumes).Error; err != nil {
	return err
}
log.Debugf("found %d volumes belonging to %q [%s]", len(poolVolumes), pool.Name, pool.PoolId)
According to the docs, that second SQL statement should be the equivalent of "SELECT * FROM VOLUMES WHERE STORAGEID = 'poolid'". That is not the behavior I am getting.
Anyone have any ideas what I'm doing wrong here?
I rarely use an ORM when coding in Go, but going by the gorm docs, it seems you are using it the wrong way. Find(&Volume{}) fetches into that throwaway struct, and because the destination is a single struct rather than a slice, only one record is kept; the chained Scan then just copies that one row. Scan is meant for copying a result into another struct, like this:
type Result struct {
	Name string
	Age  int
}

var result Result
db.Table("users").Select("name, age").Where("name = ?", "Antonio").Scan(&result)
The correct way to get query results into a slice of structs should be:
var poolVolumes []Volume
if err := tx.Where("storageid = ?", pool.PoolId).Find(&poolVolumes).Error; err != nil {
	return err
}
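The single-record lookup at the top of the question can be simplified the same way; a sketch under the same assumptions (tx and pool come from the question's surrounding code):

// First both runs the query and fills poolRecord; no Scan needed.
poolRecord := &StoragePool{}
if err := tx.Where("poolid = ?", pool.PoolId).First(poolRecord).Error; err != nil {
	return err
}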

How to convert sqlx query results to an array of structs?

I am trying to query all the results from a Postgres table, without a where condition, and map them to an array of structs with the help of sqlx's db.Query, passing the args as ...interface{}.
But the code pasted below never works. Instead of iterating and scanning the results one by one, is it possible to get the following code to work?
Inputs are much appreciated. Thank you.
type CustomData struct {
	ID           string `db:"id" json:",omitempty"`
	Name         string `db:"name" json:",omitempty"`
	Description  string `db:"description" json:",omitempty"`
	SourceID     string `db:"sourceid" json:",omitempty"`
	StatusID     string `db:"statusid" json:",omitempty"`
	StatusReason string `db:"statusreason" json:",omitempty"`
	CreateTime   string `db:"createtime" json:",omitempty"`
	UpdateTime   string `db:"updatetime" json:",omitempty"`
}
var myData []CustomData
*sqlx.DB.Query("SELECT id as ID, name as Name, description as Description, sourceid as SourceID, statusid as StatusID, statusreason as StatusReason, createtime as CreateTime, updatetime as UpdateTime FROM myschema.my_table", &myData)
// tried with the following statement but it didn't work either
// *sqlx.DB.Query("SELECT * FROM myschema.my_table", &myData)
for _, data := range myData {
	fmt.Println("--", data)
}
Expected results:
--- CustomData{1,x,x,x,x}
--- CustomData{2,x,x,x,x}
Actual:
Nothing..
You don't need to rename the fields in the query, since you're defining the actual DB fields in the struct tags.
If you want to scan directly into a slice of CustomData with sqlx, you should use sqlx's Select method rather than the generic Query. Slightly modified relevant example from the Illustrated Guide to SQLX (https://jmoiron.github.io/sqlx/#getAndSelect):
pp := []Place{}
err = db.Select(&pp, "SELECT * FROM place")
So in your case:
var myData []CustomData
err = db.Select(&myData, "SELECT * FROM myschema.my_table")
Alternatively, if you iterate the rows yourself with sqlx's Queryx, each row can be scanned into the struct with StructScan:
for rows.Next() {
	s := CustomData{}
	if err := rows.StructScan(&s); err != nil {
		return err
	}
	fmt.Println(s)
}
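For completeness, a sketch of how rows could be obtained for that loop, assuming db is the same *sqlx.DB:

// Queryx returns *sqlx.Rows, whose StructScan maps columns by db tag.
rows, err := db.Queryx("SELECT * FROM myschema.my_table")
if err != nil {
	return err
}
defer rows.Close()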
You can always use an ORM library such as gorm if you like the code-first approach, or sqlboiler if you prefer the DB-first approach.

Mapping struct to mysql table, and binding row to struct

This is my first script using go-sql-driver.
My mysql table (PRODUCT) looks like:
id      int
name    varchar(255)
IsMatch tinyint(1)
created datetime
I want to simply load a row from a table, and bind it to a struct.
I have this so far:
package main

import (
	"database/sql"
	"fmt"

	_ "github.com/go-sql-driver/mysql"
)

type Product struct {
	Id      int64
	Name    string
	IsMatch ??????????
	Created ?????
}

func main() {
	fmt.Printf("hello, world!\n")
	db, err := sql.Open("mysql", "root:#/product_development")
	defer db.Close()
	err = db.Ping()
	if err != nil {
		panic(err.Error()) // proper error handling instead of panic in your app
	}
	rows, err := db.Query("SELECT * FROM products where id=1")
	if err != nil {
		panic(err.Error()) // proper error handling instead of panic in your app
	}
}
Now I need to know:
1. What datatype in Go do I use for tinyint and datetime?
2. How do I map the rows to a Product struct?
What datatype in Go do I use for tinyint and datetime?
For a hint as to the types that the database/sql package will be using, have a look at the documentation for database/sql.Scanner, which lists the Go types used within database/sql itself:
int64
float64
bool
[]byte
string
time.Time
nil - for NULL values
This would lead you to try int64 for IsMatch and time.Time for Created. In reality you can use pretty much any sized int (maybe even bool; you'd have to check the source) for IsMatch, because it can be stored "without loss of precision." The documentation for go-sql-driver/mysql explains that you will need to add parseTime=true to your DSN for DATETIME columns to be parsed into time.Time automatically, or use NullTime.
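Putting that advice together, a sketch of the struct and the DSN; the credentials are placeholders and openDB is a hypothetical helper:

type Product struct {
	Id      int64
	Name    string
	IsMatch int64     // tinyint(1); bool may also work, as noted above
	Created time.Time // requires parseTime=true in the DSN
}

func openDB() (*sql.DB, error) {
	// parseTime=true makes the driver scan DATETIME columns into time.Time.
	return sql.Open("mysql", "root:secret@/product_development?parseTime=true")
}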
How do I map the rows to a Product struct?
It should be something pretty straightforward, using Rows.Scan, like:
var products []*Product
for rows.Next() {
	p := new(Product)
	if err := rows.Scan(&p.Id, &p.Name, &p.IsMatch, &p.Created); err != nil { ... }
	products = append(products, p)
}
if err := rows.Err(); err != nil { ... }
This scans the columns into the fields of a struct and accumulates them into a slice. (Don't forget to Close the rows!)
How do I map the rows to a Product struct?
You can also use reflect to bind table rows to a struct and match values automatically, without long hard-coded SQL strings that are easy to get wrong.
Here is a light demo: sqlmapper
