Why isn't the variable overwritten? - go

If you write it this way:
func showAll(db *gorm.DB) {
    users := &[]models.User{}
    card := models.Card{}
    db.Find(users)
    for _, i := range *users {
        fmt.Println(i)
        db.Where("user_id = ?", i.ID).Find(&card)
        fmt.Println(card)
    }
}
then fmt.Println(card) always prints the first value.
But if you write it this way:
func showAll(db *gorm.DB) {
    users := &[]models.User{}
    db.Find(users)
    for _, i := range *users {
        fmt.Println(i)
        card := models.Card{}
        db.Where("user_id = ?", i.ID).Find(&card)
        fmt.Println(card)
    }
}
It prints correctly. Why? Shouldn't the &card variable be overwritten?
I wanted to print all the cards for found users.

Your first problem is that you aren't handling errors; if you were, you would know that every query after the first one results in record-not-found. Your second problem is that you don't seem to be aware that GORM has a debug mode which, if you turn it on, will show the generated SQL; if you looked at that, you would immediately spot what's wrong. Finally, your actual problem is that after the first iteration the card struct instance has a non-zero ID which, alongside the explicit i.ID, is also used in the WHERE clause.
So unless i.ID and card.ID are identical (or card.ID is zero) then:
db.Where("user_id = ?", i.ID).Find(&card).Error == ErrRecordNotFound
https://gorm.io/docs/query.html#String-Conditions
If the object’s primary key has been set, then condition query wouldn’t cover the value of primary key but use it as a ‘and’ condition. For example:
var user = User{ID: 10}
db.Where("id = ?", 20).First(&user)
// SELECT * FROM users WHERE id = 10 and id = 20 ORDER BY id ASC LIMIT 1
This query would give record not found Error. So set the primary key attribute such as id to nil before you want to use the variable such as user to get new value from database.
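Putting those pieces together, a minimal sketch of the corrected loop (assuming the models package from the question and the standard fmt and log imports) declares card inside the loop so its primary key is always zero, and checks the returned errors:
func showAll(db *gorm.DB) {
    var users []models.User
    if err := db.Find(&users).Error; err != nil {
        log.Println(err)
        return
    }
    for _, u := range users {
        fmt.Println(u)
        // A fresh card each iteration has a zero ID, so GORM will not add the
        // previous card's primary key to the WHERE clause.
        var card models.Card
        if err := db.Where("user_id = ?", u.ID).Find(&card).Error; err != nil {
            log.Println(err)
            continue
        }
        fmt.Println(card)
    }
}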


How to use Go / GORM to print SELECT query output without pre-defined struct

I am developing an API in Go which connects to a MySQL database for query execution, using GORM for the database operations. I am stuck at printing the SELECT query output for tables whose column names I don't know.
My use case is that I need to run queries on multiple tables without knowing in advance what their column names and types are, so I cannot pre-define a struct for all the current and future tables that might get added.
Is there a way to print/save the SELECT query output without a pre-defined struct?
I tried using an empty struct, but it didn't help.
P.S.: I am a beginner in Go.
type Testing struct{}

var test Testing
dsn := fmt.Sprintf("%v:%v@tcp(%v:%v)/%v", myds.DBuser, myds.DBpassword, myds.DBhost, myds.DBport, myds.DBname)
db, err := gorm.Open(mysql.Open(dsn), &gorm.Config{})
if err != nil {
    fmt.Println(err)
}
tx := db.Raw(query).Scan(&test)
if tx.Error != nil {
    fmt.Println(tx.Error)
}
fmt.Println(test)
You can use an anonymous struct.
Let's say you have a struct:
type User struct {
    FirstName string
    LastName  string
}
Query:
SELECT CONCAT(first_name, last_name) AS full_name FROM users;
Notice the new column full_name. You can simply do
var fullName = struct{ FullName string }{}
Notice the Pascal case: FullName has to be the field name. A capital letter in the middle maps to an underscore in the column name, and the field is exported (public) so Scan can access it. So full_name (query column) corresponds to FullName (struct field).
Pass this fullName value as a bucket to your Scan and it should work:
db.Raw(query).Scan(&fullName)
EDIT:
Your query will have some result, right? Let me assume that you have columns
column_one, column_two, ... column_n
Now, to get the data from all the columns (or only selected ones, if you want), you simply have to define fields in the anonymous struct with matching names. In our case:
struct{ ColumnOne, ColumnTwo, ... ColumnN interface{} }{}
P.S. I have used interface{}; you can use concrete types depending on the data your columns return.
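As a small, self-contained sketch of this approach (assuming an open *gorm.DB named db and a users table with first_name and last_name columns), you can scan straight into a slice of anonymous structs:
func printFullNames(db *gorm.DB) {
    // The exported FullName field maps to the full_name column of the query result.
    var rows []struct{ FullName string }
    tx := db.Raw("SELECT CONCAT(first_name, ' ', last_name) AS full_name FROM users").Scan(&rows)
    if tx.Error != nil {
        fmt.Println(tx.Error)
        return
    }
    for _, r := range rows {
        fmt.Println(r.FullName)
    }
}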
It worked for me using a map type with interface{} values. This let me save the SELECT query results without a pre-defined struct or the column names.
dsn := fmt.Sprintf("%v:%v@tcp(%v:%v)/%v", myds.DBuser, myds.DBpassword, myds.DBhost, myds.DBport, myds.DBname)
db, err := gorm.Open(mysql.Open(dsn), &gorm.Config{})
if err != nil {
    fmt.Println(err)
}
var result []map[string]interface{}
tx := db.Raw(query).Scan(&result)
if tx.Error != nil {
    fmt.Println(tx.Error)
    return
}
bytes, _ := json.Marshal(result)
fmt.Println(string(bytes))

Change dataType from bigquery.Value to string

I connect to BigQuery from Go as the following API documentation demonstrates:
https://cloud.google.com/bigquery/docs/reference/libraries?hl=en_US
After that, I need to get the SQL result at a specific row and column and check whether it equals a specific string. Can I convert a bigquery.Value to a string, and how do I do that?
See the documentation for RowIterator.Next():
Next loads the next row into dst. Its return value is iterator.Done if there are no more results. Once Next returns iterator.Done, all subsequent calls will return iterator.Done.
dst may implement ValueLoader, or may be a *[]Value, *map[string]Value, or struct pointer.
Value is of type interface{}, so if you are sure that the value you have is a string, str := fmt.Sprintf("%v", row[i]) should work. It is often better to define a struct type whose members represent the fields of a query result row (with types mapped according to the table in the documentation linked above), and pass a pointer to it to RowIterator.Next() instead of a slice/map of bigquery.Value.
type myRow struct {
    Name string
    Num  int
}

// ...

q := client.Query("select name, num from t1")
it, err := q.Read(ctx)
// handle err
for {
    // instead of: var row []bigquery.Value
    var row myRow // <-- use custom struct type here
    err := it.Next(&row)
    if err == iterator.Done {
        break
    }
    // handle err != nil
    someFuncThatTakesString(row.Name)
}
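If you do stay with the *[]bigquery.Value form, a rough sketch of the comparison described in the question (the column index 0 and the expected string are just placeholders) could look like this:
var row []bigquery.Value
for {
    err := it.Next(&row)
    if err == iterator.Done {
        break
    }
    // handle err != nil

    // bigquery.Value is interface{}; a type assertion recovers the string,
    // and fmt.Sprintf is a fallback for non-string values.
    s, ok := row[0].(string)
    if !ok {
        s = fmt.Sprintf("%v", row[0])
    }
    if s == "expected" {
        // found the row we were looking for
    }
}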

Query result is memory address

I'm new to Go and still confused about pointers, but I have followed the instructions for querying multiple rows; the result I get back is a series of memory addresses instead of actual values.
This same structure, minus the rows.Next(), works just fine for a single user, so I'm confused as to the origin of the problem here.
Ultimately I'm trying to use the results of the function in a template, but first I'm trying to figure out the structure of the result so I can range over it in my HTML.
For example, if I try to run the code below, I get something like: &{[0xc... 0xc... 0xc...]}
type User struct {
    Id   int    `json:"int"`
    Name string `json:"name"`
    Role string `json:"role"`
}

type Users struct {
    Users []*User
}

func getUsers(company string) *Users {
    users := Users{}
    rows, err := db.Query("SELECT Id, Name, Role...")
    // Check err
    defer rows.Close()
    for rows.Next() {
        user := &User{}
        err = rows.Scan(&user.Id, &user.Name, &user.Role)
        // Check err
        users.Users = append(users.Users, user)
    }
    err = rows.Err()
    // Check err
    return &users
}
This is how I'm attempting to use the function:
func userView(w http.ResponseWriter, r *http.Request) {
    res := getUsers("test") // Should return 3 results
    fmt.Println(res.Users)
}
The problem isn't in your fetching of the data; it's in your display of the data. fmt.Println() prints memory addresses when given pointers, so it's behaving exactly as expected.
If you instead do:
fmt.Printf("%+v", res.Users)
you'll get a different result, probably closer to what you expect.
If you're planning to use a template, then you should do so; your template should be able to access the fields of each User just fine.
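For example, a minimal sketch of the template route inside userView (the template text here is just an illustration, assuming the standard html/template and log packages):
tmpl := template.Must(template.New("users").Parse(
    "{{range .Users}}{{.Name}} ({{.Role}})\n{{end}}"))
if err := tmpl.Execute(w, getUsers("test")); err != nil {
    log.Println(err)
}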
But the short answer is: your testing method is invalid.
The Users type holds a slice of pointers, so if you print the return value of getUsers it looks like a bunch of memory addresses. This is OK.
If you want to print something more meaningful, write a String() method for Users in which you dereference each pointer and build a string containing the struct fields.
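A minimal sketch of such a String() method, assuming the User fields from the question and the standard fmt and strings packages:
func (u *Users) String() string {
    var b strings.Builder
    for _, user := range u.Users {
        // Dereference each pointer and write its fields into the output.
        fmt.Fprintf(&b, "{Id: %d, Name: %s, Role: %s}\n", user.Id, user.Name, user.Role)
    }
    return b.String()
}
Because getUsers returns a *Users, fmt.Println(res) will then use this method and print field values instead of addresses.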

LoadRelated of a list in beego

I am wondering what the correct approach is for loading related fields in beego.
The doc explains it like this:
type User struct {
    Id    int
    Name  string
    Posts []*Post `orm:"reverse(many)"`
}

user := User{Id: 1}
err := dORM.Read(&user)
num, err := dORM.LoadRelated(&user, "Posts")
This makes sense as long as I only query one record. What is the correct way to fetch related fields when I query all users?
A possible solution would be like this:
var users []*User
o.QueryTable(new(User)).All(&users)
for _, user := range users {
    o.LoadRelated(user, "Posts")
}
However, this means I have to loop over the complete list every time and make a separate DB query for every record to load its related posts.
Any suggestions? Thanks!

Writing generic data access functions in Go

I'm writing code that provides data access from a database. However, I find myself repeating the same code for similar types and fields. How can I write generic functions to avoid this?
For example, here is what I want to achieve:
type Person struct{ FirstName string }
type Company struct{ Industry string }

func getItems(typ string, field string, val string) []interface{} {
    ...
}

var persons []Person
persons = getItems("Person", "FirstName", "John")

var companies []Company
companies = getItems("Company", "Industry", "Software")
So you're definitely on the right track with the idea of returning a slice of empty interface (interface{}) values. However, you're going to run into problems when you try accessing specific members or calling specific methods, because you're not going to know what type you're looking at. This is where type assertions come in very handy. To extend your code a bit:
func getPerson(typ string, field string, val string) []Person {
    slice := getItems(typ, field, val)
    output := make([]Person, 0)
    for _, item := range slice {
        // Type assertion!
        thing, ok := item.(Person)
        if ok {
            output = append(output, thing)
        }
    }
    return output
}
So what that does is perform a generic search and then keep only those items which are of the correct type. Specifically, the type assertion:
thing, ok := item.(Person)
checks whether the variable item holds a Person, and if it does, it returns the value and true; otherwise it returns the zero value and false (so checking ok tells us whether the assertion succeeded).
You can actually, if you want, take this a step further and define the getItems() function in terms of another boolean function. Basically, the idea is to have getItems() run the function passed to it on each element in the database and only add an element to the results if the function returns true:
func getItems(criteria func(interface{}) bool) []interface{} {
    output := make([]interface{}, 0)
    for _, item := range database {
        if criteria(item) {
            output = append(output, item)
        }
    }
    return output
}
(honestly, if it were me, I'd do a hybrid of the two which accepts a criteria function but also accepts the field and value strings)
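A rough sketch of that hybrid, under the same assumptions as the snippets above (database is the hypothetical data source, and matchesField is a hypothetical helper that would compare the named field of an item, e.g. via reflection, against val):
// getItemsWhere filters by field/value strings and also applies an optional
// criteria function. matchesField is a hypothetical helper, not shown here.
func getItemsWhere(field, val string, criteria func(interface{}) bool) []interface{} {
    output := make([]interface{}, 0)
    for _, item := range database {
        if matchesField(item, field, val) && (criteria == nil || criteria(item)) {
            output = append(output, item)
        }
    }
    return output
}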
joshlf13 has a great answer. I'd expand a little on it, though, to maintain some additional type safety. Instead of a criteria function I would use a collector function.
// typed output slice, no interfaces
output := []string{}

// collector that populates our output slice as needed
collect := func(i interface{}) {
    // The only non-type-safe part of the program is limited to this function.
    if val, ok := i.(string); ok {
        output = append(output, val)
    }
}

// getItem uses the collector
func getItem(collect func(interface{})) {
    for _, item := range database {
        collect(item)
    }
}

getItem(collect) // perform our get and populate the output slice from above
This has the benefit of not requiring you to loop through your interface{} slice after a call to getItems and do yet another cast.
