SQLMock - Mocking out chained DELETE query - go

I'm currently trying to write a test for a database query, but I'm stuck trying to figure out how to mock it out.
I have the following struct (for your reference):
type User struct {
    ID        uint      `gorm:"column:id;primary_key"`
    CreatedAt time.Time `gorm:"column:created_at"`
    UpdatedAt time.Time `gorm:"column:updated_at"`
    GroupID   uint      `gorm:"column:group_id" sql:"index"`
}
and I have a query that deletes all Users with the same GroupID:
/*
user is an empty User struct
groupID is a uint declared during test initialization (in this case, set to 1)
*/
err := d.db.Where("group_id = ?", groupID).Delete(&user).GetErrors()
Apparently the above query results in the following (taken from the test error):
call to ExecQuery 'DELETE FROM "users" WHERE (group_id = $1)' with args [{Name: Ordinal:1 Value:1}]
I can match the query string, but I'm unable to match the argument since it is passed in as a struct. Is it possible to mock this call out with go-sqlmock, or do I have to change my query to be able to mock it out?

Yes, this can be mocked with go-sqlmock.
For this query: err = s.Db.Delete(&user).Error
I mocked it like this:
db, mock, err := sqlmock.New()
if err != nil {
    t.Fatalf("an error '%s' was not expected when opening a stub database connection", err)
}
defer db.Close()

mock.ExpectBegin()
mock.ExpectExec(regexp.QuoteMeta("DELETE")).WithArgs(1).WillReturnResult(sqlmock.NewResult(1, 1))
mock.ExpectCommit()

gormdb, err := gorm.Open("postgres", db)
if err != nil {
    t.Errorf("Failed to open gorm db, got error: %v", err)
}
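Applied to the query from the original question, the expectations could look like the sketch below. This assumes GORM v1 with the postgres dialect (matching the answer above) and groupID set to 1; the regexp.QuoteMeta pattern is taken from the query string in the test error, and gormdb stands in for the question's d.db.

mock.ExpectBegin()
mock.ExpectExec(regexp.QuoteMeta(`DELETE FROM "users" WHERE (group_id = $1)`)).
    WithArgs(1).
    WillReturnResult(sqlmock.NewResult(0, 1))
mock.ExpectCommit()

// Run the query under test against the mocked gorm DB and verify expectations.
user := User{}
errs := gormdb.Where("group_id = ?", 1).Delete(&user).GetErrors()
if len(errs) > 0 {
    t.Errorf("unexpected errors: %v", errs)
}
if err := mock.ExpectationsWereMet(); err != nil {
    t.Errorf("unfulfilled expectations: %v", err)
}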

Related

How do I pass a protobuf object as a row to sqlmock.AddRow in golang?

I am trying to unit test my go code using sqlmock.
Here is the original code that I am trying to test.
func enrollCourse(db *gorm.DB, user_id string, course_id string) error {
    user := &usermodels.User{}
    ref := db.First(user, "uuid = ?", user_id)
    userPb := user.Protobuf()
    fmt.Printf("user name %+v", user.name)
    ....
}
Here is my unit test
func TestEnrollCourse(t *testing.T) {
    db, mock, err := sqlmock.New()
    if err != nil {
        t.Fatalf("an error '%s' was not expected when opening a stub database connection", err)
    }
    defer db.Close()

    rows := sqlmock.NewRows([]string{"user_id", "user_name"}).
        AddRow(1, "John")
    mock.ExpectQuery(regexp.QuoteMeta(`SELECT * FROM "users" WHERE uuid = $1`)).WithArgs("user-fd3746c8-d32f-4fb8-8f6a-b6d72dcf2969").WillReturnRows(rows)

    gdb, err := gorm.Open(postgres.New(postgres.Config{
        Conn: db,
    }), &gorm.Config{})
    enrollCourse(gdb, "user-fd3746c8-d32f-4fb8-8f6a-b6d72dcf2969", "english_course")
    ....
}
I am expecting fmt.Printf("user name %+v", user.name) to print the user's name, but it's nil.
How do I correctly pass a protobuf object to AddRow?
The data type you used for AddRow appears to be a plain string. The best way to use protobuf with sqlmock is to create the object with the constructor generated in your stub file, serialise it into a string, and store that. Then, in the actual method (enrollCourse), you can read the serialised message back and unmarshal it into the appropriate model.
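A minimal sketch of that approach, assuming a hypothetical generated stub package userpb and the google.golang.org/protobuf module; the column names and message fields are illustrative, not the asker's actual schema:

import (
    "regexp"
    "testing"

    "github.com/DATA-DOG/go-sqlmock"
    "google.golang.org/protobuf/proto"

    userpb "example.com/project/gen/userpb" // hypothetical generated stub package
)

func mockUserRow(t *testing.T, mock sqlmock.Sqlmock) {
    // Build the protobuf object with the type generated from the stub file.
    u := &userpb.User{Name: "John"}

    // Serialise it so it can be stored in a single column of the mocked row.
    raw, err := proto.Marshal(u)
    if err != nil {
        t.Fatalf("failed to marshal protobuf: %v", err)
    }

    // The code under test (enrollCourse) would read this column back and
    // proto.Unmarshal it into the appropriate model.
    rows := sqlmock.NewRows([]string{"user_id", "user_proto"}).AddRow(1, raw)
    mock.ExpectQuery(regexp.QuoteMeta(`SELECT * FROM "users" WHERE uuid = $1`)).
        WithArgs("user-fd3746c8-d32f-4fb8-8f6a-b6d72dcf2969").
        WillReturnRows(rows)
}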

call to Rollback transaction, was not expected, next expectation is: ExpectedQuery

I am trying to write the test below; other tests work fine, however I am having problems with the UPDATE query.
func TestDeleteWorkspace(t *testing.T) {
    conn, mock, repository, err := setup()
    defer conn.Close()
    assert.NoError(t, err)

    uid := uuid.New()

    // mock.ExpectBegin()
    mock.ExpectQuery(regexp.QuoteMeta(`UPDATE "workspaces" SET`)).WithArgs(sqlmock.AnyArg(), uid)
    // mock.ExpectCommit()

    var e bool
    e, err = repository.Delete(uid)
    assert.NoError(t, err)
    assert.True(t, e)

    err = mock.ExpectationsWereMet()
    assert.NoError(t, err)
}
repository.Delete does this query
func (r *WorkspaceRepository) Delete(id any) (bool, error) {
    if err := r.db.Delete(&model.Workspace{}, "id = ?", id).Error; err != nil {
        return false, nil
    }
    return true, nil
}
Which runs this query
UPDATE "workspaces" SET "deleted_at"='2022-07-04 09:09:20.778' WHERE id = 'c4610193-b43a-4ed7-9ed6-9d67b3f97502' AND "workspaces"."deleted_at" IS NULL
I am using Soft-Delete, that is why it is an UPDATE and not a DELETE query
However, I get the following error
workspace_test.go:169:
Error Trace: workspace_test.go:169
Error: Received unexpected error:
there is a remaining expectation which was not matched: ExpectedQuery => expecting Query, QueryContext or QueryRow which:
- matches sql: 'UPDATE "workspaces" SET'
- is with arguments:
0 - 28e7aa46-7a22-4dc7-b3ce-6cf02af525ca
1 - {}
What am I doing wrong?
EDIT: It is a soft-delete operation, that is why it is an UPDATE and not a DELETE.
My model
type Workspace struct {
    ID        uuid.UUID      `gorm:"type:uuid;default:uuid_generate_v4()" json:"id"`
    Name      string         `gorm:"not null,type:text" json:"name"`
    CreatedAt time.Time      `gorm:"autoCreateTime" json:"create_time"`
    UpdatedAt time.Time      `gorm:"autoUpdateTime" json:"update_time"`
    DeletedAt gorm.DeletedAt `gorm:"index,->" json:"-"`
}
The error message is quite self-explanatory.
This is your query:
'UPDATE "workspaces" SET "deleted_at"=$1 WHERE id = $2 AND "workspaces"."deleted_at" IS NULL'
It includes 2 arguments:
"deleted_at"=$1 WHERE id = $2
You set only 1 in your SQL mock:
.WithArgs(uid)
You need to send both arguments in the mock.
It is not reliable to use time.Now() in a test, because that value will be a few nanoseconds different from the value generated in the code under test, and the test will fail.
The quick and dirty fix is to use sqlmock.AnyArg():
.WithArgs(sqlmock.AnyArg(), uid)
A more sophisticated alternative is to write a custom Argument matcher that checks the type and compares the value with time.Now(); the difference should be less than a few seconds.
See an example: https://github.com/DATA-DOG/go-sqlmock#matching-arguments-like-timetime
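A sketch of such a matcher, adapted from the README example linked above; the name RecentTime and the five-second window are illustrative assumptions:

import (
    "database/sql/driver"
    "time"
)

// RecentTime matches any time.Time argument within a few seconds of now.
type RecentTime struct{}

// Match satisfies the sqlmock.Argument interface.
func (RecentTime) Match(v driver.Value) bool {
    t, ok := v.(time.Time)
    if !ok {
        return false
    }
    d := time.Since(t)
    return d > -5*time.Second && d < 5*time.Second
}

It can then be used in the expectation in place of sqlmock.AnyArg(), e.g. .WithArgs(RecentTime{}, uid).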

How to design RestAPI for too many tables in Golang

I think that if I keep using the method below, I'll have to write too much code.
I declared structs for all the tables, and I used the Go validator package for validation.
[types.go]
type TableA struct {
    Field1 string `json:"field1" validate:"required, max=10"`
    Field2 int    `json:"field2" validate:"number"`
}
type TableB struct {
    ...
}
And I initialized the router for each method and connected the handlers.
[tableA.go]
router.Get("/table-a", r.Get_tableA_Handler),
router.Post("/table-a", r.Post_tableA_Handler),
router.Patch("/table-a", r.Patch_tableA_Handler),
router.Delete("/table-a", r.Delete_tableA_Handler)
...
Each handler parses the JSON in the request body, validates the data, and calls the DB function.
[tableA_router.go]
func (rt *tableARouter) Post_tableA_Handler(w http.ResponseWriter, r *http.Request) error {
    // JSON to struct
    req := new(types.TableA)
    if err := httputils.DecodeJsonBody(r, req); err != nil {
        return err
    }
    // Validation
    if err := validCheck(req); err != nil {
        return err
    }
    // DB function
    err := rt.insert_tableA_DB(r.Context(), req)
    if err != nil {
        return err
    }
    return rt.rd.JSON(w, http.StatusCreated, "Create Success")
}
...
func validCheck(data interface{}) error {
    validate := validator.New()
    err := validate.Struct(data)
    return err
}
This is a DB function called from the handler function above (using Gorm)
[tableA_db.go]
func (rt *tableARouter) insert_tableA_DB(ctx context.Context, data *types.TableA) error {
    // DB connect
    db, err := db.Open(rt.dbcfg)
    if err != nil {
        return err
    }
    defer db.Close()

    tx := db.Begin()
    defer tx.Rollback()

    // == INSERT ==
    query := `INSERT INTO table_a
        (field1, field2, ...)
        VALUES (?, ?, ...)`
    result := tx.WithContext(ctx).Exec(query,
        data.Field1, data.Field2, ...)

    // Result
    if result.Error != nil {
        ...
    }
There are too many tables now... If there are 100 tables, I have to write 100 handlers and 100 DB functions.
Is there any way to use something like /tables/{tableName}?
Please give me any advice... Thank you.
You can use an ORM package, like GORM, to make your work easier.
Or you can make a universal handler and, with the reflect package, analyze your defined structs and build every SQL query dynamically. But it's not the best solution: if any of your structs has inner slices or other embedded structs, or if you need joined tables, you still have to deal with that manually. I have servers with more than 200 endpoints, 300-400 methods and 200+ SQL tables, and the whole server was written by hand. But I can say it's very rare that a handler and the DB function can be reused without modification.
Maybe you can wrap the error handling, rollback/commit, JSON parsing and response-writing parts in a func, then use it to call the DB methods, as in the sketch below.
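A minimal sketch of that wrapping idea, assuming Go 1.18+ generics, the go-playground/validator package, and the handler/DB-function shapes from the question; the names and the plain net/http usage are illustrative:

import (
    "context"
    "encoding/json"
    "net/http"

    "github.com/go-playground/validator/v10"
)

// handleCreate wraps the JSON decode, validation and DB-insert steps that every
// POST handler in the question repeats, so only the table-specific insert
// function has to be written per table.
func handleCreate[T any](insert func(context.Context, *T) error) http.HandlerFunc {
    return func(w http.ResponseWriter, r *http.Request) {
        // JSON to struct
        req := new(T)
        if err := json.NewDecoder(r.Body).Decode(req); err != nil {
            http.Error(w, err.Error(), http.StatusBadRequest)
            return
        }
        // Validation
        if err := validator.New().Struct(req); err != nil {
            http.Error(w, err.Error(), http.StatusBadRequest)
            return
        }
        // DB function
        if err := insert(r.Context(), req); err != nil {
            http.Error(w, err.Error(), http.StatusInternalServerError)
            return
        }
        w.WriteHeader(http.StatusCreated)
        _ = json.NewEncoder(w).Encode("Create Success")
    }
}

// Hypothetical usage with the router from the question:
// router.Post("/table-a", handleCreate(rt.insert_tableA_DB))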

How can I do conditional actions in SQLX

I have database store function:
func (p *ProductsRep) FindAll(PageNumber int, PaginationSize int, Query string) []*postgresmodels.Product {
Also I have SQL query look like this:
SELECT * FROM table_name.
Then I want to append a condition such as WHERE some_value=3 when some value (in this case Query) is set, so that I get SELECT * FROM table_name WHERE some_value=3.
I tried to use fmt.Sprintf to concatenate, or strings.Join, or bytes.Buffer.WriteString, but every time I get this error
(I replaced the real value for readability):
pq: column "Some value" does not exist.
How can I build "adaptive" queries that depend on the function's input values?
I believe you are trying to query rows in the database by using parameters.
You need to make sure you don't pass this data in as raw values, because of the potential risk of SQL injection. You can also make such queries with stored procedures.
You can use the Query function to pass in your query with your parameters. In the example case this is $1. If you wanted to, you could add $2, $3... etc., depending on how many parameters you want to query with.
Here are two examples.
Postgres
using "github.com/jackc/pgx/v4" driver
ctx := context.Background()

type Bar struct {
    ID        int64
    SomeValue string
}

rows, err := conn.Query(ctx, `SELECT * FROM main WHERE some_value=$1`, "foo")
if err != nil {
    fmt.Println("ERROR")
    panic(err) // handle error
}
defer rows.Close()

var items []Bar
for rows.Next() {
    var someValue string
    var id int64
    if err := rows.Scan(&id, &someValue); err != nil {
        log.Fatal(err) // handle error
    }
    item := Bar{
        ID:        id,
        SomeValue: someValue,
    }
    items = append(items, item)
}
fmt.Println(items)
MySQL Driver
https://golang.org/pkg/database/sql/#DB.QueryRow
type Bar struct {
    ID        int64
    SomeValue string
}

// Note: the MySQL driver uses ? placeholders rather than $1.
rows, err := conn.Query(`SELECT * FROM main WHERE some_value=?`, "foo")
if err != nil {
    fmt.Println("ERROR")
    panic(err) // handle error
}
defer rows.Close()

var items []Bar
for rows.Next() {
    var someValue string
    var id int64
    if err := rows.Scan(&id, &someValue); err != nil {
        log.Fatal(err) // handle error
    }
    item := Bar{
        ID:        id,
        SomeValue: someValue,
    }
    items = append(items, item)
}
fmt.Println(items)
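To make the query "adaptive" as the question asks, the condition and its bound parameter can be appended together only when Query is set. A minimal sketch, assuming the pgx connection and ctx from the first example and the Query argument from the FindAll signature:

query := `SELECT * FROM table_name`
args := []interface{}{}
if Query != "" {
    // Only add the WHERE clause and its parameter when a filter was given.
    query += ` WHERE some_value = $1`
    args = append(args, Query)
}
rows, err := conn.Query(ctx, query, args...)
if err != nil {
    panic(err) // handle error
}
defer rows.Close()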

Partial updates of objects

I want to enable update functionality for my User object in my fiber/gorm backend. It works fine when I update all fields together using the Save function. However, when I do not have all fields present in the update request (for example only the Birthday field but not the Phone field) it overwrites the rest of the fields with their respective null values.
func UserUpdateByID(c *fiber.Ctx) error {
    db := database.DBConn

    // Parse the body to fit the user entity
    user := entities.User{}
    if err := c.BodyParser(&user); err != nil {
        return c.Status(500).SendString(err.Error())
    }

    // Update record
    record := db.Save(&user)
    if record.Error != nil {
        return c.Status(500).SendString(record.Error.Error())
    }
    return c.JSON(record.Value)
}
When I change the line with record := db.Save(&user) to
mappedData, _ := StructToMap(user)
record := db.Model(&entities.User{}).Update(mappedData)
I receive the error that Update cannot handle a map of interfaces: sql: converting argument $10 type: unsupported type map[string]interface {}, a map
Update 1:
The mentioned StructToMap function looks like this:
func StructToMap(obj interface{}) (newMap map[string]interface{}, err error) {
    data, err := json.Marshal(obj)
    if err != nil {
        return
    }
    err = json.Unmarshal(data, &newMap) // Convert to a map
    return
}
Update 2:
The User object looks like:
type User struct {
    gorm.Model
    Identity     string
    Birthday     time.Time
    Phone        string
    City         string
    ...
    ActivityData []Activity
}
Looking at the GORM docs (https://gorm.io/docs/update.html), you can do something like this:
Use Updates instead of Update.
db.Model(&user).Updates(User{Name: "hello", Age: 18, Active: false})
You can also use db.Debug() to show the final query that GORM generates and see if it matches what you are expecting.
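Applied to the handler above, a minimal sketch could look like this; it relies on the documented behaviour that Updates with a struct only writes non-zero fields, while the Where clause on the parsed ID and the response body are assumptions about how the record is identified:

func UserUpdateByID(c *fiber.Ctx) error {
    db := database.DBConn

    // Parse only the fields present in the request body.
    user := entities.User{}
    if err := c.BodyParser(&user); err != nil {
        return c.Status(500).SendString(err.Error())
    }

    // Updates with a struct skips zero-value fields, so fields missing from
    // the request (e.g. Phone) are left untouched instead of being nulled.
    record := db.Model(&entities.User{}).Where("id = ?", user.ID).Updates(user)
    if record.Error != nil {
        return c.Status(500).SendString(record.Error.Error())
    }
    return c.JSON(user)
}

Note that, with a struct, genuine zero values such as false or 0 are also skipped; the GORM docs suggest passing a map to Updates when those need to be written.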

Resources