How to instantiate models with DB connection - go

Constraints:
Do not have everything in "main.go"; instead use separate handler and model files for each request, e.g. register_post.go (handler) and user.go (model).
Do not use globals.
Use dependency injection for easy testing.
Here is my code so far. It works if you only use one model. But how can I extend this to allow for another handler file and another model?
main.go
// Initialise Env with a models.BookModel instance (which in turn wraps
// the connection pool).
env := &handler.Env{
    Books: models.BookModel{DB: db},
}
// route call
v1.GET("/books", env.BooksIndex)
books_get.go handler
// Env wraps the dependencies (such as the models.BookModel instance,
// which in turn wraps the connection pool) that the handlers need.
type Env struct {
    // Replace the reference to models.BookModel with an interface
    // describing its methods.
    Books interface {
        All() ([]models.Book, error)
    }
}

func (env *Env) BooksIndex(c echo.Context) error {
    // Execute the SQL query by calling the All() method.
    bks, err := env.Books.All()
    if err != nil {
        fmt.Println(err)
        return err
    }
    return c.JSON(http.StatusOK, bks)
}
books.go models
func (m BookModel) All() ([]Book, error) {
    // some body
}
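One way to extend the pattern (a sketch, not from the original post): give Env one small interface field per model, and wire each concrete model to the shared connection pool in main.go. The Users field, models.UserModel, and RegisterPost handler below are hypothetical names chosen to match the register_post.go / user.go layout described above.
// handler/env.go (sketch): one interface field per model the handlers need.
type Env struct {
    Books interface {
        All() ([]models.Book, error)
    }
    // Hypothetical second dependency, backed by user.go in the models package.
    Users interface {
        Create(u models.User) error
    }
}

// main.go (sketch): both models wrap the same *sql.DB pool.
env := &handler.Env{
    Books: models.BookModel{DB: db},
    Users: models.UserModel{DB: db},
}
v1.GET("/books", env.BooksIndex)
v1.POST("/register", env.RegisterPost) // handler defined in register_post.go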

Related

When is the right time to run AutoMigrate with GORM?

Most Go/GORM examples I've seen show AutoMigrate being called immediately after opening the database connection, including the GORM documentation here. For API services, this would be an expensive/unwanted call with every API request. So I assume that, for API services, AutoMigrate should be removed from the regular flow and handled separately. Is my understanding correct?
From GORM Documentation
...
db, err := gorm.Open(sqlite.Open("test.db"), &gorm.Config{})
if err != nil {
    panic("failed to connect database")
}

// Migrate the schema
db.AutoMigrate(&Product{})
...
It wouldn't happen on every API request. Not even close. It'd happen every time the application is started, so basically: connect to the DB in main, and run AutoMigrate there. Pass the connection as a dependency to your handlers/service packages/wherever you need them. The HTTP handler can just access it there.
Basically this:
package main

func main() {
    db, err := gorm.Open(sqlite.Open("test.db"), &gorm.Config{})
    if err != nil {
        fmt.Printf("Failed to connect to DB: %v", err)
        os.Exit(1)
    }

    // see below how this is handled
    fRepo := foo.New(db) // all repos here
    fRepo.Migrate()      // this handles migrations

    // create request handlers
    fHandler := handlers.NewFoo(fRepo) // migrations have already been handled

    mux := http.NewServeMux()
    mux.HandleFunc("/foo/list", fHandler.List) // set up handlers

    // start server etc...
}
Have the code that interacts with the DB in some package like this:
package foo

// The DB connection interface as you use it
// (signatures here mirror the *gorm.DB methods called below).
type Connection interface {
    Create(value interface{}) *gorm.DB
    Find(dest interface{}, conds ...interface{}) *gorm.DB
    AutoMigrate(dst ...interface{}) error
}

type Foo struct {
    db Connection
}

func New(db Connection) *Foo {
    return &Foo{
        db: db,
    }
}

func (f *Foo) Migrate() {
    f.db.AutoMigrate(&Stuff{}) // all types this repo deals with go here
}

func (f *Foo) GetAll() ([]Stuff, error) {
    ret := []Stuff{}
    res := f.db.Find(&ret)
    return ret, res.Error
}
Then have your handlers structured in a sensible way, and provide them with the repository (aka foo package stuff):
package handlers

type FooRepo interface {
    GetAll() ([]foo.Stuff, error)
}

type FooHandler struct {
    repo FooRepo
}

func NewFoo(repo FooRepo) *FooHandler {
    return &FooHandler{
        repo: repo,
    }
}

func (f *FooHandler) List(res http.ResponseWriter, req *http.Request) {
    all, err := f.repo.GetAll()
    if err != nil {
        res.WriteHeader(http.StatusInternalServerError)
        io.WriteString(res, err.Error())
        return
    }
    // write response as needed, e.g. JSON-encode the result:
    _ = json.NewEncoder(res).Encode(all)
}
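Because FooHandler depends only on the small FooRepo interface, the handler can be unit-tested without a real database. A minimal sketch, assuming the packages above plus the standard testing, net/http and net/http/httptest imports (stubRepo is a made-up test double):
type stubRepo struct {
    stuff []foo.Stuff
}

func (s stubRepo) GetAll() ([]foo.Stuff, error) {
    return s.stuff, nil
}

func TestList(t *testing.T) {
    h := handlers.NewFoo(stubRepo{stuff: []foo.Stuff{{}}})

    req := httptest.NewRequest(http.MethodGet, "/foo/list", nil)
    rec := httptest.NewRecorder()

    h.List(rec, req)

    if rec.Code != http.StatusOK {
        t.Fatalf("got status %d, want %d", rec.Code, http.StatusOK)
    }
}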
Whenever you deploy an updated version of your application, the main function will call AutoMigrate, and the application will handle requests without constantly re-connecting to the DB or attempting to handle migrations time and time again.
I don't know why you'd think that your application would have to run through the setup for each request, especially given that your main function (or some function you call from main) explicitly creates an HTTP server, and listens on a specific port for requests. The DB connection and subsequent migrations should be handled before you start listening for requests. It's not part of handling requests, ever...

Is this dependency injection pattern thread-safe?

I'm having a hard time coming up with a clean pattern to inject dependencies in a REST server that allows me to write isolated unit tests. The below structure seems to work but I'm not sure if it's thread safe.
store:
package store

type InterfaceStore interface {
    Connect()
    Disconnect()
    User() interfaceUser
}

// Wiring up
type store struct {
    db *mongo.Database
}

func (s *store) Connect() {
    client, err := mongo.Connect()
    if err != nil {
        log.Fatal(err.Error())
    }
    s.db = client.Database()
}

func (s *store) Disconnect() {
    s.db.Client().Disconnect(context.TODO())
}

func (s *store) User() interfaceUser {
    return &user{s.db}
}

// Exposed from the package to create a store instance
func GetStore() InterfaceStore {
    return &store{}
}

// User related
type interfaceUser interface {
    InsertOne(models.User) (primitive.ObjectID, error)
    FindOne(bson.M) (*models.User, error) // used by the services package below
}

type user struct {
    db *mongo.Database
}

func (u *user) InsertOne(user models.User) (primitive.ObjectID, error) {
    collection := u.db.Collection(collectionUsers)
    // persisting user in DB
}
server:
package server

type server struct{}

func (s *server) Start() {
    storeInstance := store.GetStore()
    storeInstance.Connect()
    defer storeInstance.Disconnect()

    r := gin.Default()
    keys := keys.GetKeys()
    routes.InitRoutes(r, storeInstance)
    port := fmt.Sprintf(":%s", keys.PORT)
    r.Run(port)
}

func CreateInstance() *server {
    return &server{}
}
routes:
package routes

func InitRoutes(router *gin.Engine, store store.InterfaceStore) {
    router.Use(middlewares.Cors)
    // createSubrouter creates a Gin routerGroup with the prefix "/user"
    userRoutes(createSubrouter("/user", router), store)
}

func userRoutes(router *gin.RouterGroup, store store.InterfaceStore) {
    controller := controllers.GetUserController(store)
    router.GET("/", controller.Get)
}
controllers:
package controllers

type userControllers struct {
    UserService services.InterfaceUser
}

func (u *userControllers) Get(c *gin.Context) {
    userData, _ := c.Get("user") // assumed to be set by an auth middleware
    userDetails, _ := u.UserService.FetchAllInformation(bson.M{"_id": userData.(models.User).ID})
    utils.RespondWithJSON(c, userDetails)
}

func GetUserController(store store.InterfaceStore) userControllers {
    userService := services.GetUserService(store)
    return userControllers{
        UserService: &userService,
    }
}
services:
package services

type InterfaceUser interface {
    FetchAllInformation(bson.M) (*models.User, error)
}

type user struct {
    store store.InterfaceStore
}

func (u *user) FetchAllInformation(filter bson.M) (*models.User, error) {
    user, err := u.store.User().FindOne(filter)
    if err != nil {
        return nil, err
    }
    return user, nil
}

func GetUserService(store store.InterfaceStore) user {
    return user{
        store: store,
    }
}
By using interfaces I'm able to mock the entire service when writing tests for the controller and I can mock the entire store to test the service component without hitting the DB.
I'm wondering if the store instance is safely shared across the code, because the interfaces are not pointers. Does that mean a copy of the store is created every time I pass it down the tree?
The type user struct {...} definition only states that store is anything that implements the store.InterfaceStore interface.
If you look carefully, you're implementing it with pointer receivers. That means the instance pointed to by the receiver will be shared.
If your mock implements the methods on the value type instead, it will be copied on each method call and you'll be safe, but it also means the mock won't hold any new state after the method calls, which I guess is not what you want.
Bottom line: it's not really about how you declared the field in the struct, by value or by reference, but about what the methods accept as receiver.
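A minimal standalone illustration of that point (a sketch, not from the original thread): methods with a pointer receiver mutate the shared value, while a value receiver works on a copy.
package main

import "fmt"

type counter struct {
    n int
}

// Pointer receiver: every caller holding the same *counter sees the update.
func (c *counter) IncShared() {
    c.n++
}

// Value receiver: the method gets a copy, so the caller's value is unchanged.
func (c counter) IncCopy() {
    c.n++
}

func main() {
    c := &counter{}
    c.IncShared()
    c.IncShared()
    c.IncCopy()      // no visible effect on c
    fmt.Println(c.n) // prints 2
}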

Wrapping a db object in Go and running two methods in the same transaction

In the effort of learning Go a bit better, I am trying to refactor a series of functions which accept a DB connection as the first argument into struct methods and something a bit more "idiomatically" Go.
Right now my "data store" methods are something like this:
func CreateA(db orm.DB, a *A) error {
    _, err := db.Exec("INSERT...")
    return err
}

func CreateB(db orm.DB, b *B) error {
    _, err := db.Exec("INSERT...")
    return err
}
These functions work perfectly fine; orm.DB is the DB interface of go-pg.
Since the two functions accept a db connection I can either pass an actual connection or a transaction (which implements the same interface). I can be sure that both functions issuing SQL INSERTs run in the same transaction, avoiding having inconsistent state in the DB in case either one of them fails.
The trouble started when I decided to read more about how to structure the code a little better and to make it "mockable" in case I need to.
So I googled a bit, read the article Practical Persistence in Go: Organising Database Access and tried to refactor the code to use proper interfaces.
The result is something like this:
type Store interface {
    CreateA(a *A) error
    CreateB(b *B) error
}

type DB struct {
    orm.DB
}

func NewDBConnection(p *ConnParams) (*DB, error) {
    // ... create db connection ...
    return &DB{db}, nil
}

func (db *DB) CreateA(a *A) error {
    // ...
}

func (db *DB) CreateB(b *B) error {
    // ...
}
which allows me to write code like:
db := NewDBConnection()
db.CreateA(a)
db.CreateB(b)
instead of:
db := NewDBConnection()
CreateA(db, a)
CreateB(db, b)
The actual issue is that I lost the ability to run the two functions in the same transaction. Before I could do:
pgDB := db.DB.(*pg.DB) // convert the interface to an actual connection
pgDB.RunInTransaction(func(tx *pg.Tx) error {
    if err := CreateA(tx, a); err != nil {
        return err
    }
    return CreateB(tx, b)
})
or something like:
tx := db.DB.Begin()
err = CreateA(tx, a)
if err == nil {
    err = CreateB(tx, b)
}
if err != nil {
    tx.Rollback()
} else {
    tx.Commit()
}
which is more or less the same thing.
Since the functions were accepting the common interface between a connection and a transaction, I could abstract the transaction logic away from my model layer, sending down either a full connection or a transaction. This allowed me to decide in the "HTTP handler" when to create a transaction and when I didn't need one.
Keep in mind that the connection is a global object representing a pool of connections handled automatically by go-pg, so the hack I tried:
pgDB := db.DB.(*pg.DB) // convert the interface to an actual connection
err = pgDB.RunInTransaction(func(tx *pg.Tx) error {
    db.DB = tx // replace the connection with a transaction
    db.CreateA(a)
    db.CreateB(b)
    return nil
})
it's clearly a bad idea, because although it works, it works only once because we replace the global connection with a transaction. The following request breaks the server.
Any ideas? I can't find information about this around, probably because I don't know the right keywords being a noob.
I've done something like this in the past (using the standard sql package, you may need to adapt it to your needs):
var ErrNestedTransaction = errors.New("nested transactions are not supported")

// abstraction over sql.Tx and sql.DB
// a similar interface seems to be already defined in go-pg, so you may not need this.
type executor interface {
    Exec(query string, args ...interface{}) (sql.Result, error)
    Query(query string, args ...interface{}) (*sql.Rows, error)
    QueryRow(query string, args ...interface{}) *sql.Row
}

type Store struct {
    // this is the actual connection(pool) to the db which has the Begin() method
    db       *sql.DB
    executor executor
}

func NewStore(dsn string) (*Store, error) {
    db, err := sql.Open("sqlite3", dsn)
    if err != nil {
        return nil, err
    }
    // the initial store contains just the connection(pool)
    return &Store{db, db}, nil
}

func (s *Store) RunInTransaction(f func(store *Store) error) error {
    if _, ok := s.executor.(*sql.Tx); ok {
        // nested transactions are not supported!
        return ErrNestedTransaction
    }
    tx, err := s.db.Begin()
    if err != nil {
        return err
    }
    transactedStore := &Store{
        s.db,
        tx,
    }
    err = f(transactedStore)
    if err != nil {
        tx.Rollback()
        return err
    }
    return tx.Commit()
}

func (s *Store) CreateA(thing A) error {
    // your implementation
    _, err := s.executor.Exec("INSERT INTO ...", ...)
    return err
}
And then you use it like
// store is a global object
store.RunInTransaction(func(store *Store) error {
    // this instance of Store uses a transaction to execute the methods
    err := store.CreateA(a)
    if err != nil {
        return err
    }
    return store.CreateB(b)
})
The trick is to use the executor instead of the *sql.DB in your CreateX methods, which allows you to dynamically change the underlying implementation (tx vs. db). However, since there is very little information out there on how to deal with this issue, I can't assure you that this is the "best" solution. Other suggestions are welcome!
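For completeness, a small usage sketch (not from the original answer) of the non-transactional path: outside RunInTransaction the executor field is the *sql.DB pool itself, so each CreateX call runs on the pool directly. The DSN and the variable a are placeholders.
store, err := NewStore("file:app.db")
if err != nil {
    log.Fatal(err)
}

// Uses the connection pool directly; no explicit transaction involved.
if err := store.CreateA(a); err != nil {
    log.Println("create A failed:", err)
}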

How to make a struct accept one of two types as an argument?

I have a struct DbConnector which I want to use as a proxy to communicate with a database.
This struct has method Init(db *sql.DB).
Depending on conditions, I want to be able to initialise it with another struct, like DummyDatabaseConnection for testing.
How do I define the signature of Init() so that it accepts either *sql.DB or *DummyDatabaseConnection?
Define an interface with the methods you need to call on both *sql.DB and *DummyDatabaseConnection:
type DBInterface interface {
    Ping() error
    Close() error
    // Some other Methods that you need
}
Now your DummyDatabaseConnection should satisfy your DBInterface.
type DummyDatabaseConnection struct {
}

func (d *DummyDatabaseConnection) Ping() error {
    return nil
}

func (d *DummyDatabaseConnection) Close() error {
    return nil
}
Use your interface as the argument:
func (d *DbConnector) Init(db DBInterface) {
    db.Ping()
    db.Close()
}
Call it with whichever one you need.
dbConnector := &DbConnector{}

// Call with *sql.DB (obtained from sql.Open, not constructed directly)
db, err := sql.Open("sqlite3", "app.db") // driver/DSN are placeholders
if err != nil {
    // handle the error
}
dbConnector.Init(db)

// Call with *DummyDatabaseConnection
dummy := &DummyDatabaseConnection{}
dbConnector.Init(dummy)
From your Init(db DBInterface) method, you can only call the methods that are in the DBInterface interface.
Hope this will help
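As a side note (not part of the original answer): as long as DBInterface only lists methods *sql.DB really has (Ping() error and Close() error do exist on it), you can let the compiler verify that both types satisfy the interface:
// Compile-time checks: the build fails if either type stops satisfying DBInterface.
var _ DBInterface = (*sql.DB)(nil)
var _ DBInterface = (*DummyDatabaseConnection)(nil)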

How to pass arguments to router handlers in Golang using Gin web framework?

I'm using Gin, https://gin-gonic.github.io/gin/, to build a simple RESTful JSON API with Golang.
The routes are setup with something like this:
func testRouteHandler(c *gin.Context) {
    // do smth
}

func main() {
    router := gin.Default()
    router.GET("/test", testRouteHandler)
    router.Run(":8080")
}
My question is how can I pass down an argument to the testRouteHandler function? For example a common database connection could be something that one wants to reuse among routes.
Is the best way to have this in a global variable? Or is there some way in Go to pass along an extra variable to the testRouteHandler function? Are there optional arguments for functions in Go?
PS. I'm just getting started in learning Go, so could be something obvious that I'm missing :)
I would avoid stuffing 'application scoped' dependencies (e.g. a DB connection pool) into a request context. Your two 'easiest' options are:
Make it a global. This is OK for smaller projects, and *sql.DB is thread-safe.
Pass it explicitly in a closure so that the return type satisfies gin.HandlerFunc
e.g.
// SomeHandler returns a `func(*gin.Context)` to satisfy Gin's router methods
// db could turn into an 'Env' struct that encapsulates all of your
// app dependencies - e.g. DB, logger, env vars, etc.
func SomeHandler(db *sql.DB) gin.HandlerFunc {
    fn := func(c *gin.Context) {
        // Your handler code goes in here - e.g.
        rows, err := db.Query(...)
        c.String(200, results)
    }
    return gin.HandlerFunc(fn)
}
func main() {
    db, err := sql.Open(...)
    // handle the error

    router := gin.Default()
    router.GET("/test", SomeHandler(db))
    router.Run(":8080")
}
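Building on the comment in SomeHandler about turning db into an 'Env' struct, here is one possible sketch; the Env type, its fields, and the example query are illustrative assumptions, not part of the original answer.
// Env groups the application-scoped dependencies the handlers need.
type Env struct {
    DB     *sql.DB
    Logger *log.Logger
}

// SomeHandler becomes a method, so the returned closure captures all of Env.
func (e *Env) SomeHandler() gin.HandlerFunc {
    return func(c *gin.Context) {
        var count int
        if err := e.DB.QueryRow("SELECT COUNT(*) FROM things").Scan(&count); err != nil {
            e.Logger.Println("query failed:", err)
            c.String(http.StatusInternalServerError, "internal error")
            return
        }
        c.String(http.StatusOK, "count: %d", count)
    }
}

// Wiring in main: router.GET("/test", env.SomeHandler())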
Using the link I posted in the comments, I have created a simple example.
package main

import (
    "log"

    "github.com/gin-gonic/gin"
    "github.com/jinzhu/gorm"
    _ "github.com/mattn/go-sqlite3"
)

// ApiMiddleware will add the db connection to the context
func ApiMiddleware(db gorm.DB) gin.HandlerFunc {
    return func(c *gin.Context) {
        c.Set("databaseConn", db)
        c.Next()
    }
}

func main() {
    r := gin.New()

    // In this example, I'll open the db connection here...
    // In your code you would probably do it somewhere else
    db, err := gorm.Open("sqlite3", "./example.db")
    if err != nil {
        log.Fatal(err)
    }

    r.Use(ApiMiddleware(db))

    r.GET("/api", func(c *gin.Context) {
        // Don't forget the type assertion when getting the connection from context.
        dbConn, ok := c.MustGet("databaseConn").(gorm.DB)
        if !ok {
            // handle error here...
        }

        // do your thing here, querying through dbConn...
        _ = dbConn
    })

    r.Run(":8080")
}
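To flesh out the "do your thing here..." part, one possible handler body under this pattern could be the following; the Product model and the query are made up for illustration and not part of the original answer.
// inside the r.GET("/api", ...) handler, after the type assertion succeeded:
var products []Product          // Product: a hypothetical gorm model
dbConn.Find(&products)          // query through the connection taken from the context
c.JSON(http.StatusOK, products) // assumes net/http is imported for the status constant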
This is just a simple POC, but I believe it's a start.
Hope it helps.
Late to the party, but here is my proposal: encapsulate the methods into an object with private/public vars in it:
package main

import (
    "log"

    "github.com/gin-gonic/gin"
    "github.com/jinzhu/gorm"
    _ "github.com/mattn/go-sqlite3"
)

type HandlerA struct {
    Db gorm.DB
}

func (this *HandlerA) Get(c *gin.Context) {
    log.Printf("[%#v]", this.Db)
    // do your thing here...
}

func main() {
    r := gin.New()

    // Init, should be separate, but it's ok for this sample:
    db, err := gorm.Open("sqlite3", "./example.db")
    if err != nil {
        log.Fatal(err)
    }

    Obj := new(HandlerA)
    Obj.Db = db // Or init inside Object

    Group := r.Group("api/v1/")
    {
        Group.GET("/storage", Obj.Get)
    }
    r.Run(":8080")
}
Handler closures are a good option, but that works best when the argument is used in that handler alone.
If you have route groups, or long handler chains, where the same argument is needed in multiple places, you should set values into the Gin context.
You can use function literals, or named functions that return gin.HandlerFunc to do that in a clean way.
Example injecting configs into a router group:
Middleware package:
func Configs(conf APIV1Config) gin.HandlerFunc {
    return func(c *gin.Context) {
        c.Set("configKey", conf) // key could be an unexported struct to ensure uniqueness
    }
}
Router:
conf := APIV1Config{/* some api configs */}

// makes conf available to all routes in this group
g := r.Group("/api/v1", middleware.Configs(conf))
{
    // ... routes that all need API V1 configs
}
This is also easily unit-testable. Assuming that you test the single handlers, you can set the necessary values into the mock context:
w := httptest.NewRecorder()
c, _ := gin.CreateTestContext(w)
c.Set("configKey", /* mock configs */)
apiV1FooHandler(c)
Now in the case of application-scoped dependencies (db connections, remote clients, ...), I agree that setting these directly into the Gin context is a poor solution.
What you should do then, is to inject providers into the Gin context, using the pattern outlined above:
Middleware package:
// provider could be an interface for easy mocking
func DBProvider(provider database.Provider) gin.HandlerFunc {
return func(c *gin.Context) {
c.Set("providerKey", provider)
}
}
Router:
dbProvider := /* init provider with db connection */
r.Use(DBProvider(dbProvider)) // global middleware
// or
g := r.Group("/users", DBProvider(dbProvider)) // users group only
Handler (you can greatly reduce the boilerplate code by putting these context getters in some helper function):
// helper function
func GetDB(c *gin.Context) *sql.DB {
    provider := c.MustGet("providerKey").(database.Provider)
    return provider.GetConn()
}

func createUserHandler(c *gin.Context) {
    db := GetDB(c) // same in all other handlers
    // ...
}
I like wildneuro's example, but would use a one-liner to set up the handler:
package main

import (
    "log"

    "github.com/gin-gonic/gin"
    "github.com/jinzhu/gorm"
    _ "github.com/mattn/go-sqlite3"
)

type HandlerA struct {
    Db gorm.DB
}

func (this *HandlerA) Get(c *gin.Context) {
    log.Printf("[%#v]", this.Db)
    // do your thing here...
}

func main() {
    r := gin.New()

    // Init, should be separate, but it's ok for this sample:
    db, err := gorm.Open("sqlite3", "./example.db")
    if err != nil {
        log.Fatal(err)
    }

    Group := r.Group("api/v1/")
    {
        Group.GET("/storage", (&HandlerA{Db: db}).Get)
    }
    r.Run(":8080")
}
Let me try to explain in detail so that you won't get confused.
Depending on the incoming route, you want to call a controller function. Let's say your incoming route is /books and your controller is BooksController.
Your BooksController will try to fetch the books from the database and return a response.
Now, you want a handler within your BooksController so that you can access the database.
I would do something like this. Let's assume that you are using DynamoDB and the AWS SDK provides *dynamodb.DynamoDB. Depending on your db, change this type.
Create a struct as below.
type serviceConnection struct {
    db *dynamodb.DynamoDB
    // You can have all services declared here
    // which you want to use in your controller
}
In your main function, get the db connection information. Let's say you already have a function initDatabaseConnection which returns a handle to the db, something like below.
db := initDatabaseConnection() // returns *dynamodb.DynamoDB
Set db to a struct variable.
conn := new(serviceConnection)
conn.db = db
Register the Gin route with a method on that receiver as the handler, as below.
r := gin.Default()
r.GET("/books", conn.BooksController)
As you see, the gin handler is a controller method which has your struct instance as a receiver.
Now, create a controller method with serviceConnection struct receiver.
func (conn *serviceConnection) BooksController(c *gin.Context) {
    books := getBooks(conn.db)
    c.JSON(http.StatusOK, books)
}
As you see here, you have access to all the serviceConnection struct variables and you can use them in your controller.
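For completeness, a hypothetical getBooks helper (not part of the original answer) might look like this, assuming the aws-sdk-go v1 DynamoDB client used above plus a made-up Book model and "books" table:
func getBooks(db *dynamodb.DynamoDB) []Book {
    out, err := db.Scan(&dynamodb.ScanInput{
        TableName: aws.String("books"), // hypothetical table name
    })
    if err != nil {
        log.Println("scan failed:", err)
        return nil
    }

    var books []Book
    if err := dynamodbattribute.UnmarshalListOfMaps(out.Items, &books); err != nil {
        log.Println("unmarshal failed:", err)
        return nil
    }
    return books
}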
Alright, I have given you a simple example. It should work, and you can extend it as per your needs.
func main() {
    router := gin.Default()
    router.GET("/test/:id/:name", testRouteHandler)
    router.Run(":8080")
}

func testRouteHandler(c *gin.Context) {
    id := c.Params.ByName("id")
    name := c.Params.ByName("name")
    c.String(http.StatusOK, "id: %s, name: %s", id, name)
}
Now you will have to call your handler as below
http://localhost:8080/test/1/myname
