What is the best way to use gorm in a multithreaded application?

I have an application that spawns a lot of goroutines, let's say 2000. Each routine needs access to the DB, or at least needs to update/select data from the DB.
My current approach is the following:
A routine gets a *gorm.DB with db.GetConnection(); this is the code of that function:
func GetConnection() *gorm.DB {
    DBConfig := config.GetConfig().DB
    db, err := gorm.Open("mysql", DBConfig.DBUser+":"+DBConfig.DBPassword+"@/"+DBConfig.DBName+"?charset=utf8mb4")
    if err != nil {
        panic(err.Error())
    }
    return db
}
Then the routine calls another function from some storage package, passes the *gorm.DB to it, and closes the connection. It looks like this:
dbConnection := db.GetConnection()
postStorage.UpdateSomething(dbConnection)
db.CloseConnection(dbConnection)
The above is only an example; the main point is that each routine opens a new connection, and I don't like that, because it may overload the DB. As a result I got the following MySQL error:
[mysql] 2020/07/16 19:34:26 packets.go:37: read tcp 127.0.0.1:44170->127.0.0.1:3306: read: connection reset by peer
The question is: what is a good pattern for using the gorm package in a multi-goroutine application?

*gorm.DB is safe for concurrent use by multiple goroutines, so you can share a single *gorm.DB across them. Initialize it once and fetch it wherever you need it. Demo:
package db

var db *gorm.DB

func init() {
    DBConfig := config.GetConfig().DB
    var err error // assign the package-level db below; := would shadow it
    db, err = gorm.Open("mysql", DBConfig.DBUser+":"+DBConfig.DBPassword+"@/"+DBConfig.DBName+"?charset=utf8mb4")
    if err != nil {
        panic(err.Error())
    }
}

func GetConnection() *gorm.DB {
    return db
}
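Since the shared *gorm.DB wraps a database/sql connection pool, you can also cap how many connections those 2000 goroutines may consume, which addresses the "overload the DB" worry directly. A minimal sketch, assuming GORM v1 (github.com/jinzhu/gorm) as in the question; the DSN and pool sizes are placeholders:

package db

import (
    "time"

    "github.com/jinzhu/gorm"
    _ "github.com/jinzhu/gorm/dialects/mysql" // registers the mysql dialect
)

var db *gorm.DB

func init() {
    var err error
    db, err = gorm.Open("mysql", "user:password@/dbname?charset=utf8mb4")
    if err != nil {
        panic(err)
    }
    sqlDB := db.DB()                    // underlying *sql.DB pool
    sqlDB.SetMaxOpenConns(50)           // hard cap shared by all goroutines
    sqlDB.SetMaxIdleConns(10)           // keep a few warm connections around
    sqlDB.SetConnMaxLifetime(time.Hour) // recycle before the server drops idle conns
}

Once the cap is reached, additional queries simply wait for a free connection, which is usually preferable to the "connection reset by peer" errors you get from opening thousands of connections.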

Related

When is the right time to run AutoMigrate with GORM

Most Go/GORM examples I've seen show AutoMigrate being called immediately after opening the database connection, including the GORM documentation here. For API services, this would be an expensive/unwanted call with every API request. So I assume that, for API services, AutoMigrate should be removed from the regular flow and handled separately. Is my understanding correct?
From GORM Documentation
...
db, err := gorm.Open(sqlite.Open("test.db"), &gorm.Config{})
if err != nil {
    panic("failed to connect database")
}

// Migrate the schema
db.AutoMigrate(&Product{})
...
It wouldn't happen on every API request. Not even close. It'd happen every time the application is started, so basically: connect to the DB in main, and run AutoMigrate there. Pass the connection as a dependency to your handlers/service packages/wherever you need them. The HTTP handler can just access it there.
Basically this:
package main

func main() {
    db, err := gorm.Open(sqlite.Open("test.db"), &gorm.Config{})
    if err != nil {
        fmt.Printf("Failed to connect to DB: %v", err)
        os.Exit(1)
    }

    // see below how this is handled
    fRepo := foo.New(db) // all repos here
    fRepo.Migrate()      // this handles migrations

    // create request handlers
    fHandler := handlers.NewFoo(fRepo) // migrations have already been handled

    mux := http.NewServeMux()
    mux.HandleFunc("/foo/list", fHandler.List) // set up handlers

    // start server etc...
}
Have the code that interacts with the DB in some package like this:
package foo

// Connection is the subset of the *gorm.DB API this repo relies on.
type Connection interface {
    Create(value interface{}) *gorm.DB
    Find(dest interface{}, conds ...interface{}) *gorm.DB
    AutoMigrate(dst ...interface{}) error
}

type Foo struct {
    db Connection
}

func New(db Connection) *Foo {
    return &Foo{
        db: db,
    }
}

func (f *Foo) Migrate() {
    f.db.AutoMigrate(&Stuff{}) // all types this repo deals with go here
}

func (f *Foo) GetAll() ([]Stuff, error) {
    ret := []Stuff{}
    res := f.db.Find(&ret)
    return ret, res.Error
}
Then have your handlers structured in a sensible way, and provide them with the repository (aka foo package stuff):
package handlers

type FooRepo interface {
    GetAll() ([]foo.Stuff, error)
}

type FooHandler struct {
    repo FooRepo
}

func NewFoo(repo FooRepo) *FooHandler {
    return &FooHandler{
        repo: repo,
    }
}

func (f *FooHandler) List(res http.ResponseWriter, req *http.Request) {
    all, err := f.repo.GetAll()
    if err != nil {
        res.WriteHeader(http.StatusInternalServerError)
        io.WriteString(res, err.Error())
        return
    }
    // write response as needed
}
Whenever you deploy an updated version of your application, the main function will call AutoMigrate, and the application will handle requests without constantly re-connecting to the DB or attempting to handle migrations time and time again.
I don't know why you'd think your application would have to run through the setup for each request, especially given that your main function (or some function you call from main) explicitly creates an HTTP server and listens on a specific port for requests. The DB connection and subsequent migrations should be handled before you start listening for requests; they are not part of handling requests, ever.
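If you want even tighter control, e.g. letting an operator decide when migrations run, you can gate the AutoMigrate call behind a flag. A minimal sketch; the -migrate flag and the Product model are made up for illustration:

package main

import (
    "flag"
    "log"

    "gorm.io/driver/sqlite"
    "gorm.io/gorm"
)

type Product struct {
    ID   uint
    Name string
}

func main() {
    migrate := flag.Bool("migrate", false, "run AutoMigrate on startup")
    flag.Parse()

    db, err := gorm.Open(sqlite.Open("test.db"), &gorm.Config{})
    if err != nil {
        log.Fatalf("failed to connect to DB: %v", err)
    }

    if *migrate {
        if err := db.AutoMigrate(&Product{}); err != nil {
            log.Fatalf("migration failed: %v", err)
        }
    }
    // set up handlers and start the HTTP server as in the example above...
}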

gorm multiple databases connection management

I have a requirement where my application talks to different databases. How do I manage connections in gorm? Does gorm support connection management for multiple databases, or do I need to create a map which holds all database connections?
if val, ok := selector.issure_db[issuer]; ok {
    return val, nil
} else {
    var dbo *db.DB
    selector.mu.Lock()
    dbo, err := db.NewDb(Config)
    if err != nil {
        boot.Logger(ctx).Fatal(err.Error())
    }
    selector.issure_db[issuer] = dbo
    selector.mu.Unlock()
    return dbo, nil
}
Is there a better way to do this?
You can use the dbresolver plugin for GORM. It manages multiple sources and replicas and maintains an underlying connection pool for the group. You can even map models in your app to the correct database using the config.
Example from the docs:
import (
    "gorm.io/gorm"
    "gorm.io/plugin/dbresolver"
    "gorm.io/driver/mysql"
)

db, err := gorm.Open(mysql.Open("db1_dsn"), &gorm.Config{})

db.Use(dbresolver.Register(dbresolver.Config{
    // use `db2` as sources, `db3`, `db4` as replicas
    Sources:  []gorm.Dialector{mysql.Open("db2_dsn")},
    Replicas: []gorm.Dialector{mysql.Open("db3_dsn"), mysql.Open("db4_dsn")},
    // sources/replicas load balancing policy
    Policy: dbresolver.RandomPolicy{},
}).Register(dbresolver.Config{
    // use `db1` as sources (DB's default connection), `db5` as replicas for `User`, `Address`
    Replicas: []gorm.Dialector{mysql.Open("db5_dsn")},
}, &User{}, &Address{}).Register(dbresolver.Config{
    // use `db6`, `db7` as sources, `db8` as replicas for `orders`, `Product`
    Sources:  []gorm.Dialector{mysql.Open("db6_dsn"), mysql.Open("db7_dsn")},
    Replicas: []gorm.Dialector{mysql.Open("db8_dsn")},
}, "orders", &Product{}, "secondary"))
Alternatively, you can create a package called database and write an init function in an init.go file that creates a DB object for each database you have. You can then use these DB objects everywhere in the application, which gives you connection pooling as well.
init.go
var db *gorm.DB

func init() {
    var err error
    dataSourceName := fmt.Sprintf("%s:%s@tcp(%s:%d)/%s?charset=utf8&parseTime=True&loc=Local", dbUser, dbPassword, dbHost, dbPort, dbName)
    db, err = gorm.Open("mysql", dataSourceName)
    if err != nil {
        panic(err)
    }
    db.DB().SetConnMaxLifetime(10 * time.Second)
    db.DB().SetMaxIdleConns(10)
    // initialise other db objects here
}
users.go
func getFirstUser() (user User) {
    db.First(&user)
    return
}
PS: this solution is efficient if you have to connect to only one or two databases. If you need to connect to many databases at the same time, use the dbresolver plugin instead.
Old Answer
You can write a separate function which opens and returns a new database connection object every time you call it.
func getDBConnection(dbUser, dbPassword, dbHost, dbName string, dbPort int) (db *gorm.DB, err error) {
    dataSourceName := fmt.Sprintf("%s:%s@tcp(%s:%d)/%s?charset=utf8&parseTime=True&loc=Local", dbUser, dbPassword, dbHost, dbPort, dbName)
    db, err = gorm.Open("mysql", dataSourceName)
    if err != nil {
        return
    }
    db.DB().SetConnMaxLifetime(10 * time.Second)
    return
}
And call defer db.Close() every time after you call the getDBConnection function.
func getFirstUser() (user User) {
    db, _ := getDBConnection(dbUser, dbPassword, dbHost, dbName, dbPort)
    defer db.Close()
    db.First(&user)
    return
}
This way your connections will be closed every time after you have executed the query.

Best approach for Sql.Open Global connection

I'm trying to create a global DB connection with sql.Open, shared between multiple .go packages. Doing this will let me key in my DB connection details only once, instead of repeating them across separate files.
I've created this file so far:
import (
    "database/sql"
)

var DB *sql.DB

func ConnectDB() {
    db, _ := sql.Open("mysql", "user:password@(localhost:port)/db")
    DB = db
}
I'm assigning my connection to the global variable DB, so I know that I could use DB.Query or otherpkg.DB. I can avoid the DB = db step by not using :=, since := declares a new variable in the function's scope:
var db *sql.DB

func ConnectDB() {
    var err error
    db, err = sql.Open(.....)
    .....
}
My other files look like the following:
func TestNew() {
    fmt.Println("Test for MySQL")
    db, _ := sql.Open(
        "mysql", "user:password@(localhost:port)/db")
    Delete, err := db.Query("DELETE FROM table1 WHERE id = ?;", id)
    if err != nil {
        panic(err.Error())
    }
    defer Delete.Close()
}
I don't want to go into each separate file inside of my packages and change the DB credentials 10x. I'd like to use the func ConnectDB and have that be the file which controls what DB/credentials are used for the other packages.
A common way of doing this is initializing the connection in main, and passing it down to all the functions that need it.
Another way of doing it is to use a package other than main, and set a package level variable there:
package db

var DB *sql.DB

func InitDB(dbURL string) {
    var err error
    DB, err = sql.Open("mysql", dbURL)
    if err != nil {
        panic(err)
    }
}
You call InitDB from main with the DB URL. When you need the DB connection, you import the db package, and use the connection as db.DB.
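Usage from main then looks like this; a sketch, assuming the hypothetical import path myapp/db for the package above and the go-sql-driver/mysql driver:

package main

import (
    "log"

    _ "github.com/go-sql-driver/mysql" // registers the "mysql" driver used by InitDB

    "myapp/db" // hypothetical import path for the db package above
)

func main() {
    db.InitDB("user:password@tcp(localhost:3306)/mydb")

    var n int
    if err := db.DB.QueryRow("SELECT COUNT(*) FROM table1").Scan(&n); err != nil {
        log.Fatal(err)
    }
    log.Printf("table1 has %d rows", n)
}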

Golang how can I do a Dependency Injection to store some string values

I am using a MySQL database and have many different functions/methods that interact with it. For every function, of course, I have to supply the database credentials, such as:
ReadAll.go
func ReadAll() {
    db, err := sql.Open("mysql",
        "user:password@tcp(127.0.0.1:3306)/hello")
    if err != nil {
        log.Fatal(err)
    }
    defer db.Close()
}
The part of "mysql",
"user:password#tcp(127.0.0.1:3306)/hello" never changes and I am supplying that to every Function that interacts with DB. I was wondering how can I for instance create a new File say DataBase.go put those credentials into some global variable and then reference when I need those strings ? That way if I have to change the credentials I only have to change them in 1 place.
I want to do something like
Database.go
const GlobalDB := "mysql", "user:password@tcp(127.0.0.1:3306)/hello"
then
ReadAll.go
func ReadAll() {
    db, err := sql.Open(GlobalDB)
    if err != nil {
        log.Fatal(err)
    }
    defer db.Close()
}
I am brand new to Golang but trying to figure this out.
I would probably do this by opening a session to the database once, then passing it around to any function or method that needs it. This has a few potential problems:
You may need to lock access to it so you don't co-mingle multiple queries on the same session (though your DB library may ensure this; FWIW, "database/sql" is concurrency-safe and recommends NOT opening short-lived database connections).
You can't safely close the session, as it may still be in use elsewhere.
Another way would be to have a function that returns a DB session, so instead of doing:
db, err := sql.Open("mysql", "user:password#tcp(127.0.0.1:3306)/hello")
You do the following:
func dbSession() (*sql.DB, error) {
    return sql.Open("mysql", "credentials")
}

func ReadAll() {
    db, err := dbSession()
    if err != nil {
        log.Fatal(err)
    }
    defer db.Close()
}
And if you want even more flexibility, you can have a struct that contains the data you need, then build your DB connection parameters from that.
type dbData struct {
    DBType, DBName, User, Host, Password string
}

var DBData dbData

func dbSession() (*sql.DB, error) {
    return sql.Open(DBData.DBType, fmt.Sprintf("%s:%s@tcp(%s)/%s", DBData.User, DBData.Password, DBData.Host, DBData.DBName))
}
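Filling the struct once at startup might look like this; the values are placeholders:

func main() {
    DBData = dbData{
        DBType:   "mysql",
        DBName:   "hello",
        User:     "user",
        Host:     "127.0.0.1:3306",
        Password: "password",
    }
    db, err := dbSession()
    if err != nil {
        log.Fatal(err)
    }
    defer db.Close()
    // run queries with db as usual
}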
Also note the following in the documentation from sql.Open:
The returned DB is safe for concurrent use by multiple goroutines and maintains its own pool of idle connections. Thus, the Open function should be called just once. It is rarely necessary to close a DB.
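One way to honor "the Open function should be called just once" is to guard the call with sync.Once. A minimal sketch; the DSN is a placeholder:

package db

import (
    "database/sql"
    "sync"

    _ "github.com/go-sql-driver/mysql"
)

var (
    once sync.Once
    conn *sql.DB
)

// Get opens the shared pool on the first call and returns the same *sql.DB afterwards.
func Get() *sql.DB {
    once.Do(func() {
        var err error
        conn, err = sql.Open("mysql", "user:password@tcp(127.0.0.1:3306)/hello")
        if err != nil {
            panic(err)
        }
    })
    return conn
}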
You can easily create a new file with your credentials. Just put the file in the main package:
package main
var myDBConnectionString = "mysql://...."
This will be included when you compile your source.
The problem is that you have to recompile your code every time you have to connect to another database. Think about a development system vs. a production system; the database credentials should differ in those systems, right? :)
To fix this, it is quite common to have a config file, so you can change the credentials without recompiling your code.
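A minimal sketch of that config-file approach, assuming a hypothetical config.json with driver and dsn fields:

package main

import (
    "encoding/json"
    "os"
)

// dbConfig mirrors a config.json like:
// {"driver": "mysql", "dsn": "user:password@tcp(127.0.0.1:3306)/hello"}
type dbConfig struct {
    Driver string `json:"driver"`
    DSN    string `json:"dsn"`
}

func loadDBConfig(path string) (dbConfig, error) {
    var cfg dbConfig
    f, err := os.Open(path)
    if err != nil {
        return cfg, err
    }
    defer f.Close()
    err = json.NewDecoder(f).Decode(&cfg)
    return cfg, err
}

Development and production then ship different config files while the binary stays identical.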
I've got another idea: just connect to the DB once, and access this resource globally.
package main

import (
    "fmt"
)

var myDb = "example"

func main() {
    fmt.Println("Hello, playground")
    doSomthingWithDatabase()
}

func doSomthingWithDatabase() {
    fmt.Println("We can access a global variable here, see", myDb)
}
https://play.golang.org/p/npZ6Z49ink
For the configuration handling you can look here
https://blog.gopheracademy.com/advent-2014/reading-config-files-the-go-way/
hiboot-data provides an out-of-the-box starter that meets your requirement: the starter is github.com/hidevopsio/hiboot-data/starter/gorm. Or you can implement your own starter using the hiboot framework; then you can inject it anywhere to decouple your code from the creation of the database configuration.
package service

import (
    "errors"

    "hidevops.io/hiboot-data/examples/gorm/entity"
    "hidevops.io/hiboot-data/starter/gorm"
    "hidevops.io/hiboot/pkg/app"
    "hidevops.io/hiboot/pkg/utils/idgen"
)

type UserService interface {
    AddUser(user *entity.User) (err error)
    GetUser(id uint64) (user *entity.User, err error)
    GetAll() (user *[]entity.User, err error)
    DeleteUser(id uint64) (err error)
}

type UserServiceImpl struct {
    // embed UserService, so the instance of UserServiceImpl can be found by UserService
    UserService
    repository gorm.Repository
}

func init() {
    // register UserServiceImpl
    app.Component(newUserService)
}

// newUserService will be injected with the gorm.Repository configured in hidevops.io/hiboot-data/starter/gorm
func newUserService(repository gorm.Repository) UserService {
    repository.AutoMigrate(&entity.User{})
    return &UserServiceImpl{
        repository: repository,
    }
}

func (s *UserServiceImpl) AddUser(user *entity.User) (err error) {
    if user == nil {
        return errors.New("user must not be nil")
    }
    if user.Id == 0 {
        user.Id, _ = idgen.Next()
    }
    err = s.repository.Create(user).Error()
    return
}

func (s *UserServiceImpl) GetUser(id uint64) (user *entity.User, err error) {
    user = &entity.User{}
    err = s.repository.Where("id = ?", id).First(user).Error()
    return
}

func (s *UserServiceImpl) GetAll() (users *[]entity.User, err error) {
    users = &[]entity.User{}
    err = s.repository.Find(users).Error()
    return
}

func (s *UserServiceImpl) DeleteUser(id uint64) (err error) {
    err = s.repository.Where("id = ?", id).Delete(entity.User{}).Error()
    return
}

Sharing of conn pointer between interfaces in golang

What I'm trying to accomplish is sharing a *sqlx.DB pointer between multiple functions. Posts I've found say to pass the pointer along, which is fine, but how do I do that with an interface? I cannot find anything that illustrates this anywhere. Basically, I have an interface of type Datastore, plus MySQL & PgSQL implementations of it. The interface by itself works fine; the issue is that I'm trying to create a single connect function so the *sqlx.DB is shared across all functions within the implemented interface. I think I've confused myself on how (or even where) to share the pointer between functions of the interface. The main interface looks like this:
var (
    storage Datastore
    db      *sqlx.DB
)

type Datastore interface {
    Insert(db *sqlx.DB, table string, item DataItem) bool
    CheckEmpty(db *sqlx.DB, table string) bool
    FetchAll(db *sqlx.DB, table string) []DataItem
    DBInit(db *sqlx.DB)
    initDB()
}
Within my implementation (a simplified MySQL example), I have the initDB function, which looks like this:
type MySQLDB struct {
    config *config.Configuration
}

func (my *MySQLDB) initDB() {
    log.Println("Getting DB Connection")
    tempdb, err := sqlx.Connect("mysql", my.config.Database.Dsn+"&parseTime=True")
    if err != nil {
        log.Println(err.Error())
    }
    db = tempdb
    defer db.Close()
}
func (my *MySQLDB) FetchAll(db *sqlx.DB, table string) []DataItem {
    dTable := []DataItem{}
    query := "SELECT foo, bar FROM " + table + " ORDER BY last_update ASC"
    err := db.Select(&dTable, query)
    if err != nil {
        panic(err)
    }
    return dTable
}
At this point I know the connection is initially opened, but the next time a function is called I get a "database is closed" error. So how do I properly share the DB connection between functions, or do I really have to open a connection in every function?
Don't call defer db.Close() in your initDB function. The deferred call runs as soon as initDB returns, closing db, so every method you call afterwards sees a closed connection.
Maybe you need to re-design your interface, for example:
type Datastore interface {
    Insert(table string, item DataItem) bool
    CheckEmpty(table string) bool
    FetchAll(table string) []DataItem
    Close() error // call this method when you want to close the connection
    initDB()
}
Your MySQLDB implementation now looks like:
type MySQLDB struct {
    config *config.Configuration
    db     *sqlx.DB
}

func (my *MySQLDB) initDB() {
    log.Println("Getting DB Connection")
    tempdb, err := sqlx.Connect("mysql", my.config.Database.Dsn+"&parseTime=True")
    if err != nil {
        log.Println(err.Error())
    }
    my.db = tempdb
}

func (my *MySQLDB) Close() error {
    return my.db.Close()
}

func (my *MySQLDB) FetchAll(table string) []DataItem {
    dTable := []DataItem{}
    query := "SELECT foo, bar FROM " + table + " ORDER BY last_update ASC"
    err := my.db.Select(&dTable, query)
    if err != nil {
        panic(err)
    }
    return dTable
}
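Wiring it up, the caller now owns the connection's lifetime. A sketch, assuming it lives in the same package as MySQLDB and that cfg comes from your existing config loading:

func run(cfg *config.Configuration) {
    store := &MySQLDB{config: cfg}
    store.initDB()
    defer store.Close() // the caller, not initDB, decides when the pool closes

    items := store.FetchAll("data_items") // hypothetical table name
    log.Printf("loaded %d items", len(items))
}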
