I'm trying to share a single global sql.DB connection (from sql.Open) across multiple .go packages.
Doing this would let me enter my DB connection details only once instead of repeating them across separate files.
I've created this file so far:
import (
    "database/sql"

    _ "github.com/go-sql-driver/mysql" // MySQL driver
)

var DB *sql.DB

func ConnectDB() {
    db, _ := sql.Open("mysql", "user:password@(localhost:port)/db")
    DB = db
}
I'm assigning my connection to the global variable DB. I know that I could then use DB.Query, or otherpkg.DB from another package.
I can avoid the DB = db assignment by not using :=, since := declares a new variable scoped to the function:
var db *sql.DB

func ConnectDB() {
    var err error
    db, err = sql.Open(.....)
    .....
}
My other files look like the following:
func TestNew() {
    fmt.Println("Test for MySQL")
    db, _ := sql.Open(
        "mysql", "user:password@(localhost:port)/db")
    Delete, err := db.Query("DELETE FROM table1 WHERE id = ?;", id)
    if err != nil {
        panic(err.Error())
    }
    defer Delete.Close()
}
I don't want to go into each separate file inside my packages and change the DB credentials ten times. I'd like ConnectDB to be the single place that controls which DB and credentials the other packages use.
A common way of doing this is initializing the connection in main, and passing it down to all the functions that need it.
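A minimal sketch of that approach (the userStore type and the query are illustrative, not from the question):

package main

import (
    "database/sql"
    "log"

    _ "github.com/go-sql-driver/mysql"
)

// userStore is a hypothetical type that carries the shared *sql.DB.
type userStore struct {
    db *sql.DB
}

func (s *userStore) deleteByID(id int) error {
    _, err := s.db.Exec("DELETE FROM table1 WHERE id = ?", id)
    return err
}

func main() {
    db, err := sql.Open("mysql", "user:password@tcp(localhost:3306)/db")
    if err != nil {
        log.Fatal(err)
    }
    defer db.Close()

    store := &userStore{db: db} // the one pool gets passed down, not re-opened
    if err := store.deleteByID(1); err != nil {
        log.Fatal(err)
    }
}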
Another way of doing it is to use a package other than main, and set a package level variable there:
package db

import "database/sql"

var DB *sql.DB

func InitDB(dbURL string) {
    var err error
    DB, err = sql.Open("mysql", dbURL)
    if err != nil {
        panic(err)
    }
}
You call InitDB from main with the DB URL. When you need the DB connection, you import the db package, and use the connection as db.DB.
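For example (a sketch; the import path, DSN, and query are placeholders):

package main

import (
    "fmt"

    "yourmodule/db" // hypothetical import path for the db package above
)

func main() {
    db.InitDB("user:password@tcp(localhost:3306)/mydb")

    var name string
    if err := db.DB.QueryRow("SELECT name FROM table1 WHERE id = ?", 1).Scan(&name); err != nil {
        panic(err)
    }
    fmt.Println(name)
}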
I have a requirement where my application talks to different databases. How do I manage connections in GORM? Does GORM support connection management for multiple databases, or do I need to create a map which holds all the database connections?
if val, ok := selector.issure_db[issuer]; ok {
    return val, nil
} else {
    selector.mu.Lock()
    dbo, err := db.NewDb(Config)
    if err != nil {
        boot.Logger(ctx).Fatal(err.Error())
    }
    selector.issure_db[issuer] = dbo
    selector.mu.Unlock()
    return dbo, nil
}
Is there a better way to do this?
You can use the dbresolver plugin for GORM. It manages multiple sources and replicas and maintains an underlying connection pool for the group. You can even map models in your app to the correct database using the config.
Example from the docs:
import (
    "gorm.io/driver/mysql"
    "gorm.io/gorm"
    "gorm.io/plugin/dbresolver"
)

db, err := gorm.Open(mysql.Open("db1_dsn"), &gorm.Config{})

db.Use(dbresolver.Register(dbresolver.Config{
    // use `db2` as sources, `db3`, `db4` as replicas
    Sources:  []gorm.Dialector{mysql.Open("db2_dsn")},
    Replicas: []gorm.Dialector{mysql.Open("db3_dsn"), mysql.Open("db4_dsn")},
    // sources/replicas load balancing policy
    Policy: dbresolver.RandomPolicy{},
}).Register(dbresolver.Config{
    // use `db1` as sources (DB's default connection), `db5` as replicas for `User`, `Address`
    Replicas: []gorm.Dialector{mysql.Open("db5_dsn")},
}, &User{}, &Address{}).Register(dbresolver.Config{
    // use `db6`, `db7` as sources, `db8` as replicas for `orders`, `Product`
    Sources:  []gorm.Dialector{mysql.Open("db6_dsn"), mysql.Open("db7_dsn")},
    Replicas: []gorm.Dialector{mysql.Open("db8_dsn")},
}, "orders", &Product{}, "secondary"))
You can create a package called database with an init function in an init.go file that creates a DB object for each database you need to connect to. You can then use those DB objects everywhere in the application, which also gives you connection pooling.
init.go
var db *gorm.DB

func init() {
    var err error
    dataSourceName := fmt.Sprintf("%s:%s@tcp(%s:%d)/%s?charset=utf8&parseTime=True&loc=Local", dbUser, dbPassword, dbHost, dbPort, dbName)
    db, err = gorm.Open("mysql", dataSourceName)
    if err != nil {
        panic(err)
    }
    db.DB().SetConnMaxLifetime(10 * time.Second)
    db.DB().SetMaxIdleConns(10)
    // initialise other db objects here
}
users.go
func getFirstUser() (user User) {
    db.First(&user)
    return
}
PS: This solution is efficient if you only have to connect to one or two databases. If you need to connect to many databases at the same time, you should use the dbresolver plugin instead.
Old Answer
You can write a separate function which returns a new database connection object every time you call it.
func getDBConnection(dbUser, dbPassword, dbHost, dbName string, dbPort int) (db *gorm.DB, err error) {
    dataSourceName := fmt.Sprintf("%s:%s@tcp(%s:%d)/%s?charset=utf8&parseTime=True&loc=Local", dbUser, dbPassword, dbHost, dbPort, dbName)
    db, err = gorm.Open("mysql", dataSourceName)
    if err != nil {
        return nil, err
    }
    db.DB().SetConnMaxLifetime(10 * time.Second)
    return
}
And call defer db.Close() every time after you call getDBConnection.
func getFirstUser() (user User) {
    db, _ := getDBConnection(dbUser, dbPassword, dbHost, dbName, dbPort)
    defer db.Close()
    db.First(&user)
    return
}
This way your connections will be closed every time after you have executed the query.
I have an application that opens a lot of goroutines, let's say 2000. Each goroutine needs access to the DB, or at least needs to update/select data from the DB.
My current approach is the following:
Each goroutine gets a *gorm.DB with db.GetConnection(); this is the code of that function:
func GetConnection() *gorm.DB {
    DBConfig := config.GetConfig().DB
    db, err := gorm.Open("mysql", DBConfig.DBUser+":"+DBConfig.DBPassword+"@/"+DBConfig.DBName+"?charset=utf8mb4")
    if err != nil {
        panic(err.Error())
    }
    return db
}
Then the goroutine calls another function from some storage package, passes the *gorm.DB to it, and closes the connection afterwards. It looks like this:
dbConnection := db.GetConnection()
postStorage.UpdateSomething(dbConnection)
db.CloseConnection(dbConnection)
The above is only an example; the main idea is that each goroutine opens a new connection, and I don't like that because it may overload the DB. As a result I get the following MySQL error:
[mysql] 2020/07/16 19:34:26 packets.go:37: read tcp 127.0.0.1:44170->127.0.0.1:3306: read: connection reset by peer
The question is: what is a good pattern for using the gorm package in a heavily concurrent application?
*gorm.DB is safe for concurrent use, so you can share one *gorm.DB across many goroutines. Initialise it once and fetch it whenever you need it. Demo:
package db

var db *gorm.DB

func init() {
    DBConfig := config.GetConfig().DB
    var err error
    // note: use =, not :=, so the package-level db is assigned rather than shadowed
    db, err = gorm.Open("mysql", DBConfig.DBUser+":"+DBConfig.DBPassword+"@/"+DBConfig.DBName+"?charset=utf8mb4")
    if err != nil {
        panic(err.Error())
    }
}

func GetConnection() *gorm.DB {
    return db
}
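A usage sketch for the original scenario (the import paths and the pool limit of 100 are illustrative assumptions; postStorage is the storage package from the question): all 2000 goroutines share the one pool, and capping it keeps MySQL from being overloaded.

package main

import (
    "sync"

    "yourmodule/db"          // hypothetical import path for the db package above
    "yourmodule/postStorage" // hypothetical path for the storage package from the question
)

func main() {
    conn := db.GetConnection()
    // assumption: cap the shared pool (gorm v1 API over database/sql)
    conn.DB().SetMaxOpenConns(100)
    conn.DB().SetMaxIdleConns(10)

    var wg sync.WaitGroup
    for i := 0; i < 2000; i++ {
        wg.Add(1)
        go func() {
            defer wg.Done()
            // every goroutine reuses the same *gorm.DB; no per-goroutine Open/Close
            postStorage.UpdateSomething(db.GetConnection())
        }()
    }
    wg.Wait()
}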
I'm attempting to query a database given a model, and I'm getting a blank model as a response. I'm using Revel framework and GORM to create the model and query the database.
app.go
package controllers

import (
    "github.com/revel/revel"
    "route/to/models"
)

type App struct {
    *revel.Controller
    GormController
}

func (c App) Index() revel.Result {
    accounts := models.Account{}
    account := c.DB.Find(accounts)
    return c.RenderJSON(account)
}
gorm.go
package controllers

import (
    "github.com/jinzhu/gorm"
    _ "github.com/jinzhu/gorm/dialects/postgres"
    r "github.com/revel/revel"
)

type GormController struct {
    *r.Controller
    DB *gorm.DB
}

// it can be used for jobs
var Gdb *gorm.DB

// init db
func InitDB() {
    var err error
    // open db
    Gdb, err = gorm.Open("postgres", "host=hostname port=5432 user=postgres password=password dbname=some_db_name sslmode=disable")
    if err != nil {
        r.ERROR.Println("FATAL", err)
        panic(err)
    }
    Gdb.LogMode(true) // Print SQL statements
}

func (c *GormController) SetDB() r.Result {
    c.DB = Gdb
    return nil
}
Not sure where I'm going wrong; ultimately I want to do a SELECT * FROM accounts;
Note: I can run a migration and create records in the Index() function, and the data is handled correctly.
When populating a struct from the database, you need to pass a pointer for GORM to be able to populate it. Otherwise it's pass-by-value and GORM populates a copy and you never see it.
And if you want to get all accounts you should pass a pointer to a slice of accounts. Regarding pointers versus values, a slice is different from a struct in that the underlying array can still be modified even in pass-by-value, but outside the function the slice length (and capacity) won't change even as the array is populated within the function, so it still won't behave as expected without a pointer.
And you should check Error afterward:
accounts := make([]models.Account, 0)
if err := c.DB.Find(&accounts).Error; err != nil {
    if gorm.IsRecordNotFoundError(err) {
        // handle not found
    } else {
        // print/log/return error
    }
    return
}
// do something with accounts
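Applied to the Index action from the question, that looks roughly like this (a sketch; rendering the error is just one reasonable choice):

func (c App) Index() revel.Result {
    accounts := make([]models.Account, 0)
    // pass a pointer so GORM can populate the slice
    if err := c.DB.Find(&accounts).Error; err != nil {
        return c.RenderError(err)
    }
    return c.RenderJSON(accounts)
}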
I am using a MySQL database and have many different functions/methods that interact with it. For every function I of course have to supply the database credentials, such as:
ReadAll.go
func ReadAll() {
    db, err := sql.Open("mysql",
        "user:password@tcp(127.0.0.1:3306)/hello")
    if err != nil {
        log.Fatal(err)
    }
    defer db.Close()
}
The part "mysql", "user:password@tcp(127.0.0.1:3306)/hello" never changes, and I am supplying it to every function that interacts with the DB. I was wondering how I could, for instance, create a new file, say Database.go, put those credentials into some global variable, and then reference it wherever I need those strings. That way, if I have to change the credentials, I only have to change them in one place.
I want to do something like
Database.go
const GlobalDB := "mysql","user:password@tcp(127.0.0.1:3306)/hello"
then
ReadAll.go
func ReadAll() {
    db, err := sql.Open(GlobalDB)
    if err != nil {
        log.Fatal(err)
    }
    defer db.Close()
}
I am brand new to Golang but trying to figure this out.
I would probably do this by opening a session to the database once, then pass this session around to any function or method that may need it. This has a few potential problems:
First, you may need to lock access to it so you don't co-mingle multiple queries on the same session (though your DB library may ensure this; FWIW, "database/sql" is concurrency-safe and recommends NOT opening short-lived database connections). Second, you can't safely close the session, as it may still be in use elsewhere.
Another way would be to have a function that returns a DB session, so instead of doing:
db, err := sql.Open("mysql", "user:password@tcp(127.0.0.1:3306)/hello")
You do the following:
func dbSession() (*sql.DB, error) {
    return sql.Open("mysql", "credentials")
}

func ReadAll() {
    db, err := dbSession()
    if err != nil {
        log.Fatal(err)
    }
    defer db.Close()
}
And if you want even more flexibility, you can have a struct that contains the data you need, then build your DB connection parameters from that.
type dbData struct {
    DBType, DBName, User, Host, Password string
}

var DBData dbData

func dbSession() (*sql.DB, error) {
    return sql.Open(DBData.DBType, fmt.Sprintf("%s:%s@tcp(%s)/%s", DBData.User, DBData.Password, DBData.Host, DBData.DBName))
}
Also note the following in the documentation from sql.Open:
The returned DB is safe for concurrent use by multiple goroutines and maintains its own pool of idle connections. Thus, the Open function should be called just once. It is rarely necessary to close a DB.
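Putting those two points together, a minimal sketch (all values are placeholders) opens the pool once at startup and shares it, instead of calling sql.Open in every function:

var db *sql.DB // shared pool, opened once

func main() {
    DBData = dbData{DBType: "mysql", DBName: "hello", User: "user", Host: "127.0.0.1:3306", Password: "password"}

    var err error
    db, err = dbSession()
    if err != nil {
        log.Fatal(err)
    }
    defer db.Close()

    readAll()
}

func readAll() {
    // reuse the shared pool; hypothetical table name
    rows, err := db.Query("SELECT id FROM hello_table")
    if err != nil {
        log.Fatal(err)
    }
    defer rows.Close()
}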
You can easily create a new file with your credentials. Just put the file in package main.
package main
var myDBConnectionString = "mysql://...."
This will be included when you compile your source.
The problem is that you have to recompile your code every time you want to connect to a different database. Think about a development system vs. a production system: the database credentials should differ between those systems, right? :)
To fix this, it is quite common to use a config file, so you can change the credentials without recompiling your code.
I've got another idea: just connect to the DB once and access this resource globally.
package main

import (
    "fmt"
)

var myDb = "example"

func main() {
    fmt.Println("Hello, playground")
    doSomethingWithDatabase()
}

func doSomethingWithDatabase() {
    fmt.Println("We can access a global variable here, see", myDb)
}
https://play.golang.org/p/npZ6Z49ink
For the configuration handling you can look here
https://blog.gopheracademy.com/advent-2014/reading-config-files-the-go-way/
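As a sketch of that config-file approach (the file name, fields, and path are illustrative assumptions), you could load the DSN from a small JSON file at startup:

package main

import (
    "encoding/json"
    "log"
    "os"
)

// dbConfig holds the credentials we no longer want to hard-code.
type dbConfig struct {
    Driver string `json:"driver"`
    DSN    string `json:"dsn"`
}

func loadConfig(path string) (dbConfig, error) {
    var cfg dbConfig
    f, err := os.Open(path)
    if err != nil {
        return cfg, err
    }
    defer f.Close()
    err = json.NewDecoder(f).Decode(&cfg)
    return cfg, err
}

func main() {
    cfg, err := loadConfig("config.json") // e.g. {"driver":"mysql","dsn":"user:password@tcp(127.0.0.1:3306)/hello"}
    if err != nil {
        log.Fatal(err)
    }
    log.Println("connecting with driver", cfg.Driver)
    // db, err := sql.Open(cfg.Driver, cfg.DSN) ...
}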
hiboot-data provides an out-of-the-box starter that meets your requirement; the starter is github.com/hidevopsio/hiboot-data/starter/gorm, or you can implement your own starter using the hiboot framework. You can then inject it anywhere, decoupling your code from the creation of the database configuration.
package service

import (
    "errors"

    "hidevops.io/hiboot-data/examples/gorm/entity"
    "hidevops.io/hiboot-data/starter/gorm"
    "hidevops.io/hiboot/pkg/app"
    "hidevops.io/hiboot/pkg/utils/idgen"
)

type UserService interface {
    AddUser(user *entity.User) (err error)
    GetUser(id uint64) (user *entity.User, err error)
    GetAll() (user *[]entity.User, err error)
    DeleteUser(id uint64) (err error)
}

type UserServiceImpl struct {
    // embed UserService, so the instance of UserServiceImpl can be found by UserService
    UserService
    repository gorm.Repository
}

func init() {
    // register UserServiceImpl
    app.Component(newUserService)
}

// newUserService will inject the gorm.Repository configured in hidevops.io/hiboot-data/starter/gorm
func newUserService(repository gorm.Repository) UserService {
    repository.AutoMigrate(&entity.User{})
    return &UserServiceImpl{
        repository: repository,
    }
}

func (s *UserServiceImpl) AddUser(user *entity.User) (err error) {
    if user == nil {
        return errors.New("user is not allowed nil")
    }
    if user.Id == 0 {
        user.Id, _ = idgen.Next()
    }
    err = s.repository.Create(user).Error()
    return
}

func (s *UserServiceImpl) GetUser(id uint64) (user *entity.User, err error) {
    user = &entity.User{}
    err = s.repository.Where("id = ?", id).First(user).Error()
    return
}

func (s *UserServiceImpl) GetAll() (users *[]entity.User, err error) {
    users = &[]entity.User{}
    err = s.repository.Find(users).Error()
    return
}

func (s *UserServiceImpl) DeleteUser(id uint64) (err error) {
    err = s.repository.Where("id = ?", id).Delete(entity.User{}).Error()
    return
}
I have a database package that contains the following code.
package database

import (
    "log"

    "github.com/jinzhu/gorm"
    // required by gorm
    _ "github.com/mattn/go-sqlite3"
)

type Podcast struct {
    ID       int    `sql:"index"`
    Title    string
    RssURL   string `sql:"unique_index"`
    Paused   bool
    Episodes []Episode
}

type Episode struct {
    ID           int    `sql:"index"`
    PodcastID    int
    Title        string
    EnclosureURL string `sql:"unique_index"`
    Downloaded   bool
    GUID         string `sql:"unique_index"`
    PubDate      string
}

func DBSession() (db gorm.DB) {
    sqliteSession, err := gorm.Open("sqlite3", "cache.db")
    if err != nil {
        log.Fatal(err)
    }
    return sqliteSession
}
Followed by a bunch of methods that all start with the following code.
func FindSomethingByID(id int) {
    db := DBSession()
    db.LogMode(false)
    // code
}

func FindSomethingElse() {
    db := DBSession()
    db.LogMode(false)
    // code
}
Calling DBSession and setting LogMode in each func seems bad. I just don't know how to do it better. Could someone help?
Calling gorm.Open inside every function isn't very efficient: Open opens a new connection pool, and should be called just once (see the database/sql docs, which gorm wraps).
A simple improvement is to establish a global gorm.DB, initialise it in init(), and use it from all of your functions, e.g.:
package database

var db gorm.DB

func init() {
    var err error
    // Note we use = and not := as our variable
    // is already declared
    db, err = gorm.Open("sqlite3", "cache.db")
    if err != nil {
        log.Fatal(err)
    }
    // Turn off logging globally
    db.LogMode(false)
}

func FindSomethingByID(id int) {
    // query using the shared db, e.g. db.First(&podcast, id)
    // code
}
This is a quick win and reduces the repetition.
In a larger application it typically makes sense to pass dependencies (like DB pools, config params, etc.) more explicitly by wrapping them in types and creating custom handlers.
You also might initialise the connection in your package main and pass the *gorm.DB to your database package via a func New(db *gorm.DB) function that sets a private, package-level variable.
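A sketch of that explicit-dependency variant (the New function and the wiring comments are illustrative, and it assumes a gorm version whose Open returns *gorm.DB):

package database

import "github.com/jinzhu/gorm"

var db *gorm.DB

// New receives the connection opened in package main and stores it
// for this package's query helpers to use.
func New(conn *gorm.DB) {
    db = conn
}

func FindPodcastByID(id int) (p Podcast, err error) {
    err = db.First(&p, id).Error
    return
}

// in package main:
//   conn, err := gorm.Open("sqlite3", "cache.db")
//   if err != nil { log.Fatal(err) }
//   database.New(conn)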
The most obvious simplification would be to move the db.LogMode(false) call into the DBSession() function, and give DBSession() a shorter name like DB():
func DB() gorm.DB {
    sqliteSession, err := gorm.Open("sqlite3", "cache.db")
    if err != nil {
        log.Fatal(err)
    }
    sqliteSession.LogMode(false)
    return sqliteSession
}
And using it:
func FindSomethingByID(id int) {
    db := DB()
    // code
}
Now there's only 1 line in each of your functions using the db session, one simple function call. You can't really make it any shorter if you always need a new db session.