CockroachDB "conn closed" - Go

var db *pgx.Conn // database handle

func ConnectCockroachDB() {
    var dbUri string = "testurl"
    conn, err := pgx.Connect(context.Background(), dbUri)
    if err != nil {
        fmt.Fprintf(os.Stderr, "Unable to connect to database: %v\n", err)
        os.Exit(1)
    }
    db = conn
}

// GetDB returns a handle to the DB object
func GetDB() *pgx.Conn {
    if db == nil {
        ConnectCockroachDB()
    }
    return db
}
But after I restart the DB, the application cannot reconnect automatically; the error is "conn closed". I need to restart the application to get it working again.

To answer your question directly:
func GetDB() *pgx.Conn {
    if db == nil || db.IsClosed() {
        ConnectCockroachDB()
    }
    return db
}
However, this code is not safe for use from multiple goroutines: access to the global db instance should be guarded by a lock to avoid race conditions, as sketched below.
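For example, a minimal sketch of guarding the handle with a sync.Mutex (reusing the db variable and ConnectCockroachDB from above):
var (
    db   *pgx.Conn
    dbMu sync.Mutex // guards db
)

func GetDB() *pgx.Conn {
    dbMu.Lock()
    defer dbMu.Unlock()
    // Reconnect if the handle was never created or has been closed.
    if db == nil || db.IsClosed() {
        ConnectCockroachDB()
    }
    return db
}
Note that even with the lock, a single *pgx.Conn is not safe for concurrent queries, which is another reason to prefer a pool (below).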

Since you're using pgx, you might want to use pgxpool for the connection instead: https://pkg.go.dev/github.com/jackc/pgx/v4/pgxpool
It automatically takes care of creating a new connection if the old one got closed, and it already supports concurrent use.
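A minimal sketch of the pool-based setup (assuming the pgx v4 pgxpool API; the DSN is the same placeholder as above):
package db

import (
    "context"
    "log"

    "github.com/jackc/pgx/v4/pgxpool"
)

var pool *pgxpool.Pool

func ConnectCockroachDB() {
    var err error
    // The pool re-establishes broken connections on demand,
    // so a database restart does not require an application restart.
    pool, err = pgxpool.Connect(context.Background(), "testurl")
    if err != nil {
        log.Fatalf("Unable to connect to database: %v", err)
    }
}

func GetDB() *pgxpool.Pool {
    return pool
}
Query, QueryRow and Exec on the pool mirror the pgx.Conn methods, so most call sites should not need to change.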

Related

Can I connect to Memgraph using Go?

I'd like to connect from Go to the running instance of the Memgraph database. I'm using Docker and I've installed the Memgraph Platform. What exactly do I need to do?
The procedure for connecting from Go to Memgraph is rather simple; you need to use the Bolt protocol. Here are the steps:
First, create a new directory for your app, /MyApp, and position yourself in it. Next, create a program.go file with the following code:
package main

import (
    "fmt"

    "github.com/neo4j/neo4j-go-driver/v4/neo4j"
)

func main() {
    dbUri := "bolt://localhost:7687"
    driver, err := neo4j.NewDriver(dbUri, neo4j.BasicAuth("username", "password", ""))
    if err != nil {
        panic(err)
    }
    // Handle driver lifetime based on your application lifetime requirements.
    // The driver's lifetime is usually bound to the application lifetime,
    // which usually implies one driver instance per application.
    defer driver.Close()
    item, err := insertItem(driver)
    if err != nil {
        panic(err)
    }
    fmt.Printf("%v\n", item.Message)
}

func insertItem(driver neo4j.Driver) (*Item, error) {
    // Sessions are short-lived, cheap to create and NOT thread safe. Typically create one or more sessions
    // per request in your web application. Make sure to call Close on the session when done.
    // For multi-database support, set sessionConfig.DatabaseName to the requested database.
    // The session config defaults to write mode; if only reads are needed, configure the session for read mode.
    session := driver.NewSession(neo4j.SessionConfig{})
    defer session.Close()
    result, err := session.WriteTransaction(createItemFn)
    if err != nil {
        return nil, err
    }
    return result.(*Item), nil
}

func createItemFn(tx neo4j.Transaction) (interface{}, error) {
    records, err := tx.Run(
        "CREATE (a:Greeting) SET a.message = $message RETURN 'Node ' + id(a) + ': ' + a.message",
        map[string]interface{}{"message": "Hello, World!"})
    // In the face of driver native errors, make sure to return them directly.
    // Depending on the error, the driver may try to execute the function again.
    if err != nil {
        return nil, err
    }
    record, err := records.Single()
    if err != nil {
        return nil, err
    }
    // You can also retrieve values by name, e.g. `id, found := record.Get("n.id")`
    return &Item{
        Message: record.Values[0].(string),
    }, nil
}

type Item struct {
    Message string
}
Now, create a go.mod file using the go mod init example.com/hello command.
I've mentioned the Bolt driver earlier; you need to add it with go get github.com/neo4j/neo4j-go-driver/v4@v4.3.1. You can then run your program with go run program.go.
The complete documentation is located on the Memgraph site.

gorm multiple databases connection management

I have a requirement where my application talks to different databases. How do I manage connections in GORM? Is there any way GORM supports connection management for multiple databases, or do I need to create a map that holds all the database connections?
if val, ok := selector.issure_db[issuer]; ok {
    return val, nil
} else {
    var dbo *db.DB
    selector.mu.Lock()
    dbo, err := db.NewDb(Config)
    if err != nil {
        boot.Logger(ctx).Fatal(err.Error())
    }
    selector.issure_db[issuer] = dbo
    selector.mu.Unlock()
    return dbo, nil
}
Is there a better way to do this?
You can use the dbresolver plugin for GORM. It manages multiple sources and replicas and maintains an underlying connection pool for the group. You can even map models in your app to the correct database using the config.
Example from the docs:
import (
    "gorm.io/gorm"
    "gorm.io/plugin/dbresolver"
    "gorm.io/driver/mysql"
)

db, err := gorm.Open(mysql.Open("db1_dsn"), &gorm.Config{})

db.Use(dbresolver.Register(dbresolver.Config{
    // use `db2` as source, `db3`, `db4` as replicas
    Sources:  []gorm.Dialector{mysql.Open("db2_dsn")},
    Replicas: []gorm.Dialector{mysql.Open("db3_dsn"), mysql.Open("db4_dsn")},
    // sources/replicas load balancing policy
    Policy: dbresolver.RandomPolicy{},
}).Register(dbresolver.Config{
    // use `db1` as source (DB's default connection), `db5` as replica for `User`, `Address`
    Replicas: []gorm.Dialector{mysql.Open("db5_dsn")},
}, &User{}, &Address{}).Register(dbresolver.Config{
    // use `db6`, `db7` as sources, `db8` as replica for `orders`, `Product`
    Sources:  []gorm.Dialector{mysql.Open("db6_dsn"), mysql.Open("db7_dsn")},
    Replicas: []gorm.Dialector{mysql.Open("db8_dsn")},
}, "orders", &Product{}, "secondary"))
You can create a package called database and write an init function in an init.go file that creates a DB object for each database you need to connect to. You can then use these db objects everywhere in the application, which also gives you connection pooling.
init.go
var db *gorm.DB

func init() {
    var err error
    dataSourceName := fmt.Sprintf("%s:%s@tcp(%s:%d)/%s?charset=utf8&parseTime=True&loc=Local", dbUser, dbPassword, dbHost, dbPort, dbName)
    db, err = gorm.Open("mysql", dataSourceName)
    if err != nil {
        panic(err)
    }
    db.DB().SetConnMaxLifetime(10 * time.Second)
    db.DB().SetMaxIdleConns(10)
    // initialise other db objects here
}
users.go
func getFirstUser() (user User) {
    db.First(&user)
    return
}
PS: This solution is fine if you have to connect to one or two databases. If you need to connect to multiple databases at the same time, you should use the dbresolver plugin.
Old Answer
You can write a separate function that returns a new database connection object every time you call it.
func getDBConnection(dbUser, dbPassword, dbHost, dbName string, dbPort int) (db *gorm.DB, err error) {
    dataSourceName := fmt.Sprintf("%s:%s@tcp(%s:%d)/%s?charset=utf8&parseTime=True&loc=Local", dbUser, dbPassword, dbHost, dbPort, dbName)
    db, err = gorm.Open("mysql", dataSourceName)
    if err != nil {
        return
    }
    db.DB().SetConnMaxLifetime(10 * time.Second)
    return
}
And call defer db.Close() every time after you call getDBConnection.
func getFirstUser() (user User) {
    db, _ := getDBConnection(dbUser, dbPassword, dbHost, dbName, dbPort)
    defer db.Close()
    db.First(&user)
    return
}
This way your connections will be closed every time after you have executed the query.

What is the best way to use gorm in multithreaded application?

I have an application that opens a lot of goroutines, let's say 2000. Each goroutine needs access to the DB, or at least needs to update/select data from the DB.
My current approach is the following:
Each goroutine gets a *gorm.DB with db.GetConnection(); this is the code of that function:
func GetConnection() *gorm.DB {
    DBConfig := config.GetConfig().DB
    db, err := gorm.Open("mysql", DBConfig.DBUser+":"+DBConfig.DBPassword+"@/"+DBConfig.DBName+"?charset=utf8mb4")
    if err != nil {
        panic(err.Error())
    }
    return db
}
Then the goroutine calls another function from some storage package, passes the *gorm.DB to it, and closes the connection. It looks like this:
dbConnection := db.GetConnection()
postStorage.UpdateSomething(dbConnection)
db.CloseConnection(dbConnection)
The above is only an example; the main idea is that each goroutine opens a new connection, and I don't like that because it may overload the DB. As a result I get the following MySQL error:
[mysql] 2020/07/16 19:34:26 packets.go:37: read tcp 127.0.0.1:44170->127.0.0.1:3306: read: connection reset by peer
What is a good pattern for using the gorm package in a multi-goroutine application?
*gorm.DB is safe for concurrent use by multiple goroutines, so you can share one *gorm.DB across them. Initialize it once and fetch it whenever you need it. Demo:
package db

var db *gorm.DB

func init() {
    DBConfig := config.GetConfig().DB
    var err error
    // assign to the package-level db (do not use := here, which would shadow it)
    db, err = gorm.Open("mysql", DBConfig.DBUser+":"+DBConfig.DBPassword+"@/"+DBConfig.DBName+"?charset=utf8mb4")
    if err != nil {
        panic(err.Error())
    }
}

func GetConnection() *gorm.DB {
    return db
}

Using Ping to find out if DB Connection is alive in Golang

Is it safe to just ping the database to check whether my Go app is still connected, or is there a better solution? I've read somewhere that we should not use Ping() to determine a connection loss.
Thanks.
I would say Ping() is the way to do it if you need to test the connection separately from running queries, probably at program start only.
Normally I just trust that database/sql will automatically try to reconnect when you execute queries against the DB and a connection fails. So you could just use Open to check that the DB connection args are correct and trust a query to return an error if the connection is lost.
People say Ping() can cause race conditions but cannot demonstrate how, or provide a suitable alternative where a connection test is needed.
This is how Gorm (a widely used Go ORM) does it:
// Send a ping to make sure the database connection is alive.
if d, ok := dbSQL.(*sql.DB); ok {
    if err = d.Ping(); err != nil {
        d.Close()
    }
}
return
https://github.com/jinzhu/gorm
It is probably better to execute a simple query and check the result (a sketch follows below). The Ping() method is safe to use, but it is optional for database drivers to implement.
See Ping()
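For example, a minimal sketch of such a check over a *sql.DB (the query text and timeout are arbitrary choices here):
import (
    "context"
    "database/sql"
    "time"
)

func checkAlive(db *sql.DB) error {
    ctx, cancel := context.WithTimeout(context.Background(), 2*time.Second)
    defer cancel()
    var one int
    // A trivial query makes a real round trip to the server,
    // independent of whether the driver implements Ping.
    return db.QueryRowContext(ctx, "SELECT 1").Scan(&one)
}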
You can ping the database, test it, close the connection, and test it again. Example:
database.go
package main

import (
    "database/sql"
    "log"

    _ "modernc.org/sqlite"
)

func sqliteConnect() (sqliteDB *sql.DB) {
    // driver and data source names
    sqliteDB, err := sql.Open("sqlite", "./localdatabase.db")
    if err != nil {
        log.Fatalf("Cannot connect to sqlite DB: %v", err)
    }
    return sqliteDB
}
database_test.go
package main

import (
    "testing"
)

func TestSqliteConnect(t *testing.T) {
    db := sqliteConnect()
    db.SetMaxIdleConns(0)
    t.Run("Ping database", func(t *testing.T) {
        if db.Ping() != nil {
            t.Errorf("Unable to ping sqlite database ... %v", db.Ping())
        }
    })
    db.Close()
    t.Run("Check Connections", func(t *testing.T) {
        if db.Stats().OpenConnections != 0 {
            t.Errorf("Unable to close database connections ... %v", db.Stats().OpenConnections)
        }
    })
}

Golang how can I do a Dependency Injection to store some string values

I am using a MySQL database and have many different functions/methods that interact with the database. For every function I of course have to supply the database credentials, such as:
ReadAll.go
func ReadAll() {
    db, err := sql.Open("mysql",
        "user:password@tcp(127.0.0.1:3306)/hello")
    if err != nil {
        log.Fatal(err)
    }
    defer db.Close()
}
The part "mysql", "user:password@tcp(127.0.0.1:3306)/hello" never changes, and I am supplying it to every function that interacts with the DB. I was wondering how I could, for instance, create a new file, say Database.go, put those credentials into some global variable, and then reference it whenever I need those strings. That way, if I have to change the credentials, I only have to change them in one place.
I want to do something like
Database.go
const GlobalDB := "mysql","user:password@tcp(127.0.0.1:3306)/hello"
then
ReadAll.go
func ReadAll() {
    db, err := sql.Open(GlobalDB)
    if err != nil {
        log.Fatal(err)
    }
    defer db.Close()
}
I am brand new to Golang but trying to figure this out.
I would probably do this by opening a session to the database once, then passing that session around to any function or method that needs it. This has a few potential problems:
You may need to lock access to it so you don't co-mingle multiple queries on the same session (though your DB library may ensure this; FWIW, "database/sql" is concurrency-safe and recommends NOT opening short-lived database connections).
You can't safely close the session as it may still be in use elsewhere.
Another way would be to have a function that returns a DB session, so instead of doing:
db, err := sql.Open("mysql", "user:password@tcp(127.0.0.1:3306)/hello")
You do the following:
func dbSession() (*sql.DB, error) {
    return sql.Open("mysql", "credentials")
}

func ReadAll() {
    db, err := dbSession()
    if err != nil {
        log.Fatal(err)
    }
    defer db.Close()
}
And if you want even more flexibility, you can have a struct that contains the data you need, then build your DB connection parameters from that.
type dbData struct {
    DBType, DBName, User, Host, Password string
}

var DBData dbData

func dbSession() (*sql.DB, error) {
    return sql.Open(DBData.DBType, fmt.Sprintf("%s:%s@tcp(%s)/%s", DBData.User, DBData.Password, DBData.Host, DBData.DBName))
}
Also note the following in the documentation from sql.Open:
The returned DB is safe for concurrent use by multiple goroutines and
maintains its own pool of idle connections. Thus, the Open function
should be called just once. It is rarely necessary to close a DB.
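Following that advice, a sketch of opening the handle once and sharing it (the names dbOnce and dbConn and the DSN are illustrative, not from the question):
var (
    dbOnce sync.Once
    dbConn *sql.DB
)

// DB lazily opens the single shared *sql.DB on first use.
func DB() *sql.DB {
    dbOnce.Do(func() {
        var err error
        dbConn, err = sql.Open("mysql", "user:password@tcp(127.0.0.1:3306)/hello")
        if err != nil {
            log.Fatal(err)
        }
    })
    return dbConn
}
Callers then use DB() wherever they need a handle; the connection pool behind *sql.DB takes care of concurrent use.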
You can easily create a new file with your credentials. Just put the file in package main.
package main

var myDBConnectionString = "mysql://...."
This will be included when you compile your source.
The problem is that you have to recompile your code every time you have to connect to another database. Think about a development system vs. a production system: the database credentials should differ between those systems, right? :)
To fix this, it is quite common to use a config file, so you can change the credentials without recompiling your code.
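For instance, a minimal sketch that reads the DSN from the environment at startup (the variable name MYSQL_DSN and the DSN itself are just examples):
dsn := os.Getenv("MYSQL_DSN") // e.g. "user:password@tcp(127.0.0.1:3306)/hello"
if dsn == "" {
    log.Fatal("MYSQL_DSN is not set")
}
db, err := sql.Open("mysql", dsn)
if err != nil {
    log.Fatal(err)
}
A JSON or YAML config file parsed at startup works the same way; the point is that the credentials live outside the compiled binary.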
I've got another idea: just connect to the DB once and access this resource globally.
package main

import (
    "fmt"
)

var myDb = "example"

func main() {
    fmt.Println("Hello, playground")
    doSomthingWithDatabase()
}

func doSomthingWithDatabase() {
    fmt.Println("We can access a global variable here, see", myDb)
}
https://play.golang.org/p/npZ6Z49ink
For the configuration handling you can look here
https://blog.gopheracademy.com/advent-2014/reading-config-files-the-go-way/
hiboot-data provides an out-of-the-box starter that meets your requirement: github.com/hidevopsio/hiboot-data/starter/gorm. Alternatively, you can implement your own starter using the hiboot framework. You can then inject it anywhere to decouple your code from the creation of the database configuration.
package service

import (
    "errors"

    "hidevops.io/hiboot-data/examples/gorm/entity"
    "hidevops.io/hiboot-data/starter/gorm"
    "hidevops.io/hiboot/pkg/app"
    "hidevops.io/hiboot/pkg/utils/idgen"
)

type UserService interface {
    AddUser(user *entity.User) (err error)
    GetUser(id uint64) (user *entity.User, err error)
    GetAll() (user *[]entity.User, err error)
    DeleteUser(id uint64) (err error)
}

type UserServiceImpl struct {
    // embed UserService, so the instance of UserServiceImpl can be found by UserService
    UserService
    repository gorm.Repository
}

func init() {
    // register UserServiceImpl
    app.Component(newUserService)
}

// newUserService injects the gorm.Repository configured in hidevops.io/hiboot-data/starter/gorm
func newUserService(repository gorm.Repository) UserService {
    repository.AutoMigrate(&entity.User{})
    return &UserServiceImpl{
        repository: repository,
    }
}

func (s *UserServiceImpl) AddUser(user *entity.User) (err error) {
    if user == nil {
        return errors.New("user is not allowed nil")
    }
    if user.Id == 0 {
        user.Id, _ = idgen.Next()
    }
    err = s.repository.Create(user).Error()
    return
}

func (s *UserServiceImpl) GetUser(id uint64) (user *entity.User, err error) {
    user = &entity.User{}
    err = s.repository.Where("id = ?", id).First(user).Error()
    return
}

func (s *UserServiceImpl) GetAll() (users *[]entity.User, err error) {
    users = &[]entity.User{}
    err = s.repository.Find(users).Error()
    return
}

func (s *UserServiceImpl) DeleteUser(id uint64) (err error) {
    err = s.repository.Where("id = ?", id).Delete(entity.User{}).Error()
    return
}
