How to DRY up database code in Go

I have a database package that contains the following code.
package database
import (
"log"
"github.com/jinzhu/gorm"
// required by gorm
_ "github.com/mattn/go-sqlite3"
)
type Podcast struct {
ID int `sql:"index"`
Title string
RssURL string `sql:"unique_index"`
Paused bool
Episodes []Episode
}
type Episode struct {
ID int `sql:"index"`
PodcastID int
Title string
EnclosureURL string `sql:"unique_index"`
Downloaded bool
GUID string `sql:"unique_index"`
PubDate string
}
func DBSession() (db gorm.DB) {
sqliteSession, err := gorm.Open("sqlite3", "cache.db")
if err != nil {
log.Fatal(err)
}
return sqliteSession
}
Followed by a bunch of methods that all start with the following code.
func FindSomethingByID(id int) {
db := DBSession()
db.LogMode(false)
// code
}
func FindSomethingElse() {
db := DBSession()
db.LogMode(false)
// code
}
Calling DBSession and setting LogMode in each func seems bad. I just don't know how to do it better. Could someone help?

Calling gorm.Open inside every function isn't very efficient: Open opens a new connection pool, and should be called just once (see the database/sql docs, which gorm wraps).
A simple improvement is to establish a package-level gorm.DB, initialise it in init(), and use it from all of your functions - e.g.
package database
var db gorm.DB
func init() {
var err error
// Note we use an = and not a := as our variables
// are already initialised
db, err = gorm.Open("sqlite3", "cache.db")
if err != nil {
log.Fatal(err)
}
// Turn off logging globally
db.LogMode(false)
}
func FindSomethingByID(id int) {
// use the shared db directly, e.g. err := db.First(&podcast, id).Error
// code
}
This is a quick win and reduces the repetition.
In a larger application it typically makes sense to pass dependencies (like DB pools, config params, etc.) more explicitly by wrapping them in types and creating custom handlers.
You also might initialise the connection in your package main and pass the *gorm.DB to your database package via a func New(db *gorm.DB) function that sets a private, package-level variable.
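For illustration, a minimal sketch of that wiring, assuming a gorm version where Open returns a *gorm.DB (the New function and the myapp/database import path are just placeholders):
package main

import (
    "log"

    "github.com/jinzhu/gorm"
    _ "github.com/mattn/go-sqlite3"

    "myapp/database" // hypothetical import path for your database package
)

func main() {
    db, err := gorm.Open("sqlite3", "cache.db")
    if err != nil {
        log.Fatal(err)
    }
    db.LogMode(false)
    database.New(db) // hand the connection to the database package
    // ... call database.FindSomethingByID(42), etc.
}
And in package database:
// New stores the *gorm.DB in a private, package-level variable
// that the Find* functions use.
var db *gorm.DB

func New(gdb *gorm.DB) { db = gdb }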

The most obvious simplification would be to move the db.LogMode(false) call into the DBSession() function, and give DBSession() a shorter name like DB():
func DB() (db gorm.DB) {
sqliteSession, err := gorm.Open("sqlite3", "cache.db")
if err != nil {
log.Fatal(err)
}
sqliteSession.LogMode(false)
return sqliteSession
}
And using it:
func FindSomethingByID(id int) {
db := DB()
// code
}
Now there's only one line in each of your functions that uses the db session: a single function call. You can't really make it any shorter if you always need a new db session.

Related

How to query a GORM Model

I'm attempting to query a database given a model, and I'm getting a blank model as a response. I'm using Revel framework and GORM to create the model and query the database.
app.go
package controllers
import (
"github.com/revel/revel"
"route/to/models"
)
type App struct {
*revel.Controller
GormController
}
func (c App) Index() revel.Result {
accounts := models.Account{}
account := c.DB.Find(accounts)
return c.RenderJSON(account)
}
gorm.go
package controllers
import (
_ "github.com/jinzhu/gorm/dialects/postgres"
"github.com/jinzhu/gorm"
r "github.com/revel/revel"
)
type GormController struct {
*r.Controller
DB *gorm.DB
}
// it can be used for jobs
var Gdb *gorm.DB
// init db
func InitDB() {
var err error
// open db
Gdb, err = gorm.Open("postgres", "host=hostname port=5432 user=postgres password=password dbname=some_db_name sslmode=disable")
Gdb.LogMode(true) // Print SQL statements
if err != nil {
r.ERROR.Println("FATAL", err)
panic(err)
}
}
func (c *GormController) SetDB() r.Result {
c.DB = Gdb
return nil
}
Not sure where I'm going wrong, ultimately I want to do a select * from accounts;
Note: I can run a migration and create in the Index() function and the data is handled correctly.
When populating a struct from the database, you need to pass a pointer for GORM to be able to populate it. Otherwise it's pass-by-value and GORM populates a copy and you never see it.
And if you want to get all accounts, you should pass a pointer to a slice of accounts. Regarding pointers versus values: a slice differs from a struct in that its underlying array can be modified even when the slice is passed by value, but the caller's slice length (and capacity) won't change as GORM populates it inside the function, so it still won't behave as expected without a pointer.
And you should check Error afterward:
accounts := make([]models.Account, 0)
if err := c.DB.Find(&accounts).Error; err != nil {
if gorm.IsRecordNotFoundError(err) {
// handle not found
} else {
// print/log/return error
}
return
}
// do something with accounts
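Applied to the Index action from the question, the fix might look roughly like this (a sketch; RenderError is just one way of surfacing the error in Revel):
func (c App) Index() revel.Result {
    accounts := []models.Account{}
    // pass a pointer to the slice so GORM can populate it
    if err := c.DB.Find(&accounts).Error; err != nil {
        return c.RenderError(err)
    }
    return c.RenderJSON(accounts)
}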

Wrapping a db object in Go and running two methods in the same transaction

In the effort of learning Go a bit better, I am trying to refactor a series of functions which accept a DB connection as the first argument into struct methods and something a bit more "idiomatically" Go.
Right now my "data store" methods are something like this:
func CreateA(db orm.DB, a *A) error {
_, err := db.Exec("INSERT...")
return err
}
func CreateB(db orm.DB, b *B) error {
_, err := db.Exec("INSERT...")
return err
}
These functions work perfectly fine; orm.DB is the DB interface of go-pg.
Since the two functions accept a db connection I can either pass an actual connection or a transaction (which implements the same interface). I can be sure that both functions issuing SQL INSERTs run in the same transaction, avoiding having inconsistent state in the DB in case either one of them fails.
The trouble started when I decided to read more about how to structure the code a little better and to make it "mockable" in case I need to.
So I googled a bit, read the article Practical Persistence in Go: Organising Database Access and tried to refactor the code to use proper interfaces.
The result is something like this:
type Store interface {
CreateA(a *A) error
CreateB(b *B) error
}
type DB struct {
orm.DB
}
func NewDBConnection(p *ConnParams) (*DB, error) {
.... create db connection ...
return &DB{db}, nil
}
func (db *DB) CreateA(a *A) error {
...
}
func (db *DB) CreateB(b *B) error {
...
}
which allows me to write code like:
db := NewDBConnection()
db.CreateA(a)
db.CreateB(b)
instead of:
db := NewDBConnection()
CreateA(db, a)
CreateB(db, b)
The actual issue is that I lost the ability to run the two functions in the same transaction. Before I could do:
pgDB := DB.DB.(*pg.DB) // convert the interface to an actual connection
pgDB.RunInTransaction(func(tx *pg.Tx) error {
CreateA(tx, a)
CreateB(tx, b)
})
or something like:
tx := db.DB.Begin()
err = CreateA(tx, a)
err = CreateB(tx, b)
if err != nil {
tx.Rollback()
} else {
tx.Commit()
}
which is more or less the same thing.
Since the functions accepted the common interface between a connection and a transaction, I could keep the transaction logic out of my model layer, sending down either a full connection or a transaction. This allowed me to decide in the "HTTP handler" when to create a transaction and when I didn't need one.
Keep in mind that the connection is a global object representing a pool of connections handled automatically by Go, so the hack I tried:
pgDB := DB.DB.(*pg.DB) // convert the interface to an actual connection
err = pgDB.RunInTransaction(func(tx *pg.Tx) error {
DB.DB = tx // replace the connection with a transaction
DB.CreateA(a)
DB.CreateB(a)
})
It's clearly a bad idea: although it works, it works only once, because we replace the global connection with a transaction. The following request breaks the server.
Any ideas? I can't find much information about this, probably because, being a noob, I don't know the right keywords.
I've done something like this in the past (using the standard sql package, you may need to adapt it to your needs):
var ErrNestedTransaction = errors.New("nested transactions are not supported")
// abstraction over sql.TX and sql.DB
// a similar interface seems to be already defined in go-pg. So you may not need this.
type executor interface {
Exec(query string, args ...interface{}) (sql.Result, error)
Query(query string, args ...interface{}) (*sql.Rows, error)
QueryRow(query string, args ...interface{}) *sql.Row
}
type Store struct {
// this is the actual connection(pool) to the db which has the Begin() method
db *sql.DB
executor executor
}
func NewStore(dsn string) (*Store, error) {
db, err := sql.Open("sqlite3", dsn)
if err != nil {
return nil, err
}
// the initial store contains just the connection(pool)
return &Store{db, db}, nil
}
func (s *Store) RunInTransaction(f func(store *Store) error) error {
if _, ok := s.executor.(*sql.Tx); ok {
// nested transactions are not supported!
return ErrNestedTransaction
}
tx, err := s.db.Begin()
if err != nil {
return err
}
transactedStore := &Store{
s.db,
tx,
}
err = f(transactedStore)
if err != nil {
tx.Rollback()
return err
}
return tx.Commit()
}
func (s *Store) CreateA(thing A) error {
// your implementation
_, err := s.executor.Exec("INSERT INTO ...", ...)
return err
}
And then you use it like
// store is a global object
store.RunInTransaction(func(store *Store) error {
// this instance of Store uses a transaction to execute the methods
err := store.CreateA(a)
if err != nil {
return err
}
return store.CreateB(b)
})
The trick is to use the executor instead of the *sql.DB in your CreateX methods, which allows you to dynamically change the underlying implementation (tx vs. db). However, since there is very little information out there on how to deal with this issue, I can't assure you that this is the "best" solution. Other suggestions are welcome!
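Since your code uses go-pg, roughly the same pattern can be expressed against its orm.DB interface and the RunInTransaction method you already use. A sketch under those assumptions (not a tested implementation):
type Store struct {
    db  *pg.DB // the connection pool, which can start transactions
    orm orm.DB // either *pg.DB or *pg.Tx
}

func NewStore(db *pg.DB) *Store {
    return &Store{db: db, orm: db}
}

func (s *Store) RunInTransaction(f func(*Store) error) error {
    return s.db.RunInTransaction(func(tx *pg.Tx) error {
        // give the callback a Store whose executor is the transaction
        return f(&Store{db: s.db, orm: tx})
    })
}

func (s *Store) CreateA(a *A) error {
    _, err := s.orm.Exec("INSERT...")
    return err
}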

Golang Standard Package Structure

I am fairly new to Go and I am trying to create a structured application using guidance from Ben Johnson's webpage. Unfortunately, his example is not a complete working application.
His webpage is https://medium.com/@benbjohnson/standard-package-layout-7cdbc8391fc1
I have tried to use his methods and I keep getting an "undefined: db" error. It doesn't tell me what line is causing the error, just the file "MSSQL.go".
Could someone offer guidance to help me fix this error?
Edited code with accepted solution.
StatementPrinter.go
package statementprinter
type Statement struct {
CustomerId string
CustomerName string
}
type StatementService interface {
Statement(id string) (*Statement, error)
}
main.go
package main
import (
"fmt"
"log"
"github.com/ybenjolin/StatementPrinter"
"github.com/ybenjolin/StatementPrinter/mssql"
"database/sql"
_ "github.com/alexbrainman/odbc"
)
const DB_INFO = "Driver={SQL Server};Server=cdc-edb2;Database=CostarReports;Trusted_Connection=yes;"
var db *sql.DB
func init() {
var err error
db, err = sql.Open("odbc", DB_INFO)
if err != nil {
log.Fatal("Error opening database connection.\n", err.Error())
}
err = db.Ping()
if err != nil {
log.Fatal("Error pinging database server.\n", err.Error())
}
fmt.Println("Database connection established.")
}
func main () {
var err error
defer db.Close()
// Create services
// Changes required here. Was ss := &statementprinter.Stat..
ss := &mssql.StatementService{DB: db}
// Use service
var s *statementprinter.Statement
s, err = ss.Statement("101583")
if err != nil {
log.Fatal("Query failed:", err.Error())
}
fmt.Printf("Statement: %+v\n", s)
}
mssql.go
package mssql
import (
_ "github.com/alexbrainman/odbc"
"database/sql"
"github.com/ybenjolin/StatementPrinter"
)
// StatementService represents a MSSQL implementation of statemenetprinter.StatementService.
type StatementService struct {
DB *sql.DB
}
// Statement returns a statement for a given customer.
func (s *StatementService) Statement(customer string) (*statementprinter.Statement, error) {
var err error
var t statementprinter.Statement
// Changes required here. Was row := db.Query......
row := s.DB.QueryRow(`Select Customer, CustomerName From AccountsReceivable.rptfARStatementHeader(?)`, customer)
if err = row.Scan(&t.CustomerId, &t.CustomerName); err != nil {
return nil, err
}
return &t, nil
}
This looks like just a typo. The problematic line is in the method
func (s *StatementService) Statement(customer string)
in mssql.go:
row := db.QueryRow(`Select Customer, CustomerName From AccountsReceivable.rptfARStatementHeader(?)`, customer)
QueryRow is supposed to be a method of db, but db is not defined. However, in the struct
type StatementService struct {
DB *sql.DB
}
there's a *sql.DB field. The method you're writing has a *StatementService receiver, s, so my guess is the intention is to access the DB field through s, like so:
func (s *StatementService) Statement(customer string) (*statementprinter.Statement, error) {
var err error
var t statementprinter.Statement
//CHANGED LINE:
row := s.DB.QueryRow(`Select Customer, CustomerName From AccountsReceivable.rptfARStatementHeader(?)`, customer)
if err = row.Scan(&t.CustomerId, &t.CustomerName); err != nil {
return nil, err
}
return &t, nil
}
Then, the method is called in main.go, and is passed a StatementService instance that contains a database:
ss := &statementprinter.StatementService{DB: db}
I believe you need to change this line to
ss := &mssql.StatementService{DB: db}
because that's the actual interface implementation. The line you have now treats the StatementService interface like a struct, which will not compile.
The global db in main.go lives for the lifetime of the application. It's just a pointer which is copied around for use.
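To make the struct-versus-interface distinction concrete, a quick sketch using the types from your code:
// mssql.StatementService is the concrete struct that holds the *sql.DB
ss := &mssql.StatementService{DB: db}

// it satisfies the statementprinter.StatementService interface,
// so it can be used anywhere the interface is expected
var printer statementprinter.StatementService = ss
s, err := printer.Statement("101583")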

Golang how can I do a Dependency Injection to store some string values

I am using a MySQL database and have many different functions/methods that interact with it. For every function I of course have to supply the database credentials, such as:
ReadAll.go
func ReadAll() {
db, err := sql.Open("mysql",
"user:password#tcp(127.0.0.1:3306)/hello")
if err != nil {
log.Fatal(err)
}
defer db.Close()
}
The "mysql", "user:password@tcp(127.0.0.1:3306)/hello" part never changes, and I am supplying it to every function that interacts with the DB. I was wondering how I could, for instance, create a new file, say Database.go, put those credentials into some global variable, and then reference it wherever I need those strings. That way, if I have to change the credentials, I only have to change them in one place.
I want to do something like
Database.go
const GlobalDB := "mysql","user:password@tcp(127.0.0.1:3306)/hello"
then
ReadAll.go
func ReadAll() {
db, err := sql.Open(GlobalDB)
if err != nil {
log.Fatal(err)
}
defer db.Close()
}
I am brand new to Golang but trying to figure this out.
I would probably do this by opening a session to the database once and then passing it to any function or method that needs it (a sketch of this follows below). This has a few potential problems:
You may need to lock access to it so you don't co-mingle multiple queries on the same session (though your DB library may ensure this for you; FWIW, "database/sql" is safe for concurrent use and recommends NOT opening short-lived database connections).
You can't safely close the session, as it may still be in use elsewhere.
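A sketch of that pass-it-around style (readAll is just an illustrative name; the "database/sql", "log" and mysql driver imports are assumed):
func main() {
    // open the pool once and hand it to whatever needs it
    db, err := sql.Open("mysql", "user:password@tcp(127.0.0.1:3306)/hello")
    if err != nil {
        log.Fatal(err)
    }
    defer db.Close()

    if err := readAll(db); err != nil {
        log.Fatal(err)
    }
}

// every function that touches the database takes the pool as a parameter
func readAll(db *sql.DB) error {
    rows, err := db.Query("SELECT ...")
    if err != nil {
        return err
    }
    defer rows.Close()
    // iterate over rows here
    return nil
}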
Another way would be to have a function that returns a DB sesssion, so instead of doing:
db, err := sql.Open("mysql", "user:password#tcp(127.0.0.1:3306)/hello")
You do the following:
func dbSession() (*sql.DB, error) {
return sql.Open("mysql", "credentials")
}
func ReadAll() {
db, err := dbSession()
if err != nil {
log.Fatal(err)
}
defer db.Close()
}
And if you want even more flexibility, you can have a struct that contains the data you need, then build your DB connection parameters from that.
type dbData struct {
DBType, DBName, User, Host, Password string
}
var DBData dbData
func dbSession() (*sql.DB, error) {
return sql.Open(DBData.DBType, fmt.Sprintf("%s:%s@tcp(%s)/%s", DBData.User, DBData.Password, DBData.Host, DBData.DBName))
}
Also note the following in the documentation from sql.Open:
The returned DB is safe for concurrent use by multiple goroutines and
maintains its own pool of idle connections. Thus, the Open function
should be called just once. It is rarely necessary to close a DB.
You can easily create a new file with your credentials. Just make sure the file is also in package main.
package main
var myDBConnectionString = "mysql://...."
This will be included when you compile your source.
The problem is that you have to recompile your code every time you need to connect to another database. Think about a development system versus a production system: the database credentials should differ between them, right? :)
To fix this, it is quite common to use a config file, so you can change the credentials without recompiling your code.
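For example, a minimal sketch of loading the DSN from a JSON config file (the file name and field names are purely illustrative):
// config.json: {"dsn": "user:password@tcp(127.0.0.1:3306)/hello"}
type Config struct {
    DSN string `json:"dsn"`
}

// loadConfig needs "encoding/json" and "io/ioutil"
func loadConfig(path string) (*Config, error) {
    data, err := ioutil.ReadFile(path)
    if err != nil {
        return nil, err
    }
    var cfg Config
    if err := json.Unmarshal(data, &cfg); err != nil {
        return nil, err
    }
    return &cfg, nil
}

// later: cfg, _ := loadConfig("config.json"); db, err := sql.Open("mysql", cfg.DSN)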
I've got another idea: just connect to the db once and access this resource globally.
package main
import (
"fmt"
)
var myDb = "example"
func main() {
fmt.Println("Hello, playground")
doSomthingWithDatabase()
}
func doSomthingWithDatabase() {
fmt.Println("We can access a global variable here, see", myDb)
}
https://play.golang.org/p/npZ6Z49ink
For the configuration handling you can look here
https://blog.gopheracademy.com/advent-2014/reading-config-files-the-go-way/
hiboot-data provides an out-of-the-box starter that meets your requirement: github.com/hidevopsio/hiboot-data/starter/gorm. Alternatively, you can implement your own starter using the hiboot framework; you can then inject it anywhere and decouple your code from the creation of the database configuration.
package service
import (
"errors"
"hidevops.io/hiboot-data/examples/gorm/entity"
"hidevops.io/hiboot-data/starter/gorm"
"hidevops.io/hiboot/pkg/app"
"hidevops.io/hiboot/pkg/utils/idgen"
)
type UserService interface {
AddUser(user *entity.User) (err error)
GetUser(id uint64) (user *entity.User, err error)
GetAll() (user *[]entity.User, err error)
DeleteUser(id uint64) (err error)
}
type UserServiceImpl struct {
// add UserService, it means that the instance of UserServiceImpl can be found by UserService
UserService
repository gorm.Repository
}
func init() {
// register UserServiceImpl
app.Component(newUserService)
}
// newUserService will inject the gorm.Repository configured in hidevops.io/hiboot-data/starter/gorm
func newUserService(repository gorm.Repository) UserService {
repository.AutoMigrate(&entity.User{})
return &UserServiceImpl{
repository: repository,
}
}
func (s *UserServiceImpl) AddUser(user *entity.User) (err error) {
if user == nil {
return errors.New("user is not allowed nil")
}
if user.Id == 0 {
user.Id, _ = idgen.Next()
}
err = s.repository.Create(user).Error()
return
}
func (s *UserServiceImpl) GetUser(id uint64) (user *entity.User, err error) {
user = &entity.User{}
err = s.repository.Where("id = ?", id).First(user).Error()
return
}
func (s *UserServiceImpl) GetAll() (users *[]entity.User, err error) {
users = &[]entity.User{}
err = s.repository.Find(users).Error()
return
}
func (s *UserServiceImpl) DeleteUser(id uint64) (err error) {
err = s.repository.Where("id = ?", id).Delete(entity.User{}).Error()
return
}

Sharing of conn pointer between interfaces in golang

What I'm trying to accomplish is sharing a *sqlx.DB pointer between multiple functions. I've seen posts saying "pass the pointer along", which is fine, but how do you do that with an interface? Basically what I have is a Datastore interface, plus mysql and pgsql implementations of it. The interface by itself works fine; the issue is that I'm trying to create a single connect function so the *sqlx.DB is shared across all functions of the implemented interface. I think I've confused myself about how to share the pointer between the interface's functions, or even where to share it. The main interface looks like below:
var (
storage Datastore
db *sqlx.DB
)
type Datastore interface {
Insert(db *sqlx.DB, table string, item DataItem) bool
CheckEmpty(db *sqlx.DB, table string) bool
FetchAll(db *sqlx.DB, table string) []DataItem
DBInit(db *sqlx.DB)
initDB()
}
Within my implemented interface (simplified mysql example) I have the initDB function which looks like this:
type MySQLDB struct {
config *config.Configuration
}
func (my *MySQLDB) initDB() {
log.Println("Getting DB Connection")
tempdb, err := sqlx.Connect("mysql", my.config.Database.Dsn+"&parseTime=True")
if err != nil {
log.Println(err.Error())
}
db = tempdb
defer db.Close()
}
func (my *MySQLDB) FetchAll(db *sqlx.DB, table string) []DataItem {
dTable := []DataItem{}
query := "SELECT foo, bar FROM " + table + " ORDER BY last_update ASC"
err := db.Select(&dTable, query)
if err != nil{
panic(err)
}
return dTable
}
At this point I know the connection is initially opened but the next time a function is called I get db is closed error. So how do I properly share the db connection between functions, or do I really have to run a connection open in every function?
Don't call defer db.Close() in your initDB function. The deferred Close runs as soon as initDB returns, so the connection is already closed by the time any other method uses it; that's why you get the closed error.
You may also want to redesign your interface, for example:
type Datastore interface {
Insert(table string, item DataItem) bool
CheckEmpty(table string) bool
FetchAll(table string) []DataItem
Close() error // call this method when you want to close the connection
initDB()
}
Your MySQLDB implementation now looks like this:
type MySQLDB struct {
config *config.Configuration
db *sqlx.DB
}
func (my *MySQLDB) initDB() {
log.Println("Getting DB Connection")
tempdb, err := sqlx.Connect("mysql", my.config.Database.Dsn+"&parseTime=True")
if err != nil {
log.Println(err.Error())
}
my.db = tempdb
}
func (my *MySQLDB) Close() error {
return my.db.Close()
}
func (my *MySQLDB) FetchAll(table string) []DataItem {
dTable := []DataItem{}
query := "SELECT foo, bar FROM " + table + " ORDER BY last_update ASC"
err := my.db.Select(&dTable, query)
if err != nil{
panic(err)
}
return dTable
}
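From inside the package, usage might then look roughly like this (NewMySQLDB is just an illustrative constructor; initDB is unexported, so it has to be called from within the package):
func NewMySQLDB(cfg *config.Configuration) *MySQLDB {
    m := &MySQLDB{config: cfg}
    m.initDB()
    return m
}

// caller:
// store := NewMySQLDB(cfg)
// defer store.Close()
// items := store.FetchAll("some_table")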
