Typical way to clean up background work in a user-facing component - go

Assuming I want to return an instance of a "stateful" component to a user, what is the typical way I can clean up/join background work within that instance? And are there any patterns to avoid viral propagation of explicit cleanup functions all the way to the root code?
For example, assume I am returning a database client to the user. Inside this client, a loop periodically polls the server for updates. Any time this client exists within an ownership DAG (say, as a member variable of another struct, or as an element of a slice in another struct), requiring an explicit Close() bubbles up virally through the ownership chain: each upward link in the DAG needs a Close() of its own, all the way to the function that owns the root instance (e.g. main() has to call Close() on the root server instance, which has to implement Close() so it cleans up its background work behind itself, and so on). Something like the below:
type DbClient struct { ... }

func Cleanup(client DbClient) { ... }

type Component struct {
    client DbClient
    ...
}

func Cleanup(component Component) { ... }

type Server struct {
    component Component
    ...
}

func Cleanup(server Server) { ... }
Is there any other way to handle these cases? Or is an explicit Close() function the recommendation for such stateful components?

I guess the problem you mention is that each upwards link in the DAG will require a Close(), all the way up to the function that owns the root instance.
Go has a struct embedding feature, and Go favors composition over inheritance.
There's an important way in which embedding differs from subclassing. When we embed a type, the methods of that type become methods of the outer type, but when they are invoked the receiver of the method is the inner type, not the outer one.
package main

import "fmt"

type DbClient struct{}

func (client *DbClient) Cleanup() {
    fmt.Println("Cleanup called on client")
}

type Component struct {
    *DbClient
}

type Server struct {
    *Component
}

func main() {
    client := DbClient{}
    component := Component{&client}
    server := Server{&component}
    server.Cleanup()
}
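As for actually stopping and joining the background polling work inside DbClient, a common shape is to have Cleanup signal the goroutine and wait for it. A minimal sketch, assuming a done channel and a sync.WaitGroup (the names and polling interval are illustrative):

package main

import (
    "fmt"
    "sync"
    "time"
)

// DbClient owns a background polling goroutine until Cleanup is called.
type DbClient struct {
    done chan struct{}
    wg   sync.WaitGroup
}

func NewDbClient() *DbClient {
    c := &DbClient{done: make(chan struct{})}
    c.wg.Add(1)
    go c.pollLoop()
    return c
}

func (c *DbClient) pollLoop() {
    defer c.wg.Done()
    ticker := time.NewTicker(500 * time.Millisecond)
    defer ticker.Stop()
    for {
        select {
        case <-c.done:
            return
        case <-ticker.C:
            fmt.Println("polling server for updates")
        }
    }
}

// Cleanup signals the loop to stop and waits for it to exit.
func (c *DbClient) Cleanup() {
    close(c.done)
    c.wg.Wait()
}

func main() {
    client := NewDbClient()
    time.Sleep(1200 * time.Millisecond) // let it poll a couple of times
    client.Cleanup()
    fmt.Println("background work joined")
}

Combined with the embedding above, only the innermost type has to know how the background work is stopped; the outer types just expose the promoted Cleanup.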

Related

How can I separate generated code package and user code but have them accessible from one place in code

I am new to Go, so I bought some Udemy courses to help break me into the language. One of them I found very helpful for a general understanding as I took on a project in the language.
In the course I took, all of the SQL-related functions were in the sqlc folder, with the structure less broken out:
sqlc
generatedcode
store
One of those files is a querier that is generated by sqlc that contains an interface with all of the methods that were generated. Here is the general idea of what it currently looks like: https://github.com/techschool/simplebank/tree/master/db/sqlc
package db

import (
    "context"

    "github.com/google/uuid"
)

type Querier interface {
    AddAccountBalance(ctx context.Context, arg AddAccountBalanceParams) (Account, error)
    CreateAccount(ctx context.Context, arg CreateAccountParams) (Account, error)
    ...
}
var _ Querier = (*Queries)(nil)
Would it be possible to wrap both what sqlc generates AND any queries that a developer creates (dynamic queries) into a single querier? I'm also trying to have it so that the sqlc generated code is in its own folder. The structure I am aiming for is:
sql
sqlc
generatedcode
store - (wraps it all together)
dynamicsqlfiles
This should clear up what I mean by store: https://github.com/techschool/simplebank/blob/master/db/sqlc/store.go
package db

import (
    "context"
    "database/sql"
    "fmt"
)

// Store defines all functions to execute db queries and transactions
type Store interface {
    Querier
    TransferTx(ctx context.Context, arg TransferTxParams) (TransferTxResult, error)
}

// SQLStore provides all functions to execute SQL queries and transactions
type SQLStore struct {
    db *sql.DB
    *Queries
}

// NewStore creates a new store
func NewStore(db *sql.DB) Store {
    return &SQLStore{
        db:      db,
        Queries: New(db),
    }
}
I'm trying to run everything through that store (both generated and my functions), so I can make a call similar to the server.store.CreateUser call in this file: https://github.com/techschool/simplebank/blob/master/api/user.go
arg := db.CreateUserParams{
    Username:       req.Username,
    HashedPassword: hashedPassword,
    FullName:       req.FullName,
    Email:          req.Email,
}

user, err := server.store.CreateUser(ctx, arg)
if err != nil {
    if pqErr, ok := err.(*pq.Error); ok {
        switch pqErr.Code.Name() {
        case "unique_violation":
            ctx.JSON(http.StatusForbidden, errorResponse(err))
            return
        }
    }
    ctx.JSON(http.StatusInternalServerError, errorResponse(err))
    return
}
I've tried creating something that houses another Querier interface that embeds the generated one, then creating my own db.go that uses the generated DBTX interface but has its own Queries struct and New function. It always gives me an error that the Queries struct I created doesn't implement the functions I made, despite having implemented them in one of the custom methods I made.
I deleted that branch, and have been clicking through the simplebank project linked above to see if I can find another way this could be done, or if I missed something. If it can't be done, that's okay. I'm just using this as a good opportunity to learn a little more about the language, and keep some code separated if possible.
UPDATE:
There were only a few pieces I had to change, but I modified the store.go to look more like:
// sdb is imported, but points to the generated Querier

// Store provides all functions to execute db queries and transactions
type Store interface {
    sdb.Querier
    DynamicQuerier
}

// SQLStore provides all functions to execute SQL queries and transactions
type SQLStore struct {
    db *sql.DB
    *sdb.Queries
    *dynamicQueries
}

// NewStore creates a new Store
func NewStore(db *sql.DB) Store {
    return &SQLStore{
        db:             db,
        Queries:        sdb.New(db),
        dynamicQueries: New(db),
    }
}
Then just created a new Querier and struct for the methods I would be creating. Gave them their own New function, and tied it together in the above. Before, I was trying to figure out a way to reuse as much of the generated code as possible, which I think was the issue.
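A rough sketch of what that dynamic side can look like, living in the same package as Store (DynamicQuerier, dynamicQueries, GetSomething, and the import path are illustrative names, not the actual project code):

package db // same package as the Store above (package name assumed)

import (
    "context"

    sdb "example.com/project/db/generatedcode" // hypothetical path to the sqlc output
)

// DynamicQuerier lists the hand-written queries, mirroring the generated Querier.
type DynamicQuerier interface {
    GetSomething(ctx context.Context, id int64) (string, error)
}

type dynamicQueries struct {
    db sdb.DBTX
}

// New creates the container for the hand-written queries.
func New(db sdb.DBTX) *dynamicQueries {
    return &dynamicQueries{db: db}
}

func (q *dynamicQueries) GetSomething(ctx context.Context, id int64) (string, error) {
    // row := q.db.QueryRowContext(ctx, "SELECT ...", id)
    return "", nil // placeholder
}

// Compile-time check, in the same spirit as the generated code.
var _ DynamicQuerier = (*dynamicQueries)(nil)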
Why I wanted the Interface:
I wanted a structure that separated the files I would be working in more from the files that I would never touch (generated). This is the new structure:
I like how the generated code put everything in the Querier interface, then checked that anything implementing it satisfied all of the function requirements. So I wanted to replicate that for the dynamic portion which I would be creating on my own.
It might be complicating it a bit more than it would 'NEED' to be, but it also provides an additional set of error checking that is nice to have. And in this case, even while maybe not necessary, it ended up being doable.
Would it be possible to wrap both what sqlc generates AND any queries that a developer creates (dynamic queries) into a single querier?
If I'm understanding your question correctly I think that you are looking for something like the below (playground):
package main

import (
    "context"
    "database/sql"
)

//
// Sample sqlc-generated code
//

type DBTX interface {
    ExecContext(context.Context, string, ...interface{}) (sql.Result, error)
    PrepareContext(context.Context, string) (*sql.Stmt, error)
    QueryContext(context.Context, string, ...interface{}) (*sql.Rows, error)
    QueryRowContext(context.Context, string, ...interface{}) *sql.Row
}

type Queries struct {
    db DBTX
}

func (q *Queries) DeleteAccount(ctx context.Context, id int64) error {
    // _, err := q.db.ExecContext(ctx, deleteAccount, id)
    // return err
    return nil // Pretend that this always works
}

type Querier interface {
    DeleteAccount(ctx context.Context, id int64) error
}

//
// Your custom "dynamic" queries
//

type myDynamicQueries struct {
    db DBTX
}

func (m *myDynamicQueries) GetDynamicResult(ctx context.Context) error {
    // _, err := m.db.ExecContext(ctx, getDynamicResult)
    // return err
    return nil // Pretend that this always works
}

type myDynamicQuerier interface {
    GetDynamicResult(ctx context.Context) error
}

// Combine things
type allDatabase struct {
    *Queries // Note: You could embed this directly into myDynamicQueries instead of having a separate struct if that is your preference
    *myDynamicQueries
}

type DatabaseFunctions interface {
    Querier
    myDynamicQuerier
}

func main() {
    // Basic example
    var db DatabaseFunctions
    db = getDatabase()
    db.DeleteAccount(context.Background(), 0)
    db.GetDynamicResult(context.Background())
}

// getDatabase - Perform whatever is needed to connect to database...
func getDatabase() allDatabase {
    sqlc := &Queries{db: nil}           // In reality you would use New() to do this!
    myDyn := &myDynamicQueries{db: nil} // Again it's often cleaner to use a function
    return allDatabase{Queries: sqlc, myDynamicQueries: myDyn}
}
The above is all in one file for simplicity but could easily pull from multiple packages e.g.
type allDatabase struct {
    *generatedcode.Queries
    *store.myDynamicQueries
}
If this does not answer your question then please show one of your failed attempts (so we can see where you are going wrong).
One general comment - do you really need the interface? A common recommendation is "Accept interfaces, return structs". While this may not always apply I suspect you may be introducing interfaces where they are not really necessary and this may add unnecessary complexity.
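As a loose illustration of that guideline, reusing the Queries/DBTX types from the playground example above (accountDeleter, closeAccount, and NewQueries are made-up names):

// Consumers accept a small interface describing just what they need...
type accountDeleter interface {
    DeleteAccount(ctx context.Context, id int64) error
}

func closeAccount(ctx context.Context, db accountDeleter, id int64) error {
    return db.DeleteAccount(ctx, id)
}

// ...while constructors return the concrete struct.
func NewQueries(db DBTX) *Queries {
    return &Queries{db: db}
}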
I thought that the Store, which was housing both Queriers, was tying it all together. Can you explain a little with the example above (in the question post) why it's not necessary? How does SQLStore get access to all of the Querier interface functions?
The struct SQLStore is what is "tying it all together". As per the Go spec:
Given a struct type S and a named type T, promoted methods are included in the method set of the struct as follows:
If S contains an embedded field T, the method sets of S and *S both include promoted methods with receiver T. The method set of *S also includes promoted methods with receiver *T.
If S contains an embedded field *T, the method sets of S and *S both include promoted methods with receiver T or *T.
So an object of type SQLStore:
type SQLStore struct {
    db *sql.DB
    *sdb.Queries
    *dynamicQueries
}

var foo SQLStore // Assume that we are actually providing values for all fields
Will implement all of the methods of sdb.Queries and, also, those in dynamicQueries (you can also access the sql.DB members via foo.db.XXX). This means that you can call foo.AddAccountBalance() and foo.MyGenericQuery() (assuming that is in dynamicQueries!) etc.
The spec says "In its most basic form an interface specifies a (possibly empty) list of methods". So you can think of an interface as a list of functions that must be implemented by whatever implementation (e.g. struct) you assign to the interface (the interface itself does not implement anything directly).
This example might help you understand.
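For instance, here is a minimal self-contained sketch of a promoted method satisfying an interface (hypothetical names, not from the project):

package main

import "fmt"

type Greeter interface {
    Greet() string
}

type base struct{}

func (base) Greet() string { return "hello" }

// wrapper has no Greet of its own; the method promoted from base satisfies Greeter.
type wrapper struct {
    base
}

func main() {
    var g Greeter = wrapper{}
    fmt.Println(g.Greet()) // prints "hello"
}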
Hopefully that helps a little (as I'm not sure which aspect you don't understand I'm not really sure what to focus on).

How do I improve the testability of go library methods

I'm writing some code that uses a library called Vault. In this library we have a Client. My code makes use of this Client but I want to be able to easily test the code that uses it. I use only a couple methods from the library so I ended up creating an interface:
type VaultClient interface {
    Logical() *api.Logical
    SetToken(v string)
    NewLifetimeWatcher(i *api.LifetimeWatcherInput) (*api.LifetimeWatcher, error)
}
Now if my code is pointed at this interface, everything is easily testable, except for the Logical() method. It returns a struct here. My issue is that this Logical struct also has methods on it that allow you to Read, Write, e.g.:
func (c *Logical) Read(path string) (*Secret, error) {
    return c.ReadWithData(path, nil)
}
and these are being used in my project as well to do something like:
{{ VaultClient defined above }}.Logical().Write("something", something)
Here is the issue. The Logical returned from the call to .Logical() has a .Write and .Read method that I can't reach to mock. I don't want all the logic within those methods to run in my tests.
Ideally I'd like to be able to do something similar to what I did above and create an interface for Logical as well. I'm relatively new to Golang, but I'm struggling with the best approach here. From what I can tell that's not possible. Embedding doesn't work like inheritance so it seems like I have to return a Logical. That leaves my code unable to be tested as simply as I would like because all the logic within a Logical's methods can't be mocked.
I'm sort of at a loss here. I have scoured Google for an answer to this but nobody ever talks about this scenario. They only go as far as I went with the initial interface for the client.
Is this a common scenario? Other libraries I've used don't return structs like Logical. Instead they typically just return a bland struct that holds data and has no methods.
package usecasevaultclient

// usecase.go

type VaultClient interface {
    Logical() *api.Logical
    SetToken(v string)
    NewLifetimeWatcher(i *api.LifetimeWatcherInput) (*api.LifetimeWatcher, error)
}

type vaultClient struct {
    repo RepoVaultClient
}

// NewVaultClient creates a new injection
func NewVaultClient(repo RepoVaultClient) VaultClient {
    return &vaultClient{repo}
}

func (u *vaultClient) Logical() *api.Logical {
    // do your logic here and call the repo, e.g.
    u.repo.ReadData()
    u.repo.WriteData()
    return nil
}

func (u *vaultClient) SetToken(v string) {}

func (u *vaultClient) NewLifetimeWatcher(i *api.LifetimeWatcherInput) (*api.LifetimeWatcher, error) {
    return nil, nil
}

// interfaces.go

type RepoVaultClient interface {
    ReadData() error
    WriteData() error
}

// repo_vaultclient_mock.go

import "github.com/stretchr/testify/mock"

type MockRepoVaultClient struct {
    mock.Mock
}

func (m *MockRepoVaultClient) ReadData() error {
    args := m.Called()
    return args.Error(0)
}

func (m *MockRepoVaultClient) WriteData() error {
    args := m.Called()
    return args.Error(0)
}

// vaultClient_test.go

func TestLogicalShouldBeSuccess(t *testing.T) {
    mockRepoVaultClient := &MockRepoVaultClient{}
    useCase := NewVaultClient(mockRepoVaultClient)

    mockRepoVaultClient.On("ReadData").Return(nil)
    mockRepoVaultClient.On("WriteData").Return(nil)

    // your logic produces the actual response based on what you implemented
    response := useCase.Logical()
    assert.Equal(t, expected, response)
}
If you want to test Logical you need to mock ReadData and WriteData with testify/mock, so you can define what those methods return and compare against it after calling the injected implementation of your interface.
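For completeness, another shape that stays closer to the question's original wrapper is to have your own client type return a small interface instead of the concrete *api.Logical, and let the real *api.Logical satisfy it. A rough sketch, assuming only Read and Write are used (the LogicalAPI and Client names are made up):

package vaultwrap

import (
    "github.com/hashicorp/vault/api"
)

// LogicalAPI lists just the api.Logical methods this project uses.
type LogicalAPI interface {
    Read(path string) (*api.Secret, error)
    Write(path string, data map[string]interface{}) (*api.Secret, error)
}

// Client wraps *api.Client and exposes Logical() as an interface,
// so tests can substitute a fake LogicalAPI.
type Client struct {
    c *api.Client
}

func (w *Client) Logical() LogicalAPI {
    return w.c.Logical() // *api.Logical satisfies LogicalAPI
}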

Dependency injection with http Handler in Go

I am trying to wrap my head around dependency injection in Go, but really stuck here. Here's a (drastically simplified) app which should serve as an example:
package main

import (
    "net/http"

    "github.com/gorilla/mux"
)

func main() {
    mux := mux.NewRouter()
    mux.Handle("/", myHandler()).Methods("GET")
    http.ListenAndServe(":9000", mux)
}

type myObject interface {
    Start()
}

type Object struct {
}

func (o *Object) Start() {
    // Something wild here, for example sending out an email,
    // query an external DB or something similar..
}

func myHandler() http.Handler {
    // Inject myObject-like struct somewhere here?
    o := Object{}
    return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
        o.Start()
        w.Write([]byte("Started Object"))
    })
}
I have a problem with testing the Object struct. What I usually do is create an interface which can be used in testing by setting up a test struct. For instance, if I had a DB connection handler, in testing I can create a mock which satisfies the handler interface and pass this to the "myHandler" call as a parameter.
Unfortunately this only works if the struct is already instantiated when the "mux.Handle" call is made. I simply don't see any simple way to test the myHandler function with an Object struct which can be injected in tests, since it will be created after the handler gets called.
Any hints or ideas on how to get this done? Maybe I have to rethink my testing approach, but I would really like to unit-test the Object struct, but also test the http handler separately (as this handler may perform more tasks than just creating the Object).
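One way to approach it, sketched here under the question's own types (the stub and test names are hypothetical), is to pass the dependency into myHandler instead of constructing it inside, so a test can inject a fake:

package main

import (
    "net/http"
    "net/http/httptest"
    "testing"
)

type myObject interface {
    Start()
}

// myHandler now receives its dependency instead of creating it inside,
// so tests can inject a fake implementation of myObject.
func myHandler(o myObject) http.Handler {
    return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
        o.Start()
        w.Write([]byte("Started Object"))
    })
}

// In main, the real Object is wired in:
//	mux.Handle("/", myHandler(&Object{})).Methods("GET")

// stubObject records that Start was called.
type stubObject struct{ started bool }

func (s *stubObject) Start() { s.started = true }

func TestMyHandler(t *testing.T) {
    stub := &stubObject{}
    req := httptest.NewRequest(http.MethodGet, "/", nil)
    rec := httptest.NewRecorder()

    myHandler(stub).ServeHTTP(rec, req)

    if !stub.started {
        t.Fatal("expected Start to be called")
    }
}

The handler itself can then be tested with httptest as above, while Object gets its own unit tests.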

Capturing net.Listener passed to http.Server.Serve

I'd like to extend the http.Server functionality by performing a graceful shutdown and some other gadgets that I would share across my HTTP services. Currently my code says more or less:
type MyServer struct {
    server *http.Server
    // ...
}

func (s *MyServer) ListenAndServe() {
    // Create listener and pass to s.server.Serve()
}
This works great, but requires exposing all necessary methods and variables of http.Server manually.
Wrapping most of the methods wouldn't be a big problem, but I can't find a sensible way to expose access to http.Server.ListenAndServeTLS without copying its implementation from the source. The last line of that method is srv.Serve(tlsListener), and I'd love to provide my own Serve method so the net.Listener can be modified before it is passed to http.Server.Serve.
I started to sketch my wrapper simply as:
type MyServer struct {
    http.Server
}

func (s *MyServer) Serve(l net.Listener) {
    // Wrap l with MyListener, pass to s.Server.Serve()
}
but obviously neither ListenAndServe nor ListenAndServeTLS promoted from the embedded http.Server would start using my implementation of Serve. And I'd like to ask them to... Is there any way I can tackle the problem, or does the design of http.Server effectively prevent me from solving this?
Hacks welcome: even if I don't use them in production, I'll gain some knowledge.
The promoted http.Server.ListenAndServe* methods only ever operate on the embedded type, so they will never call your Serve override. Going the other way around does work: implement ListenAndServe and ListenAndServeTLS on the wrapper, create the listener yourself, and hand it to the embedded server's Serve:
type MyServer struct {
    http.Server
    // ...
}

func (s *MyServer) ListenAndServe() error {
    // create listener
    // s.Server.Serve(s.listener)
}

func (s *MyServer) ListenAndServeTLS() error {
    // create listener
    // s.Server.Serve(s.tlsListener)
}
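Filled in, that might look roughly like the following sketch; wrapListener stands in for whatever MyListener decoration you need, and the TLS variant builds a tls.Config from certificate files (details such as ALPN/HTTP2 setup are omitted):

package myhttp

import (
    "crypto/tls"
    "net"
    "net/http"
)

type MyServer struct {
    http.Server
}

// wrapListener is where the shared gadgets (graceful shutdown hooks,
// connection tracking, ...) would decorate the raw listener.
func (s *MyServer) wrapListener(l net.Listener) net.Listener {
    return l // placeholder: return a MyListener wrapping l
}

func (s *MyServer) ListenAndServe() error {
    addr := s.Addr
    if addr == "" {
        addr = ":http"
    }
    l, err := net.Listen("tcp", addr)
    if err != nil {
        return err
    }
    return s.Server.Serve(s.wrapListener(l))
}

func (s *MyServer) ListenAndServeTLS(certFile, keyFile string) error {
    cert, err := tls.LoadX509KeyPair(certFile, keyFile)
    if err != nil {
        return err
    }
    cfg := &tls.Config{Certificates: []tls.Certificate{cert}}

    addr := s.Addr
    if addr == "" {
        addr = ":https"
    }
    l, err := net.Listen("tcp", addr)
    if err != nil {
        return err
    }
    tlsListener := tls.NewListener(s.wrapListener(l), cfg)
    return s.Server.Serve(tlsListener)
}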

golang import struct pointer

OK, I have a main package and an HTTP handler package. Essentially what I am trying to do is set up a global struct so that I can call upon information in that struct at any time.
Basic outline of my attempted example below:
Main package imports handler function
Main package calls handlerfunc
Handlerfunc sets http.ResponseWriter and other items into UrlInfo struct
Handlerfunc runs the passed-in function (without having to pass the UrlInfo struct into it)
Run function (home in this example)
Function home can call upon the variable uinfo at any time because it's a pointer to the UrlInfo struct
Obviously this doesn't work, but this is essentially what I would like to do so that I'm not having to pass all this info into my home function. Keeping it clean and simple.
Any thoughts and ideas are welcome. Thanks.
Handler Package
package handler

import "net/http"

// UrlInfo contains the http request, response writer and other variables
type UrlInfo struct {
    Res  http.ResponseWriter
    Req  *http.Request
    Item string
}

func HandleFunc(handlepath string, runfunc func()) {
    // Set handler and set up struct
    http.HandleFunc(handlepath, func(w http.ResponseWriter, r *http.Request) {
        url := new(UrlInfo)
        url.Res = w
        url.Req = r
        url.Item = "Item information"
        runfunc()
    })
}
Main Package
import "handler"
var uinfo = &handler.UrlInfo{}
func init() {
handler.HandleFunc("/home/", home)
}
func home() {
fmt.Println(uinfo.Item)
}
From what I gather from your question, you are attempting to define a single, global instance of a structure which, among other things, holds a reference to the current Request and ResponseWriter.
If this is the intention, I should warn you this is going to cause problems.
Go's http package executes each request handler in a separate goroutine. This means that you can have arbitrarily many requests being handled simultaneously. Therefore they can not all refer to the same global structure safely and expect it to contain request information relevant only to that particular request. The global instance should not be used if you expect your server to be thread safe.
Keeping the code clean by grouping extraneous parameters in a structure can be handy, but in your case, I do not believe you can (or should) avoid passing a new instance of UrlInfo structure directly to home() as a parameter. It will make things unnecessarily complex and unpredictable.
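A minimal sketch of that parameter-passing shape, reusing the question's names (one of several reasonable signatures, not the only way to do it):

package handler

import "net/http"

type UrlInfo struct {
    Res  http.ResponseWriter
    Req  *http.Request
    Item string
}

// HandleFunc builds a fresh UrlInfo per request and hands it to the callback,
// so there is no shared global state between concurrent requests.
func HandleFunc(handlepath string, runfunc func(*UrlInfo)) {
    http.HandleFunc(handlepath, func(w http.ResponseWriter, r *http.Request) {
        runfunc(&UrlInfo{Res: w, Req: r, Item: "Item information"})
    })
}

// In the main package the handler then receives its own UrlInfo:
//
//	func home(uinfo *handler.UrlInfo) {
//		fmt.Println(uinfo.Item)
//	}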
