Is it possible to have my function definition below accept any type of struct?
I've tried to refactor like so:
// This method should accept any type of struct
// Once I receive my response from the database,
// I scan the rows to create a slice of type struct.
func generateResponse(rows *sqlx.Rows, structSlice []struct{}, structBody struct{}) ([]struct{}, error) {
    for rows.Next() {
        err := rows.StructScan(&structBody)
        if err != nil {
            return nil, err
        }
        structSlice = append(structSlice, structBody)
    }
    err := rows.Err()
    if err != nil {
        return nil, err
    }
    return structSlice, nil
}
Assume my struct is of type OrderRevenue.
When I call the function above:
structSlice, err := generateResponse(rows, []OrderRevenue{}, OrderRevenue{})
The error I get is:
cannot use []OrderRevenue literal as type []struct{} in argument...
Am I going about this the wrong way?
This is a cornerstone (or, depending on your view, a limitation) of Go's type system: struct{} is an unnamed empty struct type, which is different from struct{ field1 int } and is certainly not the same type as OrderRevenue.
Go emphasizes abstraction through interfaces, and perhaps you should try that. Here is the first take:
type OrderRevenue interface {
    MarshalMyself() ([]byte, error)
}

type Anonymous struct{}

func (a Anonymous) MarshalMyself() ([]byte, error) {
    // implementation's up to you
    return []byte{}, nil
}
// the function signature
func generateResponse(rows *sqlx.Rows, structSlice []OrderRevenue, structBody OrderRevenue) ([]OrderRevenue, error) {
    // ...
}
In this case you can also use the empty interface interface{}, which every type satisfies, but then you'll have to walk the structure and do manual type assertions yourself. The best approach in Go is to know the shape of your data in advance, at least partially.
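If you do go the interface{} route, here is a minimal reflection-based sketch (my own, not from the original post) that accepts a pointer to the destination slice and appends a freshly scanned element per row; generateAnyResponse and its error message are illustrative names:

// generateAnyResponse is a hypothetical helper: dest must be a pointer to a
// slice of structs, e.g. &[]OrderRevenue{}. It needs the errors, reflect and
// github.com/jmoiron/sqlx imports.
func generateAnyResponse(rows *sqlx.Rows, dest interface{}) error {
    slicePtr := reflect.ValueOf(dest)
    if slicePtr.Kind() != reflect.Ptr || slicePtr.Elem().Kind() != reflect.Slice {
        return errors.New("dest must be a pointer to a slice")
    }
    sliceVal := slicePtr.Elem()
    elemType := sliceVal.Type().Elem()

    for rows.Next() {
        elem := reflect.New(elemType) // e.g. a new *OrderRevenue
        if err := rows.StructScan(elem.Interface()); err != nil {
            return err
        }
        sliceVal.Set(reflect.Append(sliceVal, elem.Elem()))
    }
    return rows.Err()
}

You would call it as generateAnyResponse(rows, &orders) with var orders []OrderRevenue. If memory serves, sqlx itself ships a similar helper, sqlx.StructScan(rows, &orders), which scans all remaining rows into a destination slice.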
package main

import (
    "encoding/json"
    "fmt"
    "log"
    "strings"
)

type Animal int

const (
    Unknown Animal = iota
    Gopher
    Zebra
)

func (a *Animal) UnmarshalJSON(b []byte) error {
    var s string
    if err := json.Unmarshal(b, &s); err != nil {
        return err
    }
    switch strings.ToLower(s) {
    default:
        *a = Unknown
    case "gopher":
        *a = Gopher
    case "zebra":
        *a = Zebra
    }
    return nil
}

func (a Animal) MarshalJSON() ([]byte, error) {
    var s string
    switch a {
    default:
        s = "unknown"
    case Gopher:
        s = "gopher"
    case Zebra:
        s = "zebra"
    }
    return json.Marshal(s)
}

func main() {
    blob := `["gopher","armadillo","zebra","unknown","gopher","bee","gopher","zebra"]`
    var zoo []Animal
    if err := json.Unmarshal([]byte(blob), &zoo); err != nil {
        log.Fatal(err)
    }

    census := make(map[Animal]int)
    for _, animal := range zoo {
        census[animal] += 1
    }

    fmt.Printf("Zoo Census:\n* Gophers: %d\n* Zebras: %d\n* Unknown: %d\n",
        census[Gopher], census[Zebra], census[Unknown])
}
This is the code snippet of the custom JSON marshalling example in the Go documentation. My question is: where is the call to the MarshalJSON and UnmarshalJSON methods in this code? Are these methods somehow overriding the json package's UnmarshalJSON and MarshalJSON methods? I thought Go does not support method overriding this way. Please help, I am not able to understand what is happening in this code.
The documentation says:
To unmarshal JSON into a value implementing the Unmarshaler interface, Unmarshal calls that value's UnmarshalJSON method, including when the input is a JSON null.
Somewhere in the json.Unmarshal implementation, there's code similar this:
u, ok := v.(Unmarshaler)
if ok {
    err := u.UnmarshalJSON(data)
    if err != nil { /* handle error */ }
} else {
    // handle other kinds of values
}
The code uses a type assertion to determine if the value satisfies the json.Unmarshaler interface. If the value does satisfy the method, the value's UnmarshalJSON function is called.
The (*Animal).UnmarshalJSON function is called because *Animal satisfies the json.Unmarshaler interface.
This is an example of implementing an interface from a different package.
There's no need to explicitly declare that you're implementing an interface in Go, like there is in Java or C++ for example. You just have to implement all the functions it declares. In this case, you're implementing the Unmarshaler interface declared in the json package which is used by the Unmarshal function.
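For reference, json.Unmarshaler and json.Marshaler are one-method interfaces, and a common (optional) idiom is a compile-time assertion that your type satisfies them; the blank-identifier declarations below are my addition, not part of the original example:

// json.Unmarshaler is declared in encoding/json as:
//
//     type Unmarshaler interface {
//         UnmarshalJSON([]byte) error
//     }

// These lines fail to compile if the method sets ever stop matching.
var _ json.Unmarshaler = (*Animal)(nil) // *Animal has UnmarshalJSON (pointer receiver)
var _ json.Marshaler = Animal(0)        // Animal has MarshalJSON (value receiver)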
Just started learning generics. I'm making a command processor and I honestly don't know how to word this so I'm just going to show an example problem:
var ErrInvalidCommand = errors.New("invalid command")

type TransactionalFn[T any] func(ctx context.Context, db T) error

func NewTransactionalCommand[T any](fn TransactionalFn[T]) *TransactionalCommand[T] {
    return &TransactionalCommand[T]{
        fn: fn,
    }
}

type TransactionalCommand[T any] struct {
    fn TransactionalFn[T]
}

func (cmd *TransactionalCommand[T]) StartTransaction() error {
    return nil
}

func (cmd *TransactionalCommand[T]) Commit() error {
    return nil
}

func (cmd *TransactionalCommand[T]) Rollback() error {
    return nil
}

type CMD interface{}

type CommandManager struct{}

func (m *CommandManager) Handle(ctx context.Context, cmd CMD) error {
    switch t := cmd.(type) {
    case *TransactionalCommand[any]:
        return m.handleTransactionalCommand(ctx, t)
    default:
        fmt.Printf("%T\n", cmd)
        return ErrInvalidCommand
    }
}

func (m *CommandManager) handleTransactionalCommand(ctx context.Context, cmd *TransactionalCommand[any]) error {
    if err := cmd.StartTransaction(); err != nil {
        return err
    }
    if err := cmd.fn(ctx, nil); err != nil {
        if rbErr := cmd.Rollback(); rbErr != nil {
            return rbErr
        }
        return err
    }
    if err := cmd.Commit(); err != nil {
        return err
    }
    return nil
}

// tests

type db struct{}

func (*db) Do() {
    fmt.Println("doing stuff")
}

func TestCMD(t *testing.T) {
    ctx := context.Background()

    fn := func(ctx context.Context, db *db) error {
        fmt.Println("test cmd")
        db.Do()
        return nil
    }

    tFn := bus.NewTransactionalCommand(fn)
    mng := &bus.CommandManager{}

    err := mng.Handle(ctx, tFn)
    if err != nil {
        t.Fatal(err)
    }
}
mng.Handle returns ErrInvalidCommand, so the test fails because cmd is *TransactionalCommand[*db] and not *TransactionalCommand[any].
Let me give another, more abstract example:
type A[T any] struct{}

func (*A[T]) DoA() { fmt.Println("do A") }

type B[T any] struct{}

func (*B[T]) DoB() { fmt.Println("do B") }

func Handle(s interface{}) {
    switch x := s.(type) {
    case *A[any]:
        x.DoA()
    case *B[any]:
        x.DoB()
    default:
        fmt.Printf("%T\n", s)
    }
}

func TestFuncSwitch(t *testing.T) {
    i := &A[int]{}
    Handle(i) // expected to print "do A"
}
Why doesn't this switch statement case *A[any] match *A[int]?
How to make CommandManager.Handle(...) accept generic Commands?
Why does the generic type switch fail to compile?
This is in fact the result of an intentional decision by the Go team. It turned out that allowing type switches on parameterized types can cause confusion:
In an earlier version of this design, we permitted using type assertions and type switches on variables whose type was a type parameter, or whose type was based on a type parameter. We removed this facility because it is always possible to convert a value of any type to the empty interface type, and then use a type assertion or type switch on that. Also, it was sometimes confusing that in a constraint with a type set that uses approximation elements, a type assertion or type switch would use the actual type argument, not the underlying type of the type argument (the difference is explained in the section on identifying the matched predeclared type)
From the Type Parameters Proposal
Let me turn the emphasized statement into code. If the type constraint uses type approximation (note the tildes)...
func PrintStringOrInt[T ~string | ~int](v T)
...and if there also was a custom type with int as the underlying type...
type Seconds int
...and if PrintStringOrInt() is called with a Seconds argument...
PrintStringOrInt(Seconds(42))
...then the switch block would not enter the int case but go right into the default case, because Seconds is not an int. Developers might expect that case int: matches the type Seconds as well.
To allow a case statement to match both Seconds and int would require a new syntax, like, for example,
case ~int:
As of this writing, the discussion is still open, and it may result in an entirely new option for switching on a type parameter (such as switch type T).
For more details, see the proposal: spec: generics: type switch on parametric types
Trick: convert the type into 'any'
Luckily, we do not need to wait for this proposal to get implemented in a future release. There is a super simple workaround available right now.
Instead of switching on v.(type), switch on any(v).(type).
switch any(v).(type) {
...
This trick converts v into an empty interface{} (a.k.a. any), for which the switch happily does the type matching.
Source: A tip and a trick when working with generics
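Putting the quoted behaviour and the trick together, here is a small sketch; the function body is mine, only the signature comes from the example above:

func PrintStringOrInt[T ~string | ~int](v T) {
    // switch v.(type) would not compile: v is not an interface value.
    switch any(v).(type) {
    case string:
        fmt.Println("plain string:", v)
    case int:
        fmt.Println("plain int:", v)
    default:
        // PrintStringOrInt(Seconds(42)) lands here: the dynamic type of
        // any(v) is Seconds, not int, and case ~int: is not valid syntax
        // in a type switch today.
        fmt.Printf("some other type: %T\n", v)
    }
}

PrintStringOrInt("hi") and PrintStringOrInt(42) hit the first two cases, while PrintStringOrInt(Seconds(42)) falls through to default, which is exactly the behaviour the proposal text warns about.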
*A[any] does not match *A[int] because any is a static type, not a wildcard. Therefore instantiating a generic struct with different types yields different types.
In order to correctly match a generic struct in a type switch, you must instantiate it with a type parameter:
func Handle[T any](s interface{}) {
    switch x := s.(type) {
    case *A[T]:
        x.DoA()
    case *B[T]:
        x.DoB()
    default:
        panic("no match")
    }
}
Though in the absence of other function arguments from which to infer T, you will have to call Handle with explicit instantiation; T won't be inferred from the struct argument alone.
func main() {
    i := &A[int]{}
    Handle[int](i) // expected to print "do A"
}
Playground: https://go.dev/play/p/2e5E9LSWPmk
However, when Handle is a method, as in your command-manager code, this approach has the drawback that the type parameter must be chosen when instantiating the receiver, since methods cannot declare their own type parameters.
In order to improve the code here you can make Handle a top-level function:
func Handle[T any](ctx context.Context, cmd CMD) error {
    switch t := cmd.(type) {
    case *TransactionalCommand[T]:
        return handleTransactionalCommand(ctx, t)
    default:
        fmt.Printf("%T\n", cmd)
        return ErrInvalidCommand
    }
}
Then you have the problem of how to supply the argument db T to the command function. For this, you might:
simply pass an additional *db argument to Handle and handleTransactionalCommand, which also helps with type parameter inference. Call as Handle(ctx, &db{}, tFn). Playground: https://go.dev/play/p/6WESb86KN5D
pass an instance of CommandManager (like solution above but *db is wrapped). Much more verbose, as it requires explicit instantiation everywhere. Playground: https://go.dev/play/p/SpXczsUM5aW
use a parametrized interface instead (like below). So you don't even have to type-switch. Playground: https://go.dev/play/p/EgULEIL6AV5
type CMD[T any] interface {
    Exec(ctx context.Context, db T) error
}
I'm trying to build a generic function which will parse input (in JSON) into a specified structure. The structure may vary at run-time, based on parameters which are passed to the function. I'm currently trying to achieve this by passing an object of the right type and using reflect.New() to create a new output object of the same type.
I'm then parsing the JSON into this object, and scanning the fields.
If I create the object and specify the type in code, everything works. If I pass an object and try to create a replica, I get an "invalid indirect" error a few steps down (see code).
import (
    "encoding/json"
    "fmt"
    "reflect"
    "strings"
)

type Test struct {
    FirstName *string `json:"FirstName"`
    LastName  *string `json:"LastName"`
}

func genericParser(incomingData *strings.Reader, inputStructure interface{}) (interface{}, error) {
    //******* Use the line below and things work *******
    //parsedInput := new(Test)
    //******* Use vvv the line below and things don't work *******
    parsedInput := reflect.New(reflect.TypeOf(inputStructure))

    decoder := json.NewDecoder(incomingData)
    err := decoder.Decode(&parsedInput)
    if err != nil {
        //parsing error
        return nil, err
    }

    //******* This is the line that generates the error "invalid indirect of parsedInput (type reflect.Value)" *******
    contentValues := reflect.ValueOf(*parsedInput)
    for i := 0; i < contentValues.NumField(); i++ {
        //do stuff with each field
        fmt.Printf("Field name was: %s\n", reflect.TypeOf(parsedInput).Elem().Field(i).Name)
    }
    return parsedInput, nil
}

func main() {
    inputData := strings.NewReader("{\"FirstName\":\"John\", \"LastName\":\"Smith\"}")
    exampleObject := new(Test)

    processedData, err := genericParser(inputData, exampleObject)
    if err != nil {
        fmt.Println("Parsing error")
    } else {
        fmt.Printf("Success: %v", processedData)
    }
}
If I can't create a replica of the object, then a way of updating / returning the one supplied would be feasible. The key thing is that this function must be completely agnostic to the different structures available.
reflect.New isn't a direct analog of new: it can't return a specific type, it can only return a reflect.Value. This means that you are attempting to unmarshal into a *reflect.Value, which obviously isn't going to work (and even if it did, your code would have passed in a **Type, which isn't what you want either).
Use parsedInput.Interface() to get the underlying value after creating the new value to unmarshal into. You then don't need to reflect on the same value a second time, as that would be a reflect.Value of a reflect.Value, which again isn't going to do anything useful.
Finally, you need to use parsedInput.Interface() before you return, otherwise you are returning the reflect.Value rather than the value of the input type.
For example:
func genericParser(incomingData io.Reader, inputStructure interface{}) (interface{}, error) {
    parsedInput := reflect.New(reflect.TypeOf(inputStructure).Elem())

    decoder := json.NewDecoder(incomingData)
    err := decoder.Decode(parsedInput.Interface())
    if err != nil {
        return nil, err
    }

    for i := 0; i < parsedInput.Elem().NumField(); i++ {
        fmt.Printf("Field name was: %s\n", parsedInput.Type().Elem().Field(i).Name)
    }
    return parsedInput.Interface(), nil
}
https://play.golang.org/p/CzDrj6sgQNt
I'm developing a tool that can be implemented to simplify the process of creating simple CRUD operations/endpoints. Since my endpoints don't know what kind of struct they'll be receiving, I've created an interface that users can implement, and return an empty object to be filled.
type ItemFactory interface {
    GenerateEmptyItem() interface{}
}
And the users would implement something like:
type Test struct {
    TestString string `json:"testString"`
    TestInt    int    `json:"testInt"`
    TestBool   bool   `json:"testBool"`
}

func (t Test) GenerateEmptyItem() interface{} {
    return Test{}
}
When the Test object gets created, its type is "Test", even though the func returned an interface{}. However, as soon as I try to unmarshal some json of the same format into it, it strips it of its type, and becomes of type "map[string]interface {}".
item := h.ItemFactory.GenerateEmptyItem()
//Prints "Test"
fmt.Printf("%T\n", item)
fmt.Println(reflect.TypeOf(item))
err := ConvertRequestBodyIntoObject(r, &item)
if err != nil {...}
//Prints "map[string]interface {}"
fmt.Printf("%T\n", item)
The func that unmarshals the item:
func ConvertRequestBodyIntoObject(request *http.Request, object interface{}) error {
    body, err := ioutil.ReadAll(request.Body)
    if err != nil {
        return err
    }

    // Unmarshal body into request object
    err = json.Unmarshal(body, object)
    if err != nil {
        return err
    }
    return nil
}
Any suggestions as to why this happens, or how I can work around it?
Thanks
Your question lacks an example showing this behavior so I'm just guessing this is what is happening.
func Generate() interface{} {
    return Test{}
}

func GeneratePointer() interface{} {
    return &Test{}
}

func main() {
    vi := Generate()
    json.Unmarshal([]byte(`{}`), &vi)
    fmt.Printf("Generate Type: %T\n", vi)

    vp := GeneratePointer()
    json.Unmarshal([]byte(`{}`), vp)
    fmt.Printf("GenerateP Type: %T\n", vp)
}
Which outputs:
Generate Type: map[string]interface {}
GenerateP Type: *main.Test
I suggest you return a pointer from GenerateEmptyItem() instead of the actual struct value, as demonstrated by GeneratePointer() in the example above.
Playground Example
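Applied to the factory interface from the first snippet, that change would look something like this (a sketch, keeping the original type names):

func (t Test) GenerateEmptyItem() interface{} {
    // Returning a pointer keeps the concrete type (*Test) inside the
    // interface value, so json.Unmarshal decodes into the existing Test
    // instead of replacing it with a map[string]interface{}.
    return &Test{}
}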
In order to determine whether a given type implements an interface using the reflect package, you need to pass a reflect.Type to reflect.Type.Implements(). How do you get one of those types?
As an example, trying to get the type of an uninitialized error (interface) variable does not work (it panics when you try to call Kind() on it):
var err error
fmt.Printf("%#v\n", reflect.TypeOf(err).Kind())
Do it like this:
var err error
t := reflect.TypeOf(&err).Elem()
Or in one line:
t := reflect.TypeOf((*error)(nil)).Elem()
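With that reflect.Type in hand you can call Implements on any concrete type; for example (assuming fmt, os and reflect are imported, and using *os.PathError, which satisfies error):

errType := reflect.TypeOf((*error)(nil)).Elem()

// *os.PathError has an Error() string method, so this prints true.
fmt.Println(reflect.TypeOf(&os.PathError{}).Implements(errType))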
Evan Shaw's response is correct, but brief. Some more details from the reflect.TypeOf documentation:
// As interface types are only used for static typing, a common idiom to find
// the reflection Type for an interface type Foo is to use a *Foo value.
writerType := reflect.TypeOf((*io.Writer)(nil)).Elem()
fileType := reflect.TypeOf((*os.File)(nil)).Elem()
fmt.Println(fileType.Implements(writerType))
For googlers out there: I just ran into the dreaded scannable dest type interface {} with >1 columns (XX) in result error.
Evan Shaw's answer did not work for me. Here is how I solved it. I am also using the lann/squirrel library, but you could easily take that out.
The solution really isn't that complicated, just knowing the magic combination of reflect calls to make.
The me.GetSqlx() function just returns an instance to *sqlx.DB
func (me *CommonRepo) Get(query sq.SelectBuilder, dest interface{}) error {
    sqlst, args, err := query.ToSql()
    if err != nil {
        return err
    }

    // Do some reflection magic so that Sqlx doesn't hork on interface{}
    v := reflect.ValueOf(dest)
    return me.GetSqlx().Get(v.Interface(), sqlst, args...)
}

func (me *CommonRepo) Select(query sq.SelectBuilder, dest interface{}) error {
    sqlst, args, err := query.ToSql()
    if err != nil {
        return err
    }

    // Do some reflection magic so that Sqlx doesn't hork on interface{}
    v := reflect.ValueOf(dest)
    return me.GetSqlx().Select(v.Interface(), sqlst, args...)
}
Then to invoke it you can do:
func (me *myCustomerRepo) Get(query sq.SelectBuilder) (rec Customer, err error) {
    err = me.CommonRepo.Get(query, &rec)
    return
}

func (me *myCustomerRepo) Select(query sq.SelectBuilder) (recs []Customer, err error) {
    err = me.CommonRepo.Select(query, &recs)
    return
}