How can I create a map of all of the functions found in another file in Go? It's pretty easy to get their names using the following code, but I haven't found a way to get the actual functions, or to make the desired code below work.
fileSet := token.NewFileSet()
f, err := parser.ParseFile(fileSet, "funcFile.go", nil, 0)
if err != nil {
	panic(err)
}
for key := range f.Scope.Objects {
	fmt.Println(key) // prints the name of each top-level object (functions, types, vars)
}
Desired result:
funcFile.go
imports ...
func returnTrue () bool {return true}
mapFile.go
func CreateFuncMap () {
fns := make(map[string]func(bars []Types.MarketData) bool)
otherFuncs := getFuncsFromFile("funcFile.go")
for _, fn := range otherFuncs {
fns[fn.Name] = fn
}
}
Related
I know this may seem like bad design (and I wish I didn't need to do this), but I need to read a struct that is generated automatically at run time and create a new instance of it.
I only need limited metadata from the struct I create, so that it can be passed to the Gorm Gen FieldRelateModel method (from here), where 'relModel' is the new instance I am hoping to make (see below):
FieldRelateModel = func(relationship field.RelationshipType, fieldName string, relModel interface{}, config *field.RelateConfig) model.CreateFieldOpt {
st := reflect.TypeOf(relModel)
if st.Kind() == reflect.Ptr {
st = st.Elem()
}
fieldType := st.String()
if config == nil {
config = &field.RelateConfig{}
}
if config.JSONTag == "" {
config.JSONTag = ns.ColumnName("", fieldName)
}
return func(*model.Field) *model.Field {
return &model.Field{
Name: fieldName,
Type: config.RelateFieldPrefix(relationship) + fieldType,
JSONTag: config.JSONTag,
GORMTag: config.GORMTag,
NewTag: config.NewTag,
OverwriteTag: config.OverwriteTag,
Relation: field.NewRelationWithModel(relationship, fieldName, fieldType, relModel),
}
}
}
I am able to parse the struct and get its metadata using ast and reflect (see below), but I don't know how I can do the last step so that the 'structInstance' I return is valid when passed as the 'relModel' to the 'FieldRelateModel' function/option.
func StructExtract(filePath string) (any, error) {
file, err := os.Open(filePath)
if err != nil {
fmt.Println(err)
return nil, err
}
defer file.Close()
fset := token.NewFileSet()
f, err := parser.ParseFile(fset, filePath, file, parser.AllErrors)
if err != nil {
fmt.Println(err)
return nil, err
}
var structType *ast.StructType
// Iterate over the top-level declarations in the file
for _, decl := range f.Decls {
// Check if the declaration is a GenDecl (which can contain structs)
genDecl, ok := decl.(*ast.GenDecl)
if !ok {
continue
}
// Iterate over the GenDecl's specs
for _, spec := range genDecl.Specs {
// Check if the spec is a TypeSpec (which can contain structs)
typeSpec, ok := spec.(*ast.TypeSpec)
if !ok {
continue
}
// Check if the TypeSpec's type is a struct
structType, ok = typeSpec.Type.(*ast.StructType)
if !ok {
continue
}
break
}
}
if structType == nil {
	err := fmt.Errorf("no struct found in file %s", filePath)
	fmt.Println(err)
	return nil, err
}
structInstance := reflect.New(reflect.TypeOf(structType)).Elem().Interface()
return structInstance, nil
}
Here is a Go Playground link that I thought might make it easier if someone were to help; however, due to the nature of the problem it won't actually run within the playground (for various reasons).
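For reference, reflect.New only yields a usable instance when you already have the reflect.Type of a real compile-time type; an *ast.StructType merely describes source code and cannot be instantiated this way. A minimal sketch of the working case, using a hypothetical Example struct, looks like this:
package main

import (
	"fmt"
	"reflect"
)

// Example is a hypothetical struct standing in for a type that exists at compile time.
type Example struct {
	Name string
	Age  int
}

func main() {
	t := reflect.TypeOf(Example{})                // reflect.Type of a real Go type
	instance := reflect.New(t).Elem().Interface() // zero value wrapped in interface{}
	fmt.Printf("%#v\n", instance)                 // main.Example{Name:"", Age:0}
}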
I have implemented a very simple Decode method (using gob.Decoder for now). This works well for single responses, and it would even work well for slices, but I need to implement a DecodeMany method that can decode a set of individual responses (not a slice).
Working Decode method:
var v MyType
_ = Decode(&v)
...
func Decode(v interface{}) error {
buf, _ := DoSomething() // func DoSomething() ([]byte, error)
// error handling omitted for brevity
return gob.NewDecoder(bytes.NewReader(buf)).Decode(v)
}
What I'm trying to do for a DecodeMany method is to deal with a response that isn't necessarily a slice:
var vv []MyType
_ = DecodeMany(&vv)
...
func DecodeMany(vv []interface{}) error {
for _, g := range DoSomething() { // func DoSomething() []struct{Buf []bytes}
// Use g.Buf as an individual "interface{}"
// want something like:
var v interface{} /* Somehow create instance of single vv type? */
_ = gob.NewDecoder(bytes.NewReader(g.Buf)).Decode(v)
vv = append(vv, v)
}
return
}
Besides not compiling, the above also produces this error:
cannot use &vv (value of type *[]MyType) as type []interface{} in argument to DecodeMany
If you want to modify the passed slice, it must be a pointer, else you must return a new slice. Also if the function is declared to have a param of type []interface{}, you can only pass a value of type []interface{} and no other slice types... Unless you use generics...
This is a perfect example to start using generics introduced in Go 1.18.
Change DecodeMany() to be generic, with a type parameter T for the slice element type:
When taking a pointer
func DecodeMany[T any](vv *[]T) error {
for _, g := range DoSomething() {
var v T
if err := gob.NewDecoder(bytes.NewReader(g.Buf)).Decode(&v); err != nil {
return err
}
*vv = append(*vv, v)
}
return nil
}
Here's a simple app to test it:
type MyType struct {
S int64
}
func main() {
var vv []MyType
if err := DecodeMany(&vv); err != nil {
panic(err)
}
fmt.Println(vv)
}
func DoSomething() (result []struct{ Buf []byte }) {
for i := 3; i < 6; i++ {
buf := &bytes.Buffer{}
v := MyType{S: int64(i)}
if err := gob.NewEncoder(buf).Encode(v); err != nil {
panic(err)
}
result = append(result, struct{ Buf []byte }{buf.Bytes()})
}
return
}
This outputs (try it on the Go Playground):
[{3} {4} {5}]
When returning a slice
If you choose to return the slice, you don't have to pass anything, but you need to assign the result:
func DecodeMany[T any]() ([]T, error) {
var result []T
for _, g := range DoSomething() {
var v T
if err := gob.NewDecoder(bytes.NewReader(g.Buf)).Decode(&v); err != nil {
return result, err
}
result = append(result, v)
}
return result, nil
}
Using it:
vv, err := DecodeMany[MyType]()
if err != nil {
panic(err)
}
fmt.Println(vv)
Try this one on the Go Playground.
I have a function that is called from another package. This function parses a text string received as an argument and returns a slice of structs. In my case there are 7 such structs, and they all differ in some fields. I'm trying to return an interface{} type, but I'd like to be able to assert the type on the receiving end, as I need to do other operations on that struct.
So far I've got to this point (the functions are meant to be in different packages):
func Process(input string) interface{} {
	// ...
	switch model {
	case 1:
		output := model1.Parse(input) // this function returns a []MOD1 type
		return output
	case 2:
		output := model2.Parse(input) // this function returns a []MOD2 type
		return output
	// other cases with other models
	}
}
func main() {
// ...
output := package.Process(input) // now output is of type interface{}
// I need the slice because I'll publish each element to a PubSub topic
for _, doc := range output {
obj, err := json.Marshal(doc)
if err != nil {
// err handling
}
err = publisher.Push(obj)
if err != nil {
// err handling
}
}
}
Now in main() I'd like output to be of type []MOD1, []MOD2, ..., or []MOD7.
I've tried to use switch t := output.(type), but t exists only in the scope of the switch case, and it doesn't really solve my problem.
func main() {
output := package.Process(input)
switch t := output.(type) {
case []MOD1:
output = t // output is still of type interface{}
case []MOD2:
output = t
}
}
I need this because in the main func I have to do other operations on that structure and eventually marshal it into JSON. I've thought of marshalling the structs in the Process func and returning the []byte, but then I'd need to unmarshal in main without knowing the struct's type.
Change Process to return a []interface{}:
func Process(input string) []interface{} {
	// ...
	switch model {
	case 1:
		output := model1.Parse(input)
		result := make([]interface{}, len(output))
		for i := range result {
			result[i] = output[i]
		}
		return result
	case 2:
		// other cases with other models
	}
}
Work with the []interface{} in main:
output := package.Process(input)
for _, doc := range output {
obj, err := json.Marshal(doc)
if err != nil {
// err handling
}
err = publisher.Push(obj)
if err != nil {
// err handling
}
}
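For completeness: if Process keeps returning per-model slices instead of []interface{}, the type switch from the question does work, but the per-type handling has to happen inside each case. A sketch reusing the question's MOD1/MOD2 and publisher names:
switch t := output.(type) {
case []MOD1:
	// t has the concrete type []MOD1 only inside this case,
	// so any per-type work (marshalling, publishing) goes here.
	for _, doc := range t {
		obj, err := json.Marshal(doc)
		if err != nil {
			// err handling
		}
		if err := publisher.Push(obj); err != nil {
			// err handling
		}
	}
case []MOD2:
	// repeat the same pattern for []MOD2 ... []MOD7
}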
Assuming we have two YAML files:
master.yaml
someProperty: "someValue"
anotherProperty: "anotherValue"
override.yaml
someProperty: "overriddenValue"
Is it possible to unmarshal, merge, and then write those changes to a file without having to define a struct for every property in the YAML file?
The master file has over 500 properties that are not at all important to the service at this point of execution, so ideally I'd be able to just unmarshal into a map, do a merge, and write it out as YAML again, but I'm relatively new to Go and wanted some opinions.
I've got some code to read the YAML into an interface, but I'm unsure of the best approach to then merge the two.
var masterYaml interface{}
yamlBytes, _ := ioutil.ReadFile("master.yaml")
yaml.Unmarshal(yamlBytes, &masterYaml)
var overrideYaml interface{}
yamlBytes, _ = ioutil.ReadFile("override.yaml")
yaml.Unmarshal(yamlBytes, &overrideYaml)
I've looked into libraries like mergo, but I'm not sure if that's the right approach.
I'm hoping that after the merge I would be able to write out a file with the properties:
someProperty: "overriddenValue"
anotherProperty: "anotherValue"
Assuming that you just want to merge at the top level, you can unmarshal into maps of type map[string]interface{}, as follows:
package main
import (
"io/ioutil"
"gopkg.in/yaml.v2"
)
func main() {
var master map[string]interface{}
bs, err := ioutil.ReadFile("master.yaml")
if err != nil {
panic(err)
}
if err := yaml.Unmarshal(bs, &master); err != nil {
panic(err)
}
var override map[string]interface{}
bs, err = ioutil.ReadFile("override.yaml")
if err != nil {
panic(err)
}
if err := yaml.Unmarshal(bs, &override); err != nil {
panic(err)
}
for k, v := range override {
master[k] = v
}
bs, err = yaml.Marshal(master)
if err != nil {
panic(err)
}
if err := ioutil.WriteFile("merged.yaml", bs, 0644); err != nil {
panic(err)
}
}
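With the two sample files from the question, merged.yaml should then contain something like the following (yaml.v2 writes map keys in sorted order, and plain strings are typically emitted without quotes):
anotherProperty: anotherValue
someProperty: overriddenValue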
For a broader solution (with n input files), you can use this function. I based it on @robox's answer:
func ReadValues(filenames ...string) (string, error) {
if len(filenames) <= 0 {
return "", errors.New("You must provide at least one filename for reading Values")
}
var resultValues map[string]interface{}
for _, filename := range filenames {
var override map[string]interface{}
bs, err := ioutil.ReadFile(filename)
if err != nil {
log.Info(err)
continue
}
if err := yaml.Unmarshal(bs, &override); err != nil {
log.Info(err)
continue
}
//check if is nil. This will only happen for the first filename
if resultValues == nil {
resultValues = override
} else {
for k, v := range override {
resultValues[k] = v
}
}
}
bs, err := yaml.Marshal(resultValues)
if err != nil {
log.Info(err)
return "", err
}
return string(bs), nil
}
So for this example you should call it in this order:
result, _ := ReadValues("master.yaml", "override.yaml")
In case you have an extra file newFile.yaml, you could also use:
result, _ := ReadValues("master.yaml", "override.yaml", "newFile.yaml")
DEEP MERGE TWO YAML FILES
package main
import (
"fmt"
"io/ioutil"
"sigs.k8s.io/yaml"
)
func main() {
	// declare two maps to hold the yaml content
	base := map[string]interface{}{}
	currentMap := map[string]interface{}{}
	// read one yaml file
	data, _ := ioutil.ReadFile("conf.yaml")
	if err := yaml.Unmarshal(data, &base); err != nil {
		panic(err)
	}
	// read another yaml file
	data1, _ := ioutil.ReadFile("conf1.yaml")
	if err := yaml.Unmarshal(data1, &currentMap); err != nil {
		panic(err)
	}
	// merge both yaml data recursively
	base = mergeMaps(base, currentMap)
	// print merged map
	fmt.Println(base)
}
func mergeMaps(a, b map[string]interface{}) map[string]interface{} {
out := make(map[string]interface{}, len(a))
for k, v := range a {
out[k] = v
}
for k, v := range b {
if v, ok := v.(map[string]interface{}); ok {
if bv, ok := out[k]; ok {
if bv, ok := bv.(map[string]interface{}); ok {
out[k] = mergeMaps(bv, v)
continue
}
}
}
out[k] = v
}
return out
}
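To illustrate the recursive behaviour, here is a small sketch (the keys are made up for the example) showing that nested maps are merged key by key rather than replaced wholesale:
a := map[string]interface{}{
	"db": map[string]interface{}{"host": "localhost", "port": 5432},
}
b := map[string]interface{}{
	"db": map[string]interface{}{"host": "prod.example.com"},
}
// "host" comes from b, while "port" from a is preserved.
fmt.Println(mergeMaps(a, b)) // map[db:map[host:prod.example.com port:5432]]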
I'm trying to list all function calls in a function using the ast package, but I'm having trouble understanding how it is supposed to be used. I have been able to get this far:
set := token.NewFileSet()
packs, err := parser.ParseFile(set, serviceFile, nil, 0)
if err != nil {
fmt.Println("Failed to parse package:", err)
os.Exit(1)
}
funcs := []*ast.FuncDecl{}
for _, d := range packs.Decls {
if fn, isFn := d.(*ast.FuncDecl); isFn {
funcs = append(funcs, fn)
}
}
I have inspected funcs and can get as far as funcs[n1].Body.List[n2]. But after this I don't understand how I'm supposed to read the underlying data.X.Fun.data.Sel.name (got it from evaluation in Gogland) to get the name of the function being called.
You can use ast.Inspect for that and switch on the type of the node:
for _, fun := range funcs {
ast.Inspect(fun, func(node ast.Node) bool {
switch n := node.(type) {
case *ast.CallExpr:
fmt.Println(n) // prints every func call expression
}
return true
})
}
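If you only need the called function's name rather than the whole expression, a small extension of this callback (a sketch assuming calls are either plain identifiers or selector expressions) could look like this:
ast.Inspect(fun, func(node ast.Node) bool {
	call, ok := node.(*ast.CallExpr)
	if !ok {
		return true
	}
	switch callee := call.Fun.(type) {
	case *ast.Ident: // e.g. doSomething()
		fmt.Println(callee.Name)
	case *ast.SelectorExpr: // e.g. fmt.Println(...)
		fmt.Println(callee.Sel.Name)
	}
	return true
})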
OK, so what I found is that you have to do a lot of type assertions (casting) to actually extract the data.
Here is an example of how to extract the function calls in a function:
for _, function := range funcs {
	extractFuncCallInFunc(function.Body.List)
}

func extractFuncCallInFunc(stmts []ast.Stmt) {
	for _, stmt := range stmts {
		if exprStmt, ok := stmt.(*ast.ExprStmt); ok {
			if call, ok := exprStmt.X.(*ast.CallExpr); ok {
				if fun, ok := call.Fun.(*ast.SelectorExpr); ok {
					funcName := fun.Sel.Name
					fmt.Println(funcName) // name of the called function
				}
			}
		}
	}
}
I also found this site, which kind of helps with figuring out what you need to cast things to:
http://goast.yuroyoro.net/