Does the testing package support snapshot testing?
Here is my case:
package main
import (
"bytes"
"fmt"
"html/template"
)
func main() {
query := `
INSERT INTO "ADGROUP_PERFORMANCE_REPORT" (
{{.columnPrefix}}_adgroup_id,
{{.columnPrefix}}_adgroup_nme,
{{.columnPrefix}}_adgroup_status,
{{.columnPrefix}}_campaign_id,
{{.columnPrefix}}_campaign_nme,
{{.columnPrefix}}_campaign_status,
{{.columnPrefix}}_clicks,
{{.columnPrefix}}_impressions,
{{.columnPrefix}}_ctr,
{{.columnPrefix}}_average_cpc,
{{.columnPrefix}}_cost,
{{.columnPrefix}}_conversions,
{{.columnPrefix}}_average_position,
{{.columnPrefix}}_device,
google_adwords_client_customer_id
) VALUES
`
vars := make(map[string]interface{})
vars["columnPrefix"] = "adgroup_performance_report"
result := processString(query, vars)
fmt.Printf("result=%s\n", result)
}
func process(t *template.Template, vars interface{}) string {
var tmplBytes bytes.Buffer
err := t.Execute(&tmplBytes, vars)
if err != nil {
panic(err)
}
return tmplBytes.String()
}
func processString(str string, vars interface{}) string {
tmpl, err := template.New("tmpl").Parse(str)
if err != nil {
panic(err)
}
return process(tmpl, vars)
}
Now I am going to write a unit test for it. I would like to use snapshot testing to test the structure of the SQL query string processed by the html/template package.
Here is the output on stdout:
result=
INSERT INTO "ADGROUP_PERFORMANCE_REPORT" (
adgroup_performance_report_adgroup_id,
adgroup_performance_report_adgroup_nme,
adgroup_performance_report_adgroup_status,
adgroup_performance_report_campaign_id,
adgroup_performance_report_campaign_nme,
adgroup_performance_report_campaign_status,
adgroup_performance_report_clicks,
adgroup_performance_report_impressions,
adgroup_performance_report_ctr,
adgroup_performance_report_average_cpc,
adgroup_performance_report_cost,
adgroup_performance_report_conversions,
adgroup_performance_report_average_position,
adgroup_performance_report_device,
google_adwords_client_customer_id
) VALUES
I don't want to duplicate this expected value in the unit test file and assert against it. I would prefer snapshot testing that generates a snapshot file, something like Jest's snapshot testing.
As far as I know, the testing package does not support something like this out of the box. There is a pattern for Go that you can utilise called "golden file testing". The convention is to store test data in a testdata folder alongside your test. In this case you would store the rendered template in a so-called "golden file". The test itself provides an update flag to write out the latest version (so that you don't have to maintain the expected output by hand):
var update = flag.Bool("update", false, "update .golden files")

func TestProcessString(t *testing.T) {
	vars := make(map[string]interface{})
	vars["columnPrefix"] = "adgroup_performance_report"
	// query is assumed to be a package-level variable holding the template string
	actual := processString(query, vars)
	golden := filepath.Join("testdata", "performance_report.golden")
	if *update {
		if err := ioutil.WriteFile(golden, []byte(actual), 0644); err != nil {
			t.Fatalf("failed to update golden file: %v", err)
		}
	}
	expected, err := ioutil.ReadFile(golden)
	if err != nil {
		t.Fatalf("failed to read golden file: %v", err)
	}
	if !bytes.Equal([]byte(actual), expected) {
		t.Fatalf("Output did not match, expected %q, received %q", expected, actual)
	}
}
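You would run the tests once with the flag enabled, e.g. go test ./... -update, to write (or refresh) the golden file, and then run them normally afterwards to compare against it.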
A nice example of this pattern can be found in the gofmt source code: https://golang.org/src/cmd/gofmt/gofmt_test.go
This is probably too late as an answer, but I have been working on a snapshot testing library for Golang that is "like" Jest's toMatchSnapshot.
I believe it matches your use case exactly: testing long strings that you don't want to keep inline in your unit tests.
It would be as simple as the following, and it also provides a nice diff view:
func TestExample(t *testing.T) {
	snaps.MatchSnapshot(t, processString(query, vars))
}
You can have a look at go-snaps.
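If I remember correctly, go-snaps writes the snapshots to a __snapshots__ directory next to the test file and lets you refresh them by running the tests with the UPDATE_SNAPS=true environment variable set; check the go-snaps README for the current details.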
Is there a way to check if a public function/struct is used outside of the package in which it's declared? I'm not writing a public Go module that's consumed anywhere else; I simply want to scan whether func Foo() is used anywhere in my codebase outside of the package in which it's declared.
I'm using GoLand, but any programmatic solution would do.
Simplest solution: manually rename Foo() to Foo2(), then build/compile your project: if there are no compilation errors, it's not referenced in your code. The same check works with any identifier and in any IDE (it doesn't use any of the IDE's features).
Obviously if you already have a Foo2 identifier, this will fail. But the idea is to rename it to a non-existing identifier...
You can scan a particular package to see all the available functions in it.
In this main.go, app is the root package name and there is another package in the database directory, under the package name database.
By running the code you will find all the function names available inside the database package.
package main
import (
"fmt"
"app/database"
"go/ast"
"go/parser"
"go/token"
"os"
)
// Name of the package you want to scan
const subPackage = "database"
func main() {
set := token.NewFileSet()
packs, err := parser.ParseDir(set, subPackage, nil, 0)
if err != nil {
fmt.Println("Failed to parse package:", err)
os.Exit(1)
}
funcs := []*ast.FuncDecl{}
for _, pack := range packs {
for _, f := range pack.Files {
for _, d := range f.Decls {
if fn, isFn := d.(*ast.FuncDecl); isFn {
funcs = append(funcs, fn)
}
}
}
}
fmt.Println("All the functions in the package:",subPackage)
for _, fn := range funcs {
fmt.Println(fn.Name.Name)
}
// database Package is called/used
database.Connection()
}
This will get all the function declarations in the stated subpackage as *ast.FuncDecl values. These aren't invokable functions; they're just representations of the source code of each function.
If you wanted to do anything like call these functions, you'd have to do something more sophisticated. After gathering them, you could generate a separate file that calls each of them, then run the resulting file.
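If the goal is specifically the original question (is Foo referenced outside its own package?), the same go/ast machinery can be pointed at the consuming packages to look for package-qualified references. Here is a rough sketch, not a complete tool: the directory "." and the names database / Foo are assumptions for illustration, and matching on the qualifier name ignores renamed and dot imports.
package main

import (
	"fmt"
	"go/ast"
	"go/parser"
	"go/token"
)

func main() {
	fset := token.NewFileSet()
	// "." is assumed to be a directory containing code that consumes the database package.
	pkgs, err := parser.ParseDir(fset, ".", nil, 0)
	if err != nil {
		fmt.Println("Failed to parse package:", err)
		return
	}
	for _, pkg := range pkgs {
		for _, file := range pkg.Files {
			ast.Inspect(file, func(n ast.Node) bool {
				// A selector expression like database.Foo is a package-qualified reference.
				if sel, ok := n.(*ast.SelectorExpr); ok {
					if ident, ok := sel.X.(*ast.Ident); ok && ident.Name == "database" && sel.Sel.Name == "Foo" {
						fmt.Printf("database.Foo referenced at %s\n", fset.Position(sel.Pos()))
					}
				}
				return true
			})
		}
	}
}
For anything beyond a quick scan, golang.org/x/tools/go/packages with type information would be the more robust choice, since it resolves imports instead of relying on identifier names.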
Context: I'm trying to resolve this issue.
In other words, there's a NormalizeJsonString() for JSON strings (see this for more context):
// Takes a value containing JSON string and passes it through
// the JSON parser to normalize it, returns either a parsing
// error or normalized JSON string.
func NormalizeJsonString(jsonString interface{}) (string, error) {
which allows the following code:
return structure.NormalizeJsonString(old) == structure.NormalizeJsonString(new)
but it doesn't work for strings that are proto files (all proto files are guaranteed to have just one message definition). For example, I could see:
syntax = "proto3";
- package bar.proto;
+ package bar.proto;
option java_outer_classname = "FooProto";
message Foo {
...
- int64 xyz = 3;
+ int64 xyz = 3;
Is there a NormalizeProtoString available in some Go SDK? I found MessageDifferencer, but it's C++ only. Another option I considered was to replace all newlines / groups of whitespace with a single space, but that's a little hacky.
To do this in a semantic fashion, the proto definitions should really be parsed. Naively stripping and/or replacing whitespace may get you somewhere, but likely will have gotchas.
As far as I'm aware, the latest official Go protobuf package doesn't have anything to handle parsing protobuf definitions - the protoc compiler handles that side of affairs, and it is written in C++.
There would be options to execute the protoc compiler to get hold of the descriptor set output (e.g. protoc --descriptor_set_out=...); however, this would also be slightly haphazard considering it requires protoc to be available - and version differences could potentially cause problems too.
Assuming that is no go, one further option is to use a 3rd party parser written in Go - github.com/yoheimuta/go-protoparser seems to handle things quite well. One slight issue when making comparisons is that the parser records meta information about source line + column positions for each type; however it is relatively easy to make a comparison and ignore these, by using github.com/google/go-cmp
For example:
package main
import (
"fmt"
"log"
"os"
"github.com/google/go-cmp/cmp"
"github.com/google/go-cmp/cmp/cmpopts"
"github.com/yoheimuta/go-protoparser/v4"
"github.com/yoheimuta/go-protoparser/v4/parser"
"github.com/yoheimuta/go-protoparser/v4/parser/meta"
)
func main() {
if err := run(); err != nil {
log.Fatal(err)
}
}
func run() error {
proto1, err := parseFile("example1.proto")
if err != nil {
return err
}
proto2, err := parseFile("example2.proto")
if err != nil {
return err
}
equal := cmp.Equal(proto1, proto2, cmpopts.IgnoreTypes(meta.Meta{}))
fmt.Printf("equal: %t", equal)
return nil
}
func parseFile(path string) (*parser.Proto, error) {
f, err := os.Open(path)
if err != nil {
return nil, err
}
defer f.Close()
return protoparser.Parse(f)
}
outputs:
equal: true
for the example you provided.
I have recently looked into Go plugins instead of manually loading .so files myself.
Basically, I have a game server application, and I want to be able to load plugins (using the plugin package) when the server starts. Then, in the plugin itself, I want to be able to call exported functions that are part of the server.
Say I have this plugin, which is compiled to example_plugin.so using go build -buildmode=plugin:
package main
import "fmt"
func init() {
fmt.Println("Hello from plugin!")
}
Then say I have this server application, which loads the plugin (and ultimately calls the "init" function under the hood):
package main
import (
"fmt"
"plugin"
)
func main() {
fmt.Println("Server started")
if _, err := plugin.Open("example_plugin.so"); err != nil {
panic(err)
}
}
// some API function that loaded plugins can call
func GetPlayers() {}
The output is:
Server started
Hello from plugin!
This works as expected; however, I want to be able to call that GetPlayers function (and ideally any other exported function in the server application) from the plugin (and any other plugins). I was thinking about making some sort of library consisting of interfaces with the API functions that the server implements, but I have no idea where to start. I know I will probably need to use a .a file or something similar.
For clarification, I am developing this application for use on Linux, so I am fine with a solution that only works on Linux.
Apologies if this is poorly worded, first time posting on SO.
As mentioned in the comments, there is a Lookup function. The documentation for the plugin package has the following example:
// A Symbol is a pointer to a variable or function.
// For example, a plugin defined as
//
// var V int
//
// func F() { fmt.Printf("Hello, number %d\n", V) }
//
// may be loaded with the Open function and then the exported package
// symbols V and F can be accessed
package main
import (
"fmt"
"plugin"
)
func main() {
p, err := plugin.Open("plugin_name.so")
if err != nil {
panic(err)
}
v, err := p.Lookup("V")
if err != nil {
panic(err)
}
f, err := p.Lookup("F")
if err != nil {
panic(err)
}
*v.(*int) = 7
f.(func())() // prints "Hello, number 7"
}
I think the most confusing lines here are
*v.(*int) = 7
f.(func())() // prints "Hello, number 7"
The first of them performs a type assertion to *int to assert that v is indeed a pointer to an int. That is needed since Lookup returns a Symbol (which is just an interface{}), and in order to do anything useful with the value you have to assert its concrete type.
The second line performs another type assertion, this time making sure that f is a function with no arguments and no return values, and then immediately calls it. Since function F from the original module references V (which we've just set to 7), this call prints Hello, number 7.
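To get to the GetPlayers part of the original question: one way (a sketch, not the only design) is to define the server API as an interface in a small package that both the server and every plugin import, have each plugin export a setup function, and have the server look that function up and pass itself in. All of the names below (shared.ServerAPI, Init, the example.com/game/shared import path) are assumptions for illustration.
// shared/api.go - a package imported by both the server and the plugins
package shared

// ServerAPI is the surface the server exposes to plugins.
type ServerAPI interface {
	GetPlayers() []string
}

// example_plugin.go - built with: go build -buildmode=plugin
package main

import (
	"fmt"

	"example.com/game/shared" // assumed module path
)

// Init is looked up by the server after plugin.Open and receives the server API.
func Init(api shared.ServerAPI) {
	fmt.Println("Hello from plugin! players:", api.GetPlayers())
}

// server main.go
package main

import (
	"plugin"

	"example.com/game/shared"
)

type server struct{}

func (server) GetPlayers() []string { return []string{"alice", "bob"} }

func main() {
	p, err := plugin.Open("example_plugin.so")
	if err != nil {
		panic(err)
	}
	sym, err := p.Lookup("Init")
	if err != nil {
		panic(err)
	}
	// Use the two-value type assertion so an unexpected signature is reported cleanly.
	initFn, ok := sym.(func(shared.ServerAPI))
	if !ok {
		panic("plugin Init has an unexpected signature")
	}
	initFn(server{}) // the plugin can now call back into the server through the interface
}
The usual plugin caveat applies: the shared package (and every other common dependency) must be built from exactly the same version on both sides, otherwise plugin.Open or the type assertion will fail at runtime.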
I've been working on a problem and I figured I would demonstrate it using a pokemon setup. I am reading from files, parsing them and creating structs from them. This normally isn't a problem, except now I need to implement interface-like inheriting of traits. I don't want there to be duplicate skills, so I figured I could use a map to replicate a set data structure. However, it seems that in the transitive phase of my recursive parsePokemonFile function (see the implementsComponent case), I am losing values from my map.
I am using inputs like these (4 files):
Ratatta:
name=Ratatta
skills=Tackle:normal,Scratch:normal
Bulbosaur:
name=Bulbosaur
implements=Ratatta
skills=VineWhip:leaf
Oddish:
name=Oddish
implements=Ratatatt
skills=Acid:poison
Venosaur:
name=Venosaur
implements=bulbosaur,oddish
I'm expecting the output for the following code to be something like
Begin!
{Venosaur [{VineWhip leaf} {Acid poison} {Tackle normal} {Scratch normal}]}
but instead I get
Begin!
{Venosaur [{VineWhip leaf} {Acid poison}]}
What am I doing wrong? Could it be a logic error? Or am I making an assumption about the map holding values that I shouldn't?
package main
import (
"bufio"
"fmt"
"os"
"strings"
)
// In order to create a set of pokemon abilities and for ease of creation and lack of space being taken up
// We create an interfacer capability that imports the skills and attacks from pokemon of their previous evolution
// This reduces the amount of typing of skills we have to do.
// Algorithm is simple. Look for the name "implements=x" and then add x into set.
// Unfortunately it appears that the set is dropping values on transitive implements interfaces
func main() {
fmt.Println("Begin!")
dex, err := parsePokemonFile("Venosaur")
if err != nil {
fmt.Printf("Got error: %v\n", err)
}
fmt.Printf("%v\n", dex)
}
type pokemon struct {
Name string
Skills []skill
}
type skill struct {
SkillName string
Type string
}
func parsePokemonFile(filename string) (pokemon, error) {
file, err := os.Open(filename)
if err != nil {
return pokemon{}, err
}
defer file.Close()
scanner := bufio.NewScanner(file)
var builtPokemon pokemon
for scanner.Scan() {
component, returned := parseLine(scanner.Text())
switch component {
case nameComponent:
builtPokemon.Name = returned
case skillsComponent:
skillsStrings := strings.Split(returned, ",")
var skillsArr []skill
// split skills and add them into pokemon skillset
for _, skillStr := range skillsStrings {
skillPair := strings.Split(skillStr, ":")
skillsArr = append(skillsArr, skill{SkillName: skillPair[0], Type: skillPair[1]})
}
builtPokemon.Skills = append(builtPokemon.Skills, skillsArr...)
case implementsComponent:
implementsArr := strings.Split(returned, ",")
// create set to remove duplicates
skillsSet := make(map[*skill]bool)
for _, val := range implementsArr {
// recursively call the pokemon files and get full pokemon
implementedPokemon, err := parsePokemonFile(val)
if err != nil {
return pokemon{}, err
}
// sieve out the skills into a set
for _, skill := range implementedPokemon.Skills {
skillsSet[&skill] = true
}
}
// append final set into the currently being built pokemon
for x := range skillsSet {
builtPokemon.Skills = append(builtPokemon.Skills, *x)
}
}
}
return builtPokemon, nil
}
type component int
// components to denote where to put our strings when it comes time to assemble what we've parsed
const (
nameComponent component = iota
implementsComponent
skillsComponent
)
func parseLine(line string) (component, string) {
arr := strings.Split(line, "=")
switch arr[0] {
case "name":
return nameComponent, arr[1]
case "implements":
return implementsComponent, arr[1]
case "skills":
return skillsComponent, arr[1]
default:
panic("Invalid field found")
}
}
This has nothing to do with Go maps dropping any values.
The problem is that you are using a map keyed by skill pointers and not by skills. Two pointers to equal skill contents are still different keys, so the set cannot deduplicate anything; worse, &skill takes the address of the loop variable, which (before Go 1.22) is reused across iterations, so all skills from one implemented pokemon end up under the same key and only the last value survives.
skillsSet := make(map[*skill]bool)
If you change this to map[skill]bool, this should work. You may try it out!
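For completeness, a minimal sketch of the implementsComponent case with a value-keyed set (same logic as in the question, only the key type changes):
// create a set keyed by the skill value itself, so duplicates collapse by content
skillsSet := make(map[skill]bool)
for _, val := range implementsArr {
	// recursively parse the implemented pokemon files
	implementedPokemon, err := parsePokemonFile(val)
	if err != nil {
		return pokemon{}, err
	}
	for _, s := range implementedPokemon.Skills {
		skillsSet[s] = true // skill only contains string fields, so it is a comparable map key
	}
}
// append the final set into the pokemon currently being built
for s := range skillsSet {
	builtPokemon.Skills = append(builtPokemon.Skills, s)
}
One thing to keep in mind: ranging over a map yields keys in no particular order, so the skills may come out in a different order than in your expected output.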
I am a beginner in Golang.
I have a problem with deriving the value I need from user input.
When the user enters data like "2012BV352" I need to be able to ignore the BV and pass 2012352 to my next function.
There is a package named gopkg.in/validator.v2 (see its docs).
But what it returns is only whether or not the variable is valid.
I need to cut off the unwanted characters.
Any idea on how to achieve this?
You could write your own sanitizing methods and if it becomes something you'll be using more often, I'd package it out and add other methods to cover more use cases.
I provide two different ways to achieve the same result. One is commented out.
I haven't run any benchmarks, so I couldn't tell you for certain which is more performant, but you could write your own tests if you wanted to figure it out. That would also expose another important aspect of Go and, in my opinion, one of its more powerful tools... testing.
package main
import (
"fmt"
"log"
"regexp"
"strconv"
"strings"
)
// using a regex here which simply targets all digits and ignores everything else. I make it a global var and use MustCompile because the
// regex doesn't need to be created every time.
var extractInts = regexp.MustCompile(`\d+`)
func SanitizeStringToInt(input string) (int, error) {
m := extractInts.FindAllString(input, -1)
s := strings.Join(m, "")
return strconv.Atoi(s)
}
/*
// if you didn't want to use regex you could use a for loop (add "unicode" to the imports for this version)
func SanitizeStringToInt(input string) (int, error) {
var s string
for _, r := range input {
if !unicode.IsLetter(r) {
s += string(r)
}
}
return strconv.Atoi(s)
}
*/
func main() {
a := "2012BV352"
n, err := SanitizeStringToInt(a)
if err != nil {
log.Fatal(err)
}
fmt.Println(n)
}
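Picking up the testing remark above, a small table-driven test for SanitizeStringToInt could look like the sketch below (the cases are just illustrative):
package main

import "testing"

func TestSanitizeStringToInt(t *testing.T) {
	cases := []struct {
		in   string
		want int
	}{
		{"2012BV352", 2012352},
		{"abc123", 123},
	}
	for _, c := range cases {
		got, err := SanitizeStringToInt(c.in)
		if err != nil {
			t.Fatalf("SanitizeStringToInt(%q) returned error: %v", c.in, err)
		}
		if got != c.want {
			t.Errorf("SanitizeStringToInt(%q) = %d, want %d", c.in, got, c.want)
		}
	}
}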