Is this the correct behavior of ast parsing - go

I am learning how Go's ast library works and how to use it. I am parsing https://github.com/modern-go/concurrent, skipping the test files and go_below_19.go since it causes errors.
My problem is with the parsing of these lines in the file unbounded_executor.go:
var HandlePanic = func(recovered interface{}, funcName string) {
    ErrorLogger.Println(fmt.Sprintf("%s panic: %v", funcName, recovered))
    ErrorLogger.Println(string(debug.Stack()))
}
The ast.Ident for ErrorLogger in both instances has a nil Obj.
But I believe it should not be nil and should reference these lines from log.go:
// ErrorLogger is used to print out error, can be set to writer other than stderr
var ErrorLogger = log.New(os.Stderr, "", 0)
Am I wrong, or is there a problem with the parser? I've followed several references on parsing files; I reuse a single *token.FileSet across all of the files and use ParseComments as the mode.
edit:
There is a large code base surrounding this, so I can only show snippets of the code demonstrating it.
Parsing is performed with the same fset across all non-test Go files that have no build constraints excluding them under Go 1.16:
parsedFile, parseErr := parser.ParseFile(fset, filePath, nil, parser.ParseComments)

Call ast.NewPackage to resolve identifiers in the AST:
fset := token.NewFileSet()
files := make(map[string]*ast.File)
for _, name := range []string{"unbounded_executor.go", "log.go"} {
    f, err := parser.ParseFile(fset, name, nil, parser.ParseComments)
    if err != nil {
        log.Fatal(err)
    }
    files[name] = f
}
ast.NewPackage(fset, files, nil, nil)
ast.Inspect(files["unbounded_executor.go"], func(n ast.Node) bool {
    if n, ok := n.(*ast.Ident); ok && n.Name == "ErrorLogger" {
        fmt.Println(n.Obj)
    }
    return true
})
Because a proper importer is not provided and the list of files does not include all files in the package, NewPackage reports errors for the identifiers it cannot resolve (the imported packages, for example); those errors can be ignored here, since ErrorLogger is still resolved across the two files.
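Once ast.NewPackage has run, the Obj of a resolved identifier points at its declaration via Obj.Decl. A minimal follow-up sketch, reusing fset and files from the snippet above, that prints where ErrorLogger is declared:
ast.Inspect(files["unbounded_executor.go"], func(n ast.Node) bool {
    id, ok := n.(*ast.Ident)
    if !ok || id.Name != "ErrorLogger" || id.Obj == nil {
        return true
    }
    // The declaration node for a package-level var is an *ast.ValueSpec.
    if spec, ok := id.Obj.Decl.(*ast.ValueSpec); ok {
        fmt.Println("ErrorLogger declared at", fset.Position(spec.Pos()))
    }
    return true
})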

Related

Why does GO AST Parser regenerate code with extra spaces or indents?

I am trying to regenerate source code from the AST of a Go program. After regenerating it, I compare it with the original source code, but the regenerated code has extra spaces in some places. The code is given below.
fs := token.NewFileSet()
f, err := parser.ParseFile(fs, filename, nil, parser.ParseComments)
if err != nil {
    log.Println(err)
}
var buffer bytes.Buffer // holds the regenerated source
cfg := printer.Config{Mode: printer.RawFormat}
err = cfg.Fprint(&buffer, fs, f)
if err != nil {
    log.Println(err)
}
source_code, err := os.ReadFile(filename)
if err != nil {
    log.Println(err)
}
buffer_source_code := buffer.String()
Can someone tell me what I should do to regenerate the exact original source code from an AST?
Have a look at Decorated Syntax Tree (dst, github.com/dave/dst).
The dst package enables manipulation of a Go syntax tree with high fidelity. Decorations (e.g. comments and line spacing) remain attached to the correct nodes as the tree is modified.
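A minimal round-trip sketch using github.com/dave/dst; decorator.Parse and decorator.Print are the names used in the package's README, so treat the exact API as an assumption and check the current docs:
package main

import (
    "log"
    "os"

    "github.com/dave/dst/decorator"
)

func main() {
    src, err := os.ReadFile("main.go") // placeholder input file
    if err != nil {
        log.Fatal(err)
    }
    f, err := decorator.Parse(string(src))
    if err != nil {
        log.Fatal(err)
    }
    // ... manipulate the tree here; comments and blank lines stay attached
    // to their nodes as decorations ...
    if err := decorator.Print(f); err != nil { // prints the regenerated source
        log.Fatal(err)
    }
}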

Golang: statically finding all strings in code

I would like to parse a package and output all of the strings in the code. The specific use case is to collect SQL strings and run them through a SQL parser, but that's a separate issue.
Is the best way to do this just to parse the source line by line? Or is it possible to do it with a regex or something? I imagine that some cases might be nontrivial, such as multiline strings:
str := "This is
the full
string"
// want > This is the full string
Use the go/scanner package to scan for strings in Go source code:
src, err := os.ReadFile(fname)
if err != nil {
    // handle error
}

// Create *token.File to scan.
fset := token.NewFileSet()
file := fset.AddFile(fname, fset.Base(), len(src))

var s scanner.Scanner
s.Init(file, src, nil, 0)
for {
    pos, tok, lit := s.Scan()
    if tok == token.EOF {
        break
    }
    if tok == token.STRING {
        unquoted, _ := strconv.Unquote(lit)
        fmt.Printf("%s: %s\n", fset.Position(pos), unquoted)
    }
}
https://go.dev/play/p/849QsbqVhho
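The snippet above scans a single file; to cover a whole package, run the same scan over every .go file in the package directory. A sketch, where pkgDir is a placeholder and scanStrings is a hypothetical helper wrapping the scanner loop above:
matches, err := filepath.Glob(filepath.Join(pkgDir, "*.go"))
if err != nil {
    log.Fatal(err)
}
for _, fname := range matches {
    if strings.HasSuffix(fname, "_test.go") {
        continue // skip test files if you only care about non-test code
    }
    scanStrings(fname) // hypothetical helper containing the scanner loop above
}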

How to extract .7z files in Go

I have a 7z archive of a number of .txt files. I am trying to list all the files in the archive and upload them to an S3 bucket, but I'm having trouble extracting .7z archives in Go. To do this, I found the package github.com/gen2brain/go-unarr (imported as extractor), and this is what I have so far:
content, err := ioutil.ReadFile("sample_archive.7z")
if err != nil {
    fmt.Printf("err: %+v", err)
}
a, err := extractor.NewArchiveFromMemory(content)
if err != nil {
    fmt.Printf("err: %+v", err)
}
lst, _ := a.List()
fmt.Printf("lst: %+v", lst)
This prints a list of all the files in the archive, but it has two issues.
It reads the file from local disk using ioutil, and the input of NewArchiveFromMemory must be of type []byte. I can't read from local disk and will have to use a file already in memory as an os.File, so I either need a different method or have to convert the os.File to []byte. There is another method, NewArchiveFromReader(r io.Reader), but it is returning an error saying Bad File Descriptor.
file, err := os.OpenFile(
    path,
    os.O_WRONLY|os.O_TRUNC|os.O_CREATE,
    0666,
)
a, err := extractor.NewArchiveFromReader(file)
if err != nil {
    fmt.Printf("ERROR: %+v", err)
}
lst, _ := a.List()
fmt.Printf("files: %+v\n", lst)
I am able to get the list of the files in the archive, and using Extract(destination_path string) I can also extract them to a local directory. But I also want the extracted files as os.File values (i.e. a list of os.File, since there will be multiple files).
How can I change my current code to achieve both of the above targets? Is there another library that does this?
*os.File implements the io.Reader interface (it has a Read([]byte) (int, error) method), so you can pass the file to NewArchiveFromReader(file) without any conversion. The Bad File Descriptor error most likely comes from opening the file with os.O_WRONLY: a write-only descriptor cannot be read from, so open the file read-only instead. You can read up on Go interfaces for more background on why the reader part works.
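For example, a sketch that reuses your identifiers but opens the file read-only:
file, err := os.Open(path) // read-only; an O_WRONLY descriptor cannot be read
if err != nil {
    fmt.Printf("err: %+v", err)
}
defer file.Close()

a, err := extractor.NewArchiveFromReader(file)
if err != nil {
    fmt.Printf("err: %+v", err)
}
lst, _ := a.List()
fmt.Printf("files: %+v\n", lst)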
If you're okay with extracting to a local directory, you can do that and then read the files back in (warning, may contain typos):
func extractAndOpenAll(a *extractor.Archive) ([]*os.File, error) {
    err := a.Extract("/tmp/path") // consider using ioutil.TempDir()
    if err != nil {
        return nil, err
    }
    filestats, err := ioutil.ReadDir("/tmp/path")
    if err != nil {
        return nil, err
    }
    // warning: all these file handles must be closed by the caller,
    // which is why even the error case here returns the list of files.
    // if you forget, your process might leak file handles.
    files := make([]*os.File, 0)
    for _, fs := range filestats {
        file, err := os.Open(filepath.Join("/tmp/path", fs.Name()))
        if err != nil {
            return files, err
        }
        files = append(files, file)
    }
    return files, nil
}
It is possible to use the archived files without writing back to disk (https://github.com/gen2brain/go-unarr#read-all-entries-from-archive), but whether or not you should do that instead depends on what your next step is.

error: template: "..." is an incomplete or empty template

I'm trying to add a FuncMap to my templates, but I'm receiving the following error:
template: "foo" is an incomplete or empty
template
The parsing of templates worked just fine before I used the FuncMap, so I'm not sure why it's throwing an error now.
Here is my code:
funcMap := template.FuncMap{
    "IntToUSD": func(num int) string {
        return decimal.New(int64(num), 2).String()
    },
}
// ...
tmpl, err := template.New(t.file).Funcs(funcMap).ParseFiles(t.files()...)
if err != nil {
    // ...
}
t.files() just returns a slice of strings that are file paths.
Anyone know what's up?
Make sure the argument you pass to template.New is the base name of one of the files in the list you pass to ParseFiles.
One option is:
files := t.files()
if len(files) > 0 {
    name := path.Base(files[0])
    tmpl, err := template.New(name).Funcs(funcMap).ParseFiles(files...)
    // ...
}
ParseFiles documentation:
Since the templates created by ParseFiles are named by the base names of the argument files, t should usually have the name of one of the (base) names of the files.
I was having the same problem. I realized that
tmpl, err := template.New("").Funcs(funcMap).ParseFiles("fileName")
also works if you use it with
err := tpl.ExecuteTemplate(wr, "fileName", data)
If I use
err := tpl.Execute(wr, data)
then I should specify the template name in New():
tmpl, err := template.New("fileName").Funcs(funcMap).ParseFiles("fileName")
You can also use the template.Must helper:
templ = template.Must(template.New("fileName").Funcs(fm).ParseFiles("fileName"))
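Putting the pieces together, a minimal sketch; the templates/index.html and templates/footer.html paths and the data value are placeholders:
files := []string{"templates/index.html", "templates/footer.html"}
name := path.Base(files[0]) // "index.html", so it matches one of the parsed templates
tmpl, err := template.New(name).Funcs(funcMap).ParseFiles(files...)
if err != nil {
    log.Fatal(err)
}
// Execute runs the "index.html" template; ExecuteTemplate(w, "footer.html", data)
// would address the other one by name. data is whatever you render (placeholder).
if err := tmpl.Execute(os.Stdout, data); err != nil {
    log.Fatal(err)
}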

Golang - why is string slice element not included in exec cat unless I sort it

I have a slightly funky issue in Go. Essentially I have a slice of strings which represent file paths. I then run cat against those file paths to combine the files before sorting, deduping, etc.
Here is the section of code (where applicableReductions is the string slice):
applicableReductions := []string{}
for _, fqFromListName := range fqFromListNames {
    filePath := GetFilePath()
    //BROKE CODE GOES HERE
    applicableReductions = append(applicableReductions, filePath)
}
fileOut, err := os.Create(toListWriteTmpFilePath)
if err != nil {
    return err
}
cat := exec.Command("cat", applicableReductions...)
catStdOut, err := cat.StdoutPipe()
if err != nil {
    return err
}
go func(cat *exec.Cmd) error {
    if err := cat.Start(); err != nil {
        return fmt.Errorf("File reduction error (cat) : %s", err)
    }
    return nil
}(cat)
// Init Writer & write file
writer := bufio.NewWriter(fileOut)
defer writer.Flush()
_, err = io.Copy(writer, catStdOut)
if err != nil {
    return err
}
if err = cat.Wait(); err != nil {
    return err
}
fDiff.StandardiseData(fileOut, toListUpdateFolderPath, list.Name)
The above works fine. The problem comes when I try to append a new element to the slice. I have a separate function which creates a new file from DB content; that file's path is then added to the applicableReductions slice.
func RetrieveDomainsFromDB(collection *Collection, listName, outputPath string) error {
    domains, err := domainReviews.GetDomainsForList(listName)
    if err != nil {
        return err
    }
    if len(domains) < 1 {
        return ErrNoDomainReviewsForList
    }
    fh, err := os.OpenFile(outputPath, os.O_RDWR, 0774)
    if err != nil {
        fh, err = os.Create(outputPath)
        if err != nil {
            return err
        }
    }
    defer fh.Close()
    _, err = fh.WriteString(strings.Join(domains, "\n"))
    if err != nil {
        return err
    }
    return nil
}
If I call the above function and append the filePath to the applicableReductions slice, it is in there, but its file does not get included by cat.
To clarify, when I put the following where it says BROKE CODE GOES HERE:
if dbSource {
    err = r.RetrieveDomainsFromDB(collection, ToListName, filePath)
    if err != nil {
        return err
        continue
    }
}
The file path can be seen when doing fmt.Println(applicableReductions), but that file's contents are not seen in the cat output file.
I thought there was perhaps a delay in the file being written, so I tried adding a wait; this didn't help. However, the solution I found was to sort the slice: adding this line above the call to exec cat solves the problem, but I don't know why:
sort.Strings(applicableReductions)
I have confirmed all files are present on both successful and unsuccessful runs; the only difference is that without the sort, the content of the final appended file is missing.
An explanation from a Go pro out there would be very much appreciated. Let me know if you need more info or debug output; happy to oblige.
UPDATE
It has been suggested that this is the same issue as here: Golang append an item to a slice. I think I understand the issue there, and I'm not saying this isn't the same, but I cannot see the same thing happening: the slice in question is not touched from outside the main function (e.g. no editing of the slice in the RetrieveDomainsFromDB function). I create the slice before a loop, append to it within the loop, and then use it after the loop. I've added an example at the top to show how the slice is built. Could someone please clarify where this slice is being copied, if that is the case?
UPDATE AND CLOSE
Please close this question; the issue was unrelated to the use of a string slice. It turns out I was reading from the final output file before the bufio.Writer had been flushed (at the end of the function, before the deferred Flush kicked in on function return).
I think the sorting was just rearranging the problem so I didn't notice it persisted, or possibly giving the buffer some time to flush. Either way, it's sorted now with a manual call to Flush.
Thanks for all the help provided.
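For reference, the fix described above amounts to flushing the writer explicitly before anything reads the output file, instead of relying on the deferred Flush that only runs when the function returns. A sketch using the names from the first snippet:
writer := bufio.NewWriter(fileOut)
if _, err := io.Copy(writer, catStdOut); err != nil {
    return err
}
if err := cat.Wait(); err != nil {
    return err
}
// Flush before fileOut is read; a deferred Flush would run too late here.
if err := writer.Flush(); err != nil {
    return err
}
fDiff.StandardiseData(fileOut, toListUpdateFolderPath, list.Name)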

Resources