One file, two different outputs - Windows Server 2012

My program reads in an SQL file and performs operations on a database.
I edited one of the SQL files on the server via Notepad yesterday.
I made one more change to the same file today, again via Notepad.
When the program reads in the file, the changes I made to the SQL are not there.
Printing the SQL contents to the console reveals that the binary is reading in the version from yesterday.
What black magic is at play here?
Deleting the file does not help.
If I create it again, the "Date created" timestamp is from a month ago and "Date modified" is from yesterday.
Opening the file in Notepad, WordPad, or any text reader you can think of shows the correct contents.
The binary still reads the version from yesterday.
This is how the binary reads the file:
file, err := ioutil.ReadFile("appointment.sql")
if err != nil {
    log.Fatal(err)
}
The program was cross-compiled for Windows on a Mac.
The SQL files were originally written on a Mac via Vim and then uploaded to the server.
EDIT: Here is the code from the method after the suggested debugging:
func (r *Icar) ReadAppointments(qCfg dms.QueryConfig) []dms.Appointment {
    // r.conn contains the db connection
    /*DEBUGGING*/
    name := "appointment.sql"
    fmt.Printf("%q\n", name)
    path, err := filepath.Abs(name)
    if err != nil {
        log.Fatal(err)
    }
    fmt.Printf("%q\n", path) // correct path
    file, err := ioutil.ReadFile("appointment.sql")
    if err != nil {
        log.Fatal(err)
    }
    fmt.Printf("%q\n", file) // correct output
    /*END*/
    appointmentQuery := string(file)
    fmt.Println(appointmentQuery) // correct output
    appointmentQuery = strings.Replace(appointmentQuery, "#", qCfg.QueryLocationID, -1)
    fmt.Println(appointmentQuery) // correct output
    rows, err := r.conn.Query(appointmentQuery)
    if err != nil {
        fmt.Println(appointmentQuery) // wrong output: contains edits from a previous version
        log.Fatalf("Error reading from the database: %s", err.Error())
    }
    appointments := []dms.Appointment{}
    var (
        ExternalID,
        WONumber,
        CustomerWaiting interface{}
    )
    for rows.Next() {
        appointment := dms.Appointment{}
        err = rows.Scan(&ExternalID, &WONumber, &appointment.AppointmentDate, &CustomerWaiting)
        if err != nil {
            fmt.Println(appointmentQuery)
            log.Fatal(err)
        }
        toStr := []interface{}{ExternalID, WONumber}
        toInt := []interface{}{CustomerWaiting}
        convertedString := d.ConvertToStr(toStr)
        convertedInt := d.ConvertToInt(toInt)
        appointment.ExternalID = convertedString[0]
        appointment.WONumber = convertedString[1]
        appointment.CustomerWaiting = convertedInt[0]
        appointments = append(appointments, appointment)
    }
    err = rows.Close()
    return appointments
}
I close the db connection in a deferred statement in my main func.
Here is the constructor for reference
func New(config QueryConfig) (*Icar, func()) {
    db, err := sql.Open("odbc", config.Connection)
    if err != nil {
        log.Fatal("The database doesn't open correctly:\n", err.Error())
    }
    icar := &Icar{
        conn: db,
    }
    return icar, func() {
        icar.conn.Close()
    }
}

Basic debugging says check your inputs and outputs. You may be looking at different files. Clearly, "appointment.sql" is not necessarily unique in the file system. For example, does this give you expected results?
package main

import (
    "fmt"
    "io/ioutil"
    "log"
    "path/filepath"
)

func main() {
    name := "appointment.sql"
    fmt.Printf("%q\n", name)
    path, err := filepath.Abs(name)
    if err != nil {
        log.Fatal(err)
    }
    fmt.Printf("%q\n", path)
    file, err := ioutil.ReadFile(name)
    if err != nil {
        log.Fatal(err)
    }
    fmt.Printf("%q\n", file)
}
Output:
"appointment.sql"
"C:\\Users\\peter\\gopath\\src\\so\\appointment.sql"
"SELECT * FROM appointments;\n"

Related

Golang: Facing error while creating .tar.gz file having large name

I am trying to create a .tar.gz file from a folder that contains multiple files/folders. Once the .tar.gz file gets created, the files are not properly extracted. I think it's mostly because of long names or paths exceeding some number of characters, because the same thing works when the filename/path is short. I referred to https://github.com/golang/go/issues/17630 and tried to add the code below, but it did not help.
header.Uid = 0
header.Gid = 0
I am using the simple code seen below to create the .tar.gz. The approach is: I create a temp folder, do some processing on the files, and from that temp path I create the .tar.gz file; hence, in the path below I am using a pre-defined temp folder path.
package main

import (
    "archive/tar"
    "compress/gzip"
    "fmt"
    "io"
    "log"
    "os"
    fp "path/filepath"
)

func main() {
    // Create output file
    out, err := os.Create("output.tar.gz")
    if err != nil {
        log.Fatalln("Error writing archive:", err)
    }
    defer out.Close()
    // Create the archive and write the output to the "out" Writer
    tmpDir := "C:/Users/USERNAME~1/AppData/Local/Temp/temp-241232063"
    err = createArchive1(tmpDir, out)
    if err != nil {
        log.Fatalln("Error creating archive:", err)
    }
    fmt.Println("Archive created successfully")
}

func createArchive1(path string, targetFile *os.File) error {
    gw := gzip.NewWriter(targetFile)
    defer gw.Close()
    tw := tar.NewWriter(gw)
    defer tw.Close()
    // walk through every file in the folder
    err := fp.Walk(path, func(filePath string, info os.FileInfo, err error) error {
        // ensure the src actually exists before trying to tar it
        if _, err := os.Stat(filePath); err != nil {
            return err
        }
        if err != nil {
            return err
        }
        if info.IsDir() {
            return nil
        }
        file, err := os.Open(filePath)
        if err != nil {
            return err
        }
        defer file.Close()
        // generate tar header
        header, err := tar.FileInfoHeader(info, info.Name())
        header.Uid = 0
        header.Gid = 0
        if err != nil {
            return err
        }
        header.Name = filePath // strings.TrimPrefix(filePath, fmt.Sprintf("%s/", fp.Dir(path))) // info.Name()
        // write header
        if err := tw.WriteHeader(header); err != nil {
            return err
        }
        if _, err := io.Copy(tw, file); err != nil {
            return err
        }
        return nil
    })
    return err
}
Please let me know what I am doing wrong.
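One adjustment sometimes suggested for this symptom (not part of the original question) is to store entry names relative to the walked root and force the PAX tar format, which does not share the older USTAR format's roughly 100-character name limit (header.Format requires Go 1.10+). A sketch of what could replace the header.Name assignment inside the Walk callback:
// Store names relative to the archived folder instead of the absolute path,
// and use the PAX format so long names and paths survive extraction.
rel, err := fp.Rel(path, filePath)
if err != nil {
    return err
}
header.Name = fp.ToSlash(rel)
header.Format = tar.FormatPAX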

Can I read a Google Sheet as a CSV file?

Using the Go language, is there a way I can read the data saved in Google Sheets as a CSV file, without downloading an offline copy of the file?
Yes, this is possible, with the steps below:
In Google Sheets:
Publish the sheet under consideration as a CSV file, using File -> Publish to the web; make sure to select the option "Automatically republish when changes are made".
Copy the link provided by Google Sheets for the CSV connectivity.
In Go:
Use the code below:
// file main.go
package main

import (
    "encoding/csv"
    "fmt"
    "net/http"
)

func readCSVFromURL(url string) ([][]string, error) {
    resp, err := http.Get(url)
    if err != nil {
        return nil, err
    }
    defer resp.Body.Close()
    reader := csv.NewReader(resp.Body)
    reader.Comma = ','
    data, err := reader.ReadAll()
    if err != nil {
        return nil, err
    }
    return data, nil
}

func main() {
    url := "https://docs.google.com/spreadsheets/d/e/xxxxxsingle=true&output=csv"
    data, err := readCSVFromURL(url)
    if err != nil {
        panic(err)
    }
    for idx, row := range data {
        // skip header
        if idx == 0 {
            continue
        }
        if idx == 6 {
            break
        }
        fmt.Println(row[2])
    }
}

Uploading a file to an internet site

With the code below I can download a file from the internet while monitoring the download percentage.
How can I similarly upload a file to the internet while monitoring the upload progress? I want to upload an executable file to GitHub assets.
package main

import (
    "fmt"
    "io"
    "net/http"
    "os"
    "strings"

    "github.com/dustin/go-humanize"
)

// WriteCounter counts the number of bytes written to it. It implements the io.Writer interface
// and we can pass this into io.TeeReader() which will report progress on each write cycle.
type WriteCounter struct {
    Total uint64
}

func (wc *WriteCounter) Write(p []byte) (int, error) {
    n := len(p)
    wc.Total += uint64(n)
    wc.PrintProgress()
    return n, nil
}

func (wc WriteCounter) PrintProgress() {
    // Clear the line by using a carriage return to go back to the start and remove
    // the remaining characters by filling it with spaces
    fmt.Printf("\r%s", strings.Repeat(" ", 35))
    // Return again and print current status of download
    // We use the humanize package to print the bytes in a meaningful way (e.g. 10 MB)
    fmt.Printf("\rDownloading... %s complete", humanize.Bytes(wc.Total))
}

func main() {
    fmt.Println("Download Started")
    fileUrl := "https://upload.wikimedia.org/wikipedia/commons/d/d6/Wp-w4-big.jpg"
    err := DownloadFile("avatar.jpg", fileUrl)
    if err != nil {
        panic(err)
    }
    fmt.Println("Download Finished")
}

// DownloadFile will download a url to a local file. It's efficient because it will
// write as it downloads and not load the whole file into memory. We pass an io.TeeReader
// into Copy() to report progress on the download.
func DownloadFile(filepath string, url string) error {
    // Create the file, but give it a tmp file extension, this means we won't overwrite a
    // file until it's downloaded, but we'll remove the tmp extension once downloaded.
    out, err := os.Create(filepath + ".tmp")
    if err != nil {
        return err
    }
    // Get the data
    resp, err := http.Get(url)
    if err != nil {
        out.Close()
        return err
    }
    defer resp.Body.Close()
    // Create our progress reporter and pass it to be used alongside our writer
    counter := &WriteCounter{}
    if _, err = io.Copy(out, io.TeeReader(resp.Body, counter)); err != nil {
        out.Close()
        return err
    }
    // The progress output uses the same line, so print a new line once the download is finished
    fmt.Print("\n")
    // Close the file without defer so it can happen before Rename()
    out.Close()
    if err = os.Rename(filepath+".tmp", filepath); err != nil {
        return err
    }
    return nil
}
I just modified your code. It works with my file server.
func UploadFile(filepath string, url string) error {
    // Open the file to be uploaded.
    out, err := os.Open(filepath)
    if err != nil {
        return err
    }
    // Create our progress reporter and pass it to be used alongside our reader
    counter := &WriteCounter{}
    // Post the data
    resp, err := http.Post(url, "multipart/form-data", io.TeeReader(out, counter))
    if err != nil {
        out.Close()
        log.Println(err.Error())
        return err
    }
    defer resp.Body.Close()
    // The progress output uses the same line, so print a new line once the upload is finished
    fmt.Print("\n")
    // Close the file without defer
    out.Close()
    return nil
}
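For reference, posting the raw file bytes with a "multipart/form-data" content type but no boundary will not produce a well-formed multipart request. Below is a sketch (not from the original answer) of streaming a proper multipart upload while reusing the same WriteCounter; it additionally imports "mime/multipart" and "path/filepath", and the form field name "file" and the target URL are placeholders.
// UploadFileMultipart streams a file as a multipart form upload and reports
// progress through the existing WriteCounter. The form field name "file" is a
// placeholder; adjust it to whatever the server expects.
func UploadFileMultipart(path string, url string) error {
    f, err := os.Open(path)
    if err != nil {
        return err
    }
    defer f.Close()
    pr, pw := io.Pipe()
    mw := multipart.NewWriter(pw)
    counter := &WriteCounter{}
    // Write the multipart body in a goroutine so the request streams it
    // instead of buffering the whole file in memory.
    go func() {
        part, err := mw.CreateFormFile("file", filepath.Base(path))
        if err != nil {
            pw.CloseWithError(err)
            return
        }
        if _, err := io.Copy(part, io.TeeReader(f, counter)); err != nil {
            pw.CloseWithError(err)
            return
        }
        pw.CloseWithError(mw.Close())
    }()
    resp, err := http.Post(url, mw.FormDataContentType(), pr)
    if err != nil {
        return err
    }
    defer resp.Body.Close()
    fmt.Print("\n")
    return nil
}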

Editing a zip file in memory

I am trying to edit a zip file in memory in Go and return the zipped file through an HTTP response.
The goal is to add a few files to a path in the zip file; for example,
I add a log.txt file at my path/to/file route in the zipped folder.
All this should be done without saving the file or editing the original file.
I have implemented a simple version of real-time stream compression, which can correctly compress a single file. If you want it to run efficiently, you need a lot of optimization.
This is only for reference. If you need more, you should set more useful HTTP headers before compression so that the client can correctly process the response data.
package main

import (
    "archive/zip"
    "io"
    "net/http"
    "os"

    "github.com/gin-gonic/gin"
)

func main() {
    engine := gin.Default()
    engine.GET("/log.zip", func(c *gin.Context) {
        f, err := os.Open("./log.txt")
        if err != nil {
            c.String(http.StatusInternalServerError, err.Error())
            return
        }
        defer f.Close()
        info, err := f.Stat()
        if err != nil {
            c.String(http.StatusInternalServerError, err.Error())
            return
        }
        z := zip.NewWriter(c.Writer)
        head, err := zip.FileInfoHeader(info)
        if err != nil {
            c.String(http.StatusInternalServerError, err.Error())
            return
        }
        defer z.Close()
        w, err := z.CreateHeader(head)
        if err != nil {
            c.String(http.StatusInternalServerError, err.Error())
            return
        }
        _, err = io.Copy(w, f)
        if err != nil {
            c.String(http.StatusInternalServerError, err.Error())
            return
        }
    })
    engine.Run("127.0.0.1:8080")
}
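As the answer notes, it helps to set response headers before any archive bytes are written so that clients treat the body as a download. A minimal sketch of what could go right before zip.NewWriter (the filename value is just an example):
// Suggest a download filename and content type before writing the archive.
c.Header("Content-Disposition", `attachment; filename="log.zip"`)
c.Header("Content-Type", "application/zip")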
So after hours of tireless work I figured out my approach was bad, or maybe not possible with my level of knowledge, so here is a not-so-optimal solution. It works, and if your file is not large it should be okay for you.
So you have a file template.zip and you want to add extra files. My initial approach was to copy the whole file into memory and edit it from there, but I was having complications.
My next approach was to recreate the file in memory, file by file. To do that I need to know every file in the directory, so I used the code below to get all my files into a list:
root := "template"
err = filepath.Walk(root, func(path string, info os.FileInfo, err error) error {
if info.IsDir() {
return nil
}append(files,path)}
Now I have all my files, and I can create a buffer to hold all these files:
buf := new(bytes.Buffer)
// Create a new zip archive.
zipWriter := zip.NewWriter(buf)
Now, with the zip archive, I can write all my old files to it while at the same time copying the contents:
for _, file := range files {
    zipFile, err := zipWriter.Create(file)
    if err != nil {
        fmt.Println(err)
    }
    content, err := ioutil.ReadFile(file)
    if err != nil {
        log.Fatal(err)
    }
    // Convert []byte to string and print to screen
    // text := string(content)
    _, err = zipFile.Write(content)
    if err != nil {
        fmt.Println(err)
    }
}
At this point, we have our file in buf.Bytes().
The remaining code adds the new files and sends the response back to the client:
for _, appCode := range appPageCodeText {
    f, err := zipWriter.Create(filepath.fileextension)
    if err != nil {
        log.Fatal(err)
    }
    _, err = f.Write([]byte(appCode.Content))
}
err = zipWriter.Close()
if err != nil {
    fmt.Println(err)
}
w.Header().Set("Content-Disposition", "attachment; filename="+"template.zip")
w.Header().Set("Content-Type", "application/zip")
w.Write(buf.Bytes()) // 'Copy' the file to the client

Filling out the form fields in docx using golang library unioffice

I'm trying to fill out the form fields using the unioffice library. The document that I'm working with contains several paragraphs. The paragraphs contain several form fields.
I want to fill out all of the form fields in the document. Here is the code I'm running:
doc, err := document.Open("form.docx")
if err != nil {
    log.Fatalf("error opening form: %s", err)
}
for i := range doc.FormFields() {
    doc.FormFields()[i].SetValue("test")
}
doc.SaveToFile("filled-form.docx")
However, not all of the form fields were filled out.
Looks to me like a bug in func (d *Document) Save(w io.Writer) error. I can read and write to every one of the FormFields, but only the last FormField value in the paragraph actually gets saved to the file.
The code below works as expected until you save to file (that is, it prints out the previously set value). I saw you already opened a new issue on GitHub (link); I hope you have more luck with that.
package main

import (
    "io/ioutil"
    "log"
    "os"

    "github.com/unidoc/unioffice/document"
)

func main() {
    _, err := ioutil.ReadFile("filled-form.docx")
    if err == nil {
        err = os.Remove("filled-form.docx")
        if err != nil {
            log.Fatal(err)
        }
    }
    doc, err := document.Open("form.docx")
    if err != nil {
        log.Fatalf("error opening form: %s", err)
    }
    for _, f := range doc.FormFields() {
        if f.Type() == document.FormFieldType(1) {
            f.SetValue("test")
        }
    }
    for _, f := range doc.FormFields() {
        log.Println("-------------------")
        log.Println(f.Name())
        log.Println(f.Value())
    }
    err = doc.SaveToFile("filled-form.docx")
    if err != nil {
        log.Fatal(err)
    }
}
