How can I get p.Template from Firestore, which is a string, into the template.ParseFiles function? Is it possible to use the field value in that function to select the correct template file?
type Property struct {
    Name            string        `firestore:"name"`
    ApprovedOrigins []interface{} `firestore:"approvedOrigins"`
    Template        string        `firestore:"selected"`
}
As you can see above, the Firestore field name is selected.
func serveHandler(w http.ResponseWriter, r *http.Request, params map[string]string) {
    ctx := context.Background()
    client, err := firestore.NewClient(ctx, projectId)
    if err != nil {
        // TODO: Handle error.
        log.Println("FIREBASE ERROR:", err)
    }
    // collection group query in firestore
    q := client.CollectionGroup("data").Where("approvedOrigins", "array-contains", r.Host).Limit(1)
    // iterate through the document query
    iter := q.Documents(ctx)
    defer iter.Stop()
    for {
        doc, err := iter.Next()
        if err == iterator.Done {
            break
        }
        if err != nil {
            // TODO: Handle error.
            log.Println("FIREBASE ERROR:", err)
        }
        fmt.Println("Database connected...")
        var p Property
        if err := doc.DataTo(&p); err != nil {
            fmt.Println(err)
        }
        fmt.Println(p.Template) // This is not logging any data/string
        t, _ := template.ParseFiles(p.Template + ".html")
        fmt.Println(t.Execute(w, p)) // 504 error happens here
    }
}
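template.ParseFiles accepts any string path, so the field value can be used directly once p.Template is actually populated. A minimal sketch that checks the ParseFiles error instead of discarding it (the fallback template name and the http.Error response are assumptions, not part of the original code):

    name := p.Template
    if name == "" {
        name = "default" // assumed fallback template name
    }
    t, err := template.ParseFiles(name + ".html")
    if err != nil {
        // Surface the failure instead of executing a nil template.
        log.Println("TEMPLATE ERROR:", err)
        http.Error(w, "template not found", http.StatusInternalServerError)
        return
    }
    if err := t.Execute(w, p); err != nil {
        log.Println("EXECUTE ERROR:", err)
    }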
How can I assign the result of rendering a template so that I can send it as JSON for an AJAX call?
My example:
func Load(c echo.Context) error {
    params := map[string]any{
        "btn_style": "btn-light btn-sm",
    }
    if err := c.Render(http.StatusOK, "test.tmpl", params); err != nil {
        return err
    }
    payload := []byte(fmt.Sprintf(`{"payload":"%s"}`, c.Get("tmpl").(string)))
    fmt.Println(string(payload))
    return c.JSONBlob(http.StatusOK, payload)
}
The Render function:
func (t *TemplateRenderer) Render(w io.Writer, name string, data interface{}, c echo.Context) error {
    var buf bytes.Buffer
    err := t.templates.ExecuteTemplate(&buf, name, data)
    if err != nil {
        return err
    }
    c.Set("tmpl", buf.String())
    return t.templates.ExecuteTemplate(w, name, data)
}
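One caveat with building the JSON by hand: if the rendered template contains quotes or newlines, the Sprintf string produces invalid JSON. A safer variant (a sketch, not part of the original code) lets Echo's JSON encoder do the escaping:

    // Return the rendered template as a properly escaped JSON payload.
    return c.JSON(http.StatusOK, map[string]string{
        "payload": c.Get("tmpl").(string),
    })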
I am currently using Go WASM to upload a file to a server. During the upload it should emit a call to update the upload progress in the UI.
I am using the following struct to keep track of the progress:
type progressReporter struct {
    r                 io.Reader
    fileSizeEncrypted int64
    sent              int64
    file              js.Value
}

func (pr *progressReporter) Read(p []byte) (int, error) {
    n, err := pr.r.Read(p)
    pr.sent = pr.sent + int64(n)
    pr.report()
    return n, err
}

func (pr *progressReporter) report() {
    go js.Global().Get("dropzoneObject").Call("emit", "uploadprogress", pr.file, pr.sent*100/pr.fileSizeEncrypted, pr.sent)
}
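For reference, the same Read-wrapper pattern works outside WASM as well; a standalone sketch (the names here are made up for illustration, and total is assumed to be non-zero) that prints progress as a reader is consumed:

    // countingReader wraps an io.Reader and prints a percentage after each Read.
    type countingReader struct {
        r     io.Reader
        total int64
        read  int64
    }

    func (c *countingReader) Read(p []byte) (int, error) {
        n, err := c.r.Read(p)
        c.read += int64(n)
        fmt.Printf("progress: %d%%\n", c.read*100/c.total)
        return n, err
    }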
The upload happens in a promise:
func UploadChunk(this js.Value, args []js.Value) interface{} {
    [...]
    handler := js.FuncOf(func(this js.Value, args []js.Value) interface{} {
        resolve := args[0]
        reject := args[1]
        go func() {
            [...]
            body := new(bytes.Buffer)
            writer := multipart.NewWriter(body)
            part, err := writer.CreateFormFile("file", "encrypted.file")
            if err != nil {
                return err
            }
            _, err = part.Write(*data)
            if err != nil {
                return err
            }
            err = writer.Close()
            if err != nil {
                return err
            }
            pReporter := progressReporter{
                r:                 body,
                fileSizeEncrypted: fileSize,
                sent:              offset,
                file:              jsFile,
            }
            r, err := http.NewRequest("POST", "./uploadChunk", &pReporter)
            if err != nil {
                return err
            }
            r.Header.Set("Content-Type", writer.FormDataContentType())
            client := &http.Client{}
            resp, err := client.Do(r)
            if err != nil {
                return err
            }
            [...]
    }
}
Although the code works fine, all emit calls to update the UI arrive only after the POST request has finished. Is there any way to make these calls happen asynchronously during the upload?
The full source code can be found here
I'm trying to write a handler that updates one row each time, getting the data from a submit button.
Here is my code:
func RowHandler(res http.ResponseWriter, req *http.Request) {
    if req.Method != "POST" {
        http.ServeFile(res, req, "homepage.html")
        return
    }
    Person_id := req.FormValue("Person_id")
    stmt, err := db.Prepare("update Cityes set Status='right' where Person_id=?")
    if err != nil {
        log.Print("error ", err)
    }
    _, err = stmt.Exec(&Person_id)
    t, err := template.ParseFiles("city_update.html") // here I just want to show some text in an HTML page
    if err != nil {
        log.Fatal(err)
    }
    err = t.Execute(res, "/city_update")
}
Here, instead of the following:
err = t.Execute(res, "/city_update")
pass the data that should fill your template as the second argument to Execute, not a string (link to doc). For example:
err = t.Execute(res, struct{ ID string }{Person_id})
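To make this concrete, here is a minimal sketch of how the end of RowHandler and the template fit together (the template contents are an assumption; only the struct pattern comes from the suggestion above):

    // city_update.html (assumed contents):
    //     <p>Updated status for person {{.ID}}</p>

    t, err := template.ParseFiles("city_update.html")
    if err != nil {
        log.Fatal(err)
    }
    // The exported ID field is what {{.ID}} refers to inside the template.
    if err := t.Execute(res, struct{ ID string }{Person_id}); err != nil {
        log.Print("error ", err)
    }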
I am creating a utility that needs to be aware of all the datasets/tables that exist in my BigQuery project. My current code for getting this information is as follows (using the Go API):
func populateExistingTableMap(service *bigquery.Service, cloudCtx context.Context, projectId string) (map[string]map[string]bool, error) {
    tableMap := map[string]map[string]bool{}
    call := service.Datasets.List(projectId)
    //call.Fields("datasets/datasetReference")
    if err := call.Pages(cloudCtx, func(page *bigquery.DatasetList) error {
        for _, v := range page.Datasets {
            if tableMap[v.DatasetReference.DatasetId] == nil {
                tableMap[v.DatasetReference.DatasetId] = map[string]bool{}
            }
            table_call := service.Tables.List(projectId, v.DatasetReference.DatasetId)
            //table_call.Fields("tables/tableReference")
            if err := table_call.Pages(cloudCtx, func(page *bigquery.TableList) error {
                for _, t := range page.Tables {
                    tableMap[v.DatasetReference.DatasetId][t.TableReference.TableId] = true
                }
                return nil
            }); err != nil {
                return errors.New("Error Parsing Table")
            }
        }
        return nil
    }); err != nil {
        return tableMap, err
    }
    return tableMap, nil
}
For a project with about 5000 datasets, each with up to 10 tables, this code takes almost 15 minutes to return. Is there a faster way to iterate through the names of all existing datasets/tables? I have tried using the Fields method to return only the fields I need (you can see those lines commented out above), but that results in only 50 (exactly 50) of my datasets being returned.
Any ideas?
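One possible explanation for the Fields behaviour, offered here as a guess worth verifying rather than a confirmed cause: restricting the response to datasets/datasetReference can also drop nextPageToken from the partial response, in which case Pages stops after the first page (50 datasets by default). If that is the issue, the field selection would need to include the token explicitly, for example:

    // Assumed fix: keep nextPageToken in the partial response so pagination continues.
    call.Fields("datasets/datasetReference", "nextPageToken")
    table_call.Fields("tables/tableReference", "nextPageToken")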
Here is an updated version of my code, with concurrency, that reduced the processing time from about 15 minutes to 3 minutes.
func populateExistingTableMap(service *bigquery.Service, cloudCtx context.Context, projectId string) (map[string]map[string]bool, error) {
    tableMap := map[string]map[string]bool{}
    call := service.Datasets.List(projectId)
    //call.Fields("datasets/datasetReference")
    if err := call.Pages(cloudCtx, func(page *bigquery.DatasetList) error {
        var wg sync.WaitGroup
        wg.Add(len(page.Datasets))
        for _, v := range page.Datasets {
            if tableMap[v.DatasetReference.DatasetId] == nil {
                tableMap[v.DatasetReference.DatasetId] = map[string]bool{}
            }
            go func(service *bigquery.Service, datasetID string, projectId string) {
                defer wg.Done()
                table_call := service.Tables.List(projectId, datasetID)
                //table_call.Fields("tables/tableReference")
                if err := table_call.Pages(cloudCtx, func(page *bigquery.TableList) error {
                    for _, t := range page.Tables {
                        tableMap[datasetID][t.TableReference.TableId] = true
                    }
                    return nil // NOTE: returning a non-nil error stops pagination.
                }); err != nil {
                    // TODO: Handle error.
                    fmt.Println(err)
                }
            }(service, v.DatasetReference.DatasetId, projectId)
        }
        wg.Wait()
        return nil // NOTE: returning a non-nil error stops pagination.
    }); err != nil {
        // TODO: Handle error.
        return tableMap, err
    }
    return tableMap, nil
}
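If a single page contains many datasets, spawning one goroutine per dataset may hit API rate limits. A hedged variant of the loop above that bounds the fan-out with a buffered channel used as a semaphore (the limit of 20 concurrent requests is an arbitrary assumption):

    sem := make(chan struct{}, 20) // at most 20 concurrent Tables.List calls (assumed limit)
    for _, v := range page.Datasets {
        if tableMap[v.DatasetReference.DatasetId] == nil {
            tableMap[v.DatasetReference.DatasetId] = map[string]bool{}
        }
        wg.Add(1)
        sem <- struct{}{} // acquire a slot before starting the goroutine
        go func(datasetID string) {
            defer wg.Done()
            defer func() { <-sem }() // release the slot when done
            // ... same Tables.List / Pages body as above ...
        }(v.DatasetReference.DatasetId)
    }
    wg.Wait()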
I want to write a file cache in Go. I am using gob encoding and saving to a file, but my get function has a problem:
package main

import (
    "encoding/gob"
    "fmt"
    "os"
)

var (
    file = "tmp.txt"
)

type Data struct {
    Expire int64
    D      interface{}
}

type User struct {
    Id   int
    Name string
}

func main() {
    user := User{
        Id:   1,
        Name: "lei",
    }
    err := set(file, user, 10)
    if err != nil {
        fmt.Println(err)
        return
    }
    user = User{}
    err = get(file, &user)
    if err != nil {
        fmt.Println(err)
        return
    }
    // user does not change.
    fmt.Println(user)
}

func set(file string, v interface{}, expire int64) error {
    f, err := os.OpenFile(file, os.O_CREATE|os.O_WRONLY|os.O_TRUNC, 0600)
    if err != nil {
        return err
    }
    defer f.Close()
    // Wrap the data: save v in data.D.
    data := Data{
        Expire: expire,
        D:      v,
    }
    gob.Register(v)
    enc := gob.NewEncoder(f)
    err = enc.Encode(data)
    if err != nil {
        return err
    }
    return nil
}

func get(file string, v interface{}) error {
    f, err := os.OpenFile(file, os.O_RDONLY, 0600)
    if err != nil {
        return err
    }
    defer f.Close()
    var data Data
    dec := gob.NewDecoder(f)
    err = dec.Decode(&data)
    if err != nil {
        return err
    }
    // get v
    v = data.D
    fmt.Println(v)
    return nil
}
The get function receives the value through an interface{} parameter and I want to change the caller's value, but it does not change.
http://play.golang.org/p/wV7rBH028o
In order to insert an unknown value into v of type interface{}, you need to use reflection. This is somewhat involved, but if you want to support this in full, you can see how it's done by walking through the decoding process in some of the encoding packages (json, gob).
To get you started, here's a basic version of your get function using reflection. This skips a number of checks, and will only decode something that was encoded as a pointer.
func get(file string, v interface{}) error {
    f, err := os.OpenFile(file, os.O_RDONLY, 0600)
    if err != nil {
        return err
    }
    defer f.Close()
    rv := reflect.ValueOf(v)
    if rv.Kind() != reflect.Ptr || rv.IsNil() {
        panic("need a non nil pointer")
    }
    var data Data
    dec := gob.NewDecoder(f)
    err = dec.Decode(&data)
    if err != nil {
        return err
    }
    dv := reflect.ValueOf(data.D)
    if dv.Kind() != reflect.Ptr {
        panic("didn't decode a pointer")
    }
    rv.Elem().Set(dv.Elem())
    return nil
}
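Since this version only handles values that were encoded as pointers, the corresponding set call has to receive a pointer too. A usage sketch under that assumption, reusing the types from the question:

    user := User{Id: 1, Name: "lei"}
    // Encode &user so the dynamic type stored in data.D is *User.
    if err := set(file, &user, 10); err != nil {
        fmt.Println(err)
        return
    }
    var out User
    if err := get(file, &out); err != nil {
        fmt.Println(err)
        return
    }
    fmt.Println(out) // now prints {1 lei}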
I would actually suggest an easier way to handle this in your own code, which is to have the Get function return an interface{}. Since you will know what the possible types are at that point, you can use a type switch to assert the correct value.
An alternative approach is to return the value from the file directly:
func get(file string) (interface{}, error) {
    f, err := os.OpenFile(file, os.O_RDONLY, 0600)
    if err != nil {
        return nil, err
    }
    defer f.Close()
    var data Data
    dec := gob.NewDecoder(f)
    err = dec.Decode(&data)
    if err != nil {
        return nil, err
    }
    fmt.Println(data.D)
    return data.D, nil
}
Full working example: http://play.golang.org/p/178U_LVC5y
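With a Get that returns interface{}, such as the variant above, a type switch can recover the concrete value; a short sketch based on the User type from the example (User is the only type encoded there, the *User case is an assumption about pointer-encoded values):

    d, err := get(file)
    if err != nil {
        fmt.Println(err)
        return
    }
    switch v := d.(type) {
    case User:
        fmt.Println("got a User:", v.Name)
    case *User:
        fmt.Println("got a *User:", v.Name)
    default:
        fmt.Printf("unexpected type %T\n", v)
    }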