I am using Sentry to monitor my Go Gin project, and I have defined the transactions for 3 APIs as follows:
span := sentry.StartSpan(c.Request.Context(), "C", sentry.TransactionName("C"))
defer span.Finish()

span := sentry.StartSpan(c.Request.Context(), "B", sentry.TransactionName("B"))
defer span.Finish()

span := sentry.StartSpan(c.Request.Context(), "A", sentry.TransactionName("A"))
defer span.Finish()
But in the Sentry dashboard I am getting mixed-up data: A and C are being shown in the same row, which should not be the case.
Can anyone please tell me why this is happening and how I can avoid it?
I'm studying Go and am a real newbie in this field.
I am facing a problem when I try to copy some value.
What I am doing is:
1) I want to get some response into [response] using an HTTP request:
httpClient := &http.Client{}
response, err := httpClient.Do(req)
if err != nil {
    panic(err)
}
2) After that, I want to save the value stored in response to 'origin.txt':
origin_, _ := ioutil.ReadAll(response.Body)
f_, err := os.Create("origin.txt")
if err != nil {
    panic(err)
}
f_.Write(origin_)
3) And I want to get a specific value out of it using the goquery package:
doc, err := goquery.NewDocumentFromReader(response.Body)
if err != nil {
    log.Fatal(err)
}
doc.Find(".className").Each(func(i int, s *goquery.Selection) {
    w.WriteString("============" + strconv.Itoa(i) + "============")
    s.Find("tr").Each(func(i int, s_ *goquery.Selection) {
        fmt.Println(s_.Text())
        w.WriteString(s_.Text())
    })
})
But in this case, I can get exactly the value I want in 2), but I get nothing in 3).
At first I thought the problem was that the response object in 3) is affected by the action in 2), because it is a reference object.
So I tried to copy it to another object and then do it again:
origin := *response
but I got the same result as before.
What should I do?
How can I assign a reference value to another one by its value?
Should I make the request twice, once for each step?
I actually don't see where you use shared resources between 2) and 3).
That being said, origin := *response won't buy you much. The data (response.Body) is an io.ReadCloser. ioutil.ReadAll() will consume and store all the data the stream has, and you only get to do this once.
However, you do have the data stored in origin. If you need another io.Reader for that data (say for step 3), you can make that byte slice look like an io.Reader again: bytes.NewReader(origin).
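Putting that together, a minimal sketch based on the code in the question (variable names kept from the question): read the body once, then reuse the bytes for both the file and goquery.

origin_, err := ioutil.ReadAll(response.Body)
if err != nil {
    panic(err)
}
response.Body.Close()

// 2) write the raw bytes to origin.txt
if err := ioutil.WriteFile("origin.txt", origin_, 0644); err != nil {
    panic(err)
}

// 3) give goquery a fresh reader over the same bytes
doc, err := goquery.NewDocumentFromReader(bytes.NewReader(origin_))
if err != nil {
    log.Fatal(err)
}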
Sorry if this is a dumb question. I'm using a service which was built with an Elasticsearch client for Go. I run the service, and now it seems like the Elasticsearch server has the index for the data. However, when I tried to query that data with http://129.94.14.234:9200/chromosomes/chromosome/1, I got {"_index":"chromosomes","_type":"chromosome","_id":"1","_version":1,"found":true,"_source":{"id":"1","length":249250621}}. I checked that the SQL query against the database returns that data. Now the question is: how do I check that my Elasticsearch index has that data? Or, if anyone can tell me what might be wrong in the code, that would be great as well.
Here's the code that I assume adds the documents to the chromosomes index:
func (c ChromosomeIndexer) AddDocuments(db *sql.DB, client *elastic.Client, coordID int) {
    sqlQuery := fmt.Sprintf("SELECT seq_region.name, seq_region.length FROM seq_region WHERE seq_region.`name` REGEXP '^[[:digit:]]{1,2}$|^[xXyY]$|(?i)^mt$' AND seq_region.`coord_system_id` = %d", coordID)
    stmtOut, err := db.Prepare(sqlQuery)
    check(err)
    defer stmtOut.Close()

    rows, err := stmtOut.Query()
    check(err)
    defer rows.Close()

    chromoFn := func(rows *sql.Rows, bulkRequest *elastic.BulkService) {
        var name string
        var length int
        err = rows.Scan(&name, &length)
        check(err)
        chromo := Chromosome{ID: name, Length: length}
        fmt.Printf("chromoID: %s\n", chromo.ID)
        req := elastic.NewBulkIndexRequest().
            OpType("index").
            Index("chromosomes").
            Type("chromosome").
            Id(chromo.ID).
            Doc(chromo)
        bulkRequest.Add(req)
    }
    elasticutil.IterateSQL(rows, client, chromoFn)
}
This service has other indices that I can query with no problem; I only have a problem when querying the chromosomes data.
Please let me know if I need to post more code to give more context on the problem. I just started with Go and Elasticsearch, and I tried reading the documentation, but it only led to more confusion.
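One way to sanity-check what actually ended up in the index, independent of the Go service, is to hit Elasticsearch's standard _count and _search endpoints. A minimal sketch in Go, assuming the same host and index name as in the question:

package main

import (
    "fmt"
    "io/ioutil"
    "net/http"
)

func main() {
    // _count returns the number of documents in the index;
    // _search with no query returns the first few documents.
    for _, url := range []string{
        "http://129.94.14.234:9200/chromosomes/_count",
        "http://129.94.14.234:9200/chromosomes/_search?size=5",
    } {
        resp, err := http.Get(url)
        if err != nil {
            panic(err)
        }
        body, _ := ioutil.ReadAll(resp.Body)
        resp.Body.Close()
        fmt.Println(url)
        fmt.Println(string(body))
    }
}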
I am trying to send HTML emails using Golang, but instead of using the native Golang html/template package I am trying to do it with Pongo2.
In this question: Is it possible to create email templates with CSS in Google App Engine Go?
The user provides this example, which uses html/template:
var tmpl = template.Must(template.ParseFiles("templates/email.html"))

buff := new(bytes.Buffer)
if err = tmpl.Execute(buff, struct{ Name string }{"Juliet"}); err != nil {
    panic(err.Error())
}

msg := &mail.Message{
    Sender:   "romeo@montague.com",
    To:       []string{"Juliet <juliet@capulet.org>"},
    Subject:  "See you tonight",
    Body:     "...you put here the non-HTML part...",
    HTMLBody: buff.String(),
}

c := appengine.NewContext(r)
if err := mail.Send(c, msg); err != nil {
    c.Errorf("Alas, my user, the email failed to sendeth: %v", err)
}
What I am trying to do
var tmpl = pongo2.Must(pongo2.FromFile("template.html"))
buff := new(bytes.Buffer)
tmpl.Execute(buff, pongo2.Context{"data": "best-data"}, w)
The problem here is that Pongo2's Execute() only accepts the context data, not the buff.
My end goal is to write my templates using Pongo2 and render the HTML in a way that I can also use for sending my emails.
My question is: what am I doing wrong? Is what I am trying to achieve possible? If I can find a way to render that HTML into a buffer, I can later use buff.String() and put it in the HTML body.
Use ExecuteWriterUnbuffered instead of Execute:
tmpl.ExecuteWriterUnbuffered(pongo2.Context{"data": "best-data"}, buff)
Not sure what w is doing in your example. If it's another Writer that you'd like to write to, you can use an io.MultiWriter.
// writes to w2 will go to both buff and w
w2 := io.MultiWriter(buff, w)
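Putting it together, a rough sketch that renders the Pongo2 template into the buffer and then uses buff.String() as the HTML body (the mail setup is copied from the example above):

var tmpl = pongo2.Must(pongo2.FromFile("template.html"))

buff := new(bytes.Buffer)
if err := tmpl.ExecuteWriterUnbuffered(pongo2.Context{"data": "best-data"}, buff); err != nil {
    panic(err)
}

msg := &mail.Message{
    Sender:   "romeo@montague.com",
    To:       []string{"Juliet <juliet@capulet.org>"},
    Subject:  "See you tonight",
    Body:     "...you put here the non-HTML part...",
    HTMLBody: buff.String(), // the rendered Pongo2 template
}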
Newbie question:
I want to print various fields of a library type (is that the correct name? reflect.TypeOf(servers) gives []lib.Server).
I want to do something like this, but this obviously does not work:
servers, err := GetClient().GetServers() // call to external API
serverVariables := []string{}
serverVariables = append(serverVariables, "Name")
serverVariables = append(serverVariables, "IPAddress")

for _, server := range servers {
    for _, element := range serverVariables {
        fmt.Println(server.element)
    }
}
What I can already do is the following (but I want to do it using the approach above):
servers, err := GetClient().GetServers() // call to external API
for _, server := range servers {
    fmt.Println(server.Name)
    fmt.Println(server.IPAddress)
}
giving the following output:
ServerNameOne
192.168.0.1
ServerNameTwo
192.168.0.2
Reflection is what you probably want to use:
for _, server := range servers {
    v := reflect.ValueOf(server)
    for _, element := range serverVariables {
        fmt.Println(v.FieldByName(element))
    }
}
You should also change serverVariables initialization to be serverVariables := []string{}
Playground example: https://play.golang.org/p/s_kzIJ7-B7
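If one of the names in serverVariables doesn't match a struct field, FieldByName returns the zero Value, so a typo just prints an invalid-Value placeholder. A small variation of the snippet above that guards against that:

for _, server := range servers {
    v := reflect.ValueOf(server)
    for _, name := range serverVariables {
        f := v.FieldByName(name)
        if !f.IsValid() {
            fmt.Printf("no field %q on %s\n", name, v.Type())
            continue
        }
        fmt.Println(f.Interface())
    }
}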
It seems to me that you have experience in some dynamic language like Python or JavaScript. Go is compiled and statically typed. Besides reflection being slower, when you use it the compiler can't help you find basic errors in your code, and, most importantly, you lose the type of the accessed variable.
More info on http://blog.golang.org/laws-of-reflection
So I strongly recommend that you keep your current approach:
for _, server := range servers {
    fmt.Println(server.Name)
    fmt.Println(server.IPAddress)
}
I am importing data to Neo4j using neoism, and I have some performance issues importing big data: 1000 nodes take about 8s. Here is the part of the code that imports 100 nodes.
It is quite basic code and needs improvement; can anyone help me improve it?
var wg sync.WaitGroup
for _, itemProps := range items {
    wg.Add(1)
    go func(i interface{}) {
        s := time.Now()
        cypher := neoism.CypherQuery{
            Statement: fmt.Sprintf(`
                CREATE (%v)
                SET i = {Props}
                RETURN i
            `, ItemLabel),
            Parameters: neoism.Props{"Props": i},
        }
        if err := database.ExecuteCypherQuery(cypher); err != nil {
            utils.Error(fmt.Sprintf("error ImportItemsNeo4j! %v", err))
            wg.Done()
            return
        }
        utils.Info(fmt.Sprintf("import Item success! took: %v", time.Since(s)))
        wg.Done()
    }(itemProps)
}
wg.Wait()
AFAIK neoism still uses old APIs; you should use cq instead: https://github.com/go-cq/cq
You should also batch your creates, i.e. either send multiple statements per request (e.g. 100 statements per request), or, even better, send a list of parameters to a single Cypher query, where {data} is a list like [{id:1},{id:2},...]:

UNWIND {data} AS props
CREATE (n:Label) SET n = props
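A rough sketch of that single-query variant with neoism, reusing the pieces from the question (items, ItemLabel and database.ExecuteCypherQuery are assumed to be the same as above, and ItemLabel is assumed to expand to a node named i, as in the original query):

cypher := neoism.CypherQuery{
    Statement: fmt.Sprintf(`
        UNWIND {data} AS props
        CREATE (%v)
        SET i = props
    `, ItemLabel),
    Parameters: neoism.Props{"data": items},
}
if err := database.ExecuteCypherQuery(cypher); err != nil {
    utils.Error(fmt.Sprintf("error ImportItemsNeo4j! %v", err))
}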