How can I fill the todos-Object with a for-loop?
type Row struct {
    Name      string
    Completed bool
    Due       time.Time
    Rcount    string
}

type Rows []Row

todos := Rows{
    Row{Name: "Write presentation"},
    Row{Name: "Host meetup"},
}
The question is a little hard to follow, but try starting out with something following this pattern (error handling omitted for brevity):
rows, _ := db.Query(query, args...)
defer rows.Close()

var todos Rows
for rows.Next() {
    var r Row
    rows.Scan(&r.Name, &r.Completed, &r.Due, &r.Rcount)
    todos = append(todos, r)
}
If you can clarify the question, perhaps we can provide better answers.
I think you are looking for the builtin function append.
Note that it is normally used in combination with an assignment, because it may have to allocate additional memory. A zero value slice works just fine, no need to call make.
steps := []string{"write program", "???", "profit"}

var rows []Row
for _, task := range steps {
    rows = append(rows, Row{Name: task})
}
If you want to loop over a sqlite3 query result, your loop will look different, but the x = append(x, ...) pattern will stay the same.
If you know in advance how big your slice is going to be, explicit initialization with make will be more efficient.
var rows = make([]Row, len(steps))
for i, task := range steps {
    rows[i] = Row{Name: task}
}
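Tying this back to the original todos example, a minimal sketch of filling the slice from a database query might look like this (the query text and column names are assumptions, and most error handling is omitted):
rows, err := db.Query("SELECT name, completed, due, rcount FROM todos")
if err != nil {
    // handle the error
}
defer rows.Close()

var todos Rows
for rows.Next() {
    var r Row
    if err := rows.Scan(&r.Name, &r.Completed, &r.Due, &r.Rcount); err != nil {
        // handle the error
    }
    todos = append(todos, r)
}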
I want to dynamically populate my internal struct for an atomic insert. I am new to Go, so pointers and referencing them are something I am still learning. I cannot figure out why this for-each loop is putting the same fields in twice. I tried removing the '&', but then I get a "cannot use type as *type" error. I checked to make sure my loop was hitting every object in the tradeArray, and it is. It looks like it is overwriting the previous object with the last one it loops over. How can I fix this?
func createTrade(w http.ResponseWriter, r *http.Request) {
    w.Header().Set("Content-Type", "application/json")
    var tradeArray []Trade
    if err := json.NewDecoder(r.Body).Decode(&tradeArray); err != nil {
        e := Error{Message: "Bad Request - Improper Types Passed"}
        w.WriteHeader(http.StatusBadRequest)
        _ = json.NewEncoder(w).Encode(e)
        return
    }
    for _, trade := range tradeArray {
        internal := InternalTrade{
            Id:    strconv.Itoa(rand.Intn(1000000)),
            Trade: &trade,
        }
        submit := TradeSubmitted{
            TradeId:       internal.Id,
            ClientTradeId: trade.ClientTradeId,
        }
        submitArray = append(submitArray, submit)
        trades = append(trades, internal)
    }
    if err := json.NewEncoder(w).Encode(submitArray); err != nil {
        e := Error{Message: "Internal Server Error"}
        w.WriteHeader(http.StatusInternalServerError)
        _ = json.NewEncoder(w).Encode(e)
        return
    }
}
edit: I was able to fix my problem by creating a new variable to hold the trade and referencing that variable in the struct creation. I am not sure how this is different from what I was doing above by just referencing "trade"; if someone could explain that, I would greatly appreciate it.
for _, trade := range tradeArray {
    p := trade
    internal := InternalTrade{
        Id:    strconv.Itoa(rand.Intn(1000000)),
        Trade: &p,
    }
    submit := TradeSubmitted{
        TradeId:       internal.Id,
        ClientTradeId: trade.ClientTradeId,
    }
    submitArray = append(submitArray, submit)
    trades = append(trades, internal)
}
Let's look at just these parts:
var tradeArray []Trade
// code that fills in `tradeArray` -- correct, and omitted here

for _, trade := range tradeArray {
    internal := InternalTrade{
        Id:    strconv.Itoa(rand.Intn(1000000)),
        Trade: &trade,
    }
    submit := TradeSubmitted{
        TradeId:       internal.Id,
        ClientTradeId: trade.ClientTradeId,
    }
    submitArray = append(submitArray, submit)
    trades = append(trades, internal)
}
This for loop, as you have seen, doesn't work the way you want. Here's a variant of it that's kind of similar, except that the variable trade has scope that extends beyond the for loop:
var trade Trade
for i := range tradeArray {
    trade = tradeArray[i]
    internal := InternalTrade{
        Id:    strconv.Itoa(rand.Intn(1000000)),
        Trade: &trade,
    }
    // do correct stuff with `internal`
}
Note that each internal object points to a single, shared trade variable, whose value gets overwritten on each trip through the loop. The result is that they all point to the one from the last trip through the loop.
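Here is a minimal, self-contained sketch of the same aliasing problem, reduced to a toy struct. (Note: on Go versions before 1.22 the loop variable is shared across iterations, which is what produces this behavior; from Go 1.22 on, the loop variable is per-iteration and the aliasing goes away.)
package main

import "fmt"

type T struct{ N int }

func main() {
    src := []T{{1}, {2}, {3}}
    var ptrs []*T
    for _, v := range src {
        ptrs = append(ptrs, &v) // before Go 1.22, every element is the address of the single loop variable v
    }
    for _, p := range ptrs {
        fmt.Println(p.N) // prints 3 3 3 before Go 1.22, 1 2 3 from Go 1.22 on
    }
}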
Your fix is itself OK: each trip through the loop, you make a new (different) p variable, and use &p, so that each internal.Trade has a different pointer to a different copy. You could also just do trade := trade inside the loop, to create a new unique trade variable. However, in this particular case, it may make the most sense to rewrite the loop this way:
for i := range tradeArray {
    internal := InternalTrade{
        Id:    strconv.Itoa(rand.Intn(1000000)),
        Trade: &tradeArray[i],
    }
    // do correct stuff with `internal`
}
That is, you already have len(tradeArray) different Trade objects: the slice header tradeArray gives you access to each tradeArray[i] instance, stored in the underlying array. You can just point to those directly.
There are various advantages and disadvantages to this approach. The big advantage is that you don't re-copy each trade at all: you just use the ones from the array that the slice header covers, that was allocated inside the json Decode function somewhere. The big disadvantage is that this underlying array cannot be garbage-collected as long as you retain any pointer to any of its elements. That disadvantage may have no cost at all, depending on the structure of the remaining code, but if it is a disadvantage, consider declaring tradeArray as:
var tradeArray []*Trade
so that the json Decode function allocates each one separately, and you can point to them one at a time without forcing the retention of the entire collection.
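A sketch of what that might look like, reusing the names from your handler (error handling abbreviated):
var tradeArray []*Trade
if err := json.NewDecoder(r.Body).Decode(&tradeArray); err != nil {
    // handle the bad request as before
}
for _, t := range tradeArray {
    internal := InternalTrade{
        Id:    strconv.Itoa(rand.Intn(1000000)),
        Trade: t, // t is already a *Trade allocated by Decode
    }
    // build submit and append as before
}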
How do I loop through a slice composed of UUIDs? My values come from the db via rows.Next().
Here's how I'm appending my uuid values to my slice (I really don't know if it's the proper way):
type Images struct {
    image_id uuid.UUID `gorm:"type:uuid;primary_key;"`
}

var oldimages Images
var new_images []Images
for olds.Next() {
    olds.Scan(&oldimages.image_id)
    new_images = append(new_images, Images{image_id: oldimages.image_id})
}
olds here is the rows I'm getting from gorm's Rows():
olds, err := db.Raw("SELECT images_received.image_id FROM old_pics").Rows()
defer olds.Close()
Here's the function for looping I was given, but it's for int; I don't know how to use this for uuid:
func islice(s []int, n int, f func([]int)) {
    for i := 0; i < len(s); i += n {
        var section []int
        if i > len(s)-n {
            section = s[i:]
        } else {
            section = s[i : i+n]
        }
        f(section)
    }
}
Any idea how I do this? Currently for uuid I'm using the "github.com/satori/go.uuid" lib.
I got the function from another SO question. My goal is to iterate over the rows, but rows.Next() doesn't allow that, I guess; in order to do that I thought I needed to append them into a slice, so I can get them by fours.
Hence this question.
All you need to do is replace []int with []uuid.UUID everywhere in your islice function, including the parameter types. The functionality of islice() is not bound to []int, if that's what your problem is.
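For example, here is the same function with []uuid.UUID, plus a call that processes the ids four at a time (untested; it assumes the satori uuid package you mentioned and the new_images slice from your question):
func islice(s []uuid.UUID, n int, f func([]uuid.UUID)) {
    for i := 0; i < len(s); i += n {
        var section []uuid.UUID
        if i > len(s)-n {
            section = s[i:]
        } else {
            section = s[i : i+n]
        }
        f(section)
    }
}

// collect the ids out of new_images, then walk them in groups of four
ids := make([]uuid.UUID, 0, len(new_images))
for _, img := range new_images {
    ids = append(ids, img.image_id)
}
islice(ids, 4, func(section []uuid.UUID) {
    // process up to four ids at a time
})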
I am using the Go flatbuffers interface for the first time. I find the instructions sparse.
I would like to write a vector of uint64s into a table. Ideally, I would like to store numbers directly in a vector without knowing how many there are up front (I'm reading them from sql.Rows iterator). I see the generated code for the table has functions:
func DatasetGridAddDates(builder *flatbuffers.Builder, dates flatbuffers.UOffsetT) {
    builder.PrependUOffsetTSlot(2, flatbuffers.UOffsetT(dates), 0)
}

func DatasetGridStartDatesVector(builder *flatbuffers.Builder, numElems int) flatbuffers.UOffsetT {
    return builder.StartVector(8, numElems, 8)
}
Can I first write the vector using (??), then use DatasetGridAddDates to record the resulting vector in the containing "DatasetGrid" table?
(caveat: I had not heard of FlatBuffers prior to reading your question)
If you do know the length in advance, storing a vector is done as explained in the tutorial:
name := builder.CreateString("hello")

q55310927.DatasetGridStartDatesVector(builder, len(myDates))
for i := len(myDates) - 1; i >= 0; i-- {
    builder.PrependUint64(myDates[i])
}
dates := builder.EndVector(len(myDates))

q55310927.DatasetGridStart(builder)
q55310927.DatasetGridAddName(builder, name)
q55310927.DatasetGridAddDates(builder, dates)
grid := q55310927.DatasetGridEnd(builder)
builder.Finish(grid)
Now what if you don’t have len(myDates)? On a toy example I get exactly the same output if I replace StartDatesVector(builder, len(myDates)) with StartDatesVector(builder, 0). Looking at the source code, it seems like the numElems may be necessary for alignment and for growing the buffer. I imagine alignment might be moot when you’re dealing with uint64, and growing seems to happen automatically on PrependUint64, too.
So, try doing it without numElems:
q55310927.DatasetGridStartDatesVector(builder, 0)
var n int
for rows.Next() { // use ORDER BY to make them go in reverse order
    var date uint64
    if err := rows.Scan(&date); err != nil {
        // ...
    }
    builder.PrependUint64(date)
    n++
}
dates := builder.EndVector(n)
and see if it works on your data.
Is there a way to write a generic array/slice deduplication in Go? For []int we can have something like this (from http://rosettacode.org/wiki/Remove_duplicate_elements#Go):
func uniq(list []int) []int {
    unique_set := make(map[int]bool, len(list))
    for _, x := range list {
        unique_set[x] = true
    }
    result := make([]int, len(unique_set))
    i := 0
    for x := range unique_set {
        result[i] = x
        i++
    }
    return result
}
But is there a way to extend it to support any slice, with a signature like:
func deduplicate(a []interface{}) []interface{}
I know that you can write that function with that signature, but then you can't actually use it on an []int: you need to create a []interface{}, copy everything from the []int into it, pass it to the function, get a []interface{} back, and then go through that new slice and put everything into a new []int.
My question is, is there a better way to do this?
While VonC's answer probably comes closest to what you really want, the only real way to do it in native Go without gen is to define an interface:
type IDList interface {
    // ID returns the id of the element at i.
    ID(i int) int
    // GetByID returns the element with the given id.
    GetByID(id int) interface{}
    Len() int
    // Insert adds the element to the list.
    Insert(interface{})
}

// Deduplicate puts the deduplicated list in dst.
func Deduplicate(dst, list IDList) {
    intList := make([]int, list.Len())
    for i := range intList {
        intList[i] = list.ID(i)
    }
    uniques := uniq(intList)
    for _, el := range uniques {
        dst.Insert(list.GetByID(el))
    }
}
Where uniq is the function from your OP.
This is just one possible example, and there are probably much better ones, but in general mapping each element to a unique "==able" ID and either constructing a new list or culling based on the deduplication of the IDs is probably the most intuitive way.
An alternate solution is to take in an []IDer where the IDer interface is just ID() int. However, that means that user code has to create the []IDer list and copy all the elements into that list, which is a bit ugly. It's cleaner for the user to wrap the list as an ID list rather than copy, but it's a similar amount of work either way.
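A rough sketch of that alternative, just to make the trade-off concrete (the names are illustrative):
type IDer interface {
    ID() int
}

// DeduplicateIDers keeps the first element seen for each ID.
func DeduplicateIDers(list []IDer) []IDer {
    seen := make(map[int]bool, len(list))
    var result []IDer
    for _, el := range list {
        if !seen[el.ID()] {
            seen[el.ID()] = true
            result = append(result, el)
        }
    }
    return result
}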
The only way I have seen that implemented in Go is with the clipperhouse/gen project,
gen is an attempt to bring some generics-like functionality to Go, with some inspiration from C#’s Linq and JavaScript’s underscore libraries
See this test:
// Distinct returns a new Thing1s slice whose elements are unique. See: http://clipperhouse.github.io/gen/#Distinct
func (rcv Thing1s) Distinct() (result Thing1s) {
    appended := make(map[Thing1]bool)
    for _, v := range rcv {
        if !appended[v] {
            result = append(result, v)
            appended[v] = true
        }
    }
    return result
}
But, as explained in clipperhouse.github.io/gen/:
gen generates code for your types, at development time, using the command line.
gen is not an import; the generated source becomes part of your project and takes no external dependencies.
You could do something close to this via an interface. Define an interface, say DeDupable, requiring a func, say UniqId() []byte, which you could then use to do the removing of dups. Your uniq func would then take a []DeDupable and work on it.
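A minimal sketch of that idea, assuming the UniqId() []byte method described above (one possible shape among many):
type DeDupable interface {
    UniqId() []byte
}

// uniq keeps the first element seen for each key returned by UniqId.
func uniq(list []DeDupable) []DeDupable {
    seen := make(map[string]bool, len(list))
    var result []DeDupable
    for _, v := range list {
        k := string(v.UniqId())
        if !seen[k] {
            seen[k] = true
            result = append(result, v)
        }
    }
    return result
}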
I'm writing code that allows data access from a database. However, I find myself repeating the same code for similar types and fields. How can I write generic functions for the same?
e.g. what I want to achieve ...
type Person struct{ FirstName string }
type Company struct{ Industry string }

func getItems(typ string, field string, val string) []interface{} {
    ...
}

var persons []Person
persons = getItems("Person", "FirstName", "John")

var companies []Company
companies = getItems("Company", "Industry", "Software")
So you're definitely on the right track with the idea of returning a slice of interface{} values. However, you're going to run into problems when you try accessing specific members or calling specific methods, because you're not going to know what type you're looking at. This is where type assertions are going to come in very handy. To extend your code a bit:
func getPerson(typ string, field string, val string) []Person {
    slice := getItems(typ, field, val)
    output := make([]Person, 0)
    for _, item := range slice {
        // Type assertion!
        thing, ok := item.(Person)
        if ok {
            output = append(output, thing)
        }
    }
    return output
}
So what that does is it performs a generic search, and then weeds out only those items which are of the correct type. Specifically, the type assertion:
thing, ok := item.(Person)
checks to see if the variable item is of type Person; if it is, it returns the value and true, otherwise it returns the zero value and false (thus checking ok tells us whether the assertion succeeded).
You can actually, if you want, take this a step further, and define the getItem() function in terms of another boolean function. Basically the idea would be to have getItem() run the function passed to it on each element in the database, and only add that element to the results if running the function on the element returns true:
func getItem(criteria func(interface{}) bool) []interface{} {
    output := make([]interface{}, 0)
    for _, item := range database {
        if criteria(item) {
            output = append(output, item)
        }
    }
    return output
}
(honestly, if it were me, I'd do a hybrid of the two which accepts a criteria function but also accepts the field and value strings)
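For what it's worth, a sketch of such a hybrid might look something like this (matching the field by name via the standard reflect package is just one possible implementation, and database is assumed to be the []interface{} from the earlier examples):
// getItems filters by an optional criteria function and, optionally,
// by a string field/value pair matched via reflection.
func getItems(field, val string, criteria func(interface{}) bool) []interface{} {
    var output []interface{}
    for _, item := range database {
        if criteria != nil && !criteria(item) {
            continue
        }
        if field != "" {
            v := reflect.Indirect(reflect.ValueOf(item))
            if v.Kind() != reflect.Struct {
                continue
            }
            f := v.FieldByName(field)
            if !f.IsValid() || f.Kind() != reflect.String || f.String() != val {
                continue
            }
        }
        output = append(output, item)
    }
    return output
}

// e.g. everyone named John:
// people := getItems("FirstName", "John", nil)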
joshlf13 has a great answer. I'd expand a little on it, though, to maintain some additional type safety. Instead of a criteria function, I would use a collector function.
// typed output array, no interfaces
output := []string{}

// collector that populates our output array as needed
func collect(i interface{}) {
    // The only non-typesafe part of the program is limited to this function
    if val, ok := i.(string); ok {
        output = append(output, val)
    }
}

// getItem uses the collector
func getItem(collect func(interface{})) {
    for _, item := range database {
        collect(item)
    }
}

getItem(collect) // perform our get and populate the output array from above
This has the benefit of not requiring you to loop through your interface{} slice after a call to getItem and do yet another cast.