I'm using gob to decode several objects fetched from a key/value database called "bitcask".
When I try to decode them one by one, the gob decoder returns the "extra data in buffer" error, but only for the first element that was added to the database, and only after I've already fetched all of them at least once.
What am I doing wrong?
// Just the nest of all the bugs.
type Nest struct {
    buf bytes.Buffer
    dec *gob.Decoder
    enc *gob.Encoder
    db  *bitcask.Bitcask
    sync.Mutex
}

// Saves the bug into the nest.
func (n *Nest) Put(id int64, b Bug) error {
    n.Lock()
    defer n.Unlock()
    if err := n.enc.Encode(b); err != nil {
        return err
    }
    defer n.buf.Reset()
    return n.db.Put(itosl(id), n.buf.Bytes())
}

// Retrieves a bug from the nest.
func (n *Nest) Get(id int64) (Bug, error) {
    var bg Bug
    b, err := n.db.Get(itosl(id))
    if err != nil {
        return bg, err
    }
    n.Lock()
    defer n.Unlock()
    if _, err = n.buf.Write(b); err != nil {
        return bg, err
    }
    defer n.buf.Reset()
    return bg, n.dec.Decode(&bg) // error: extra data in buffer
}
Note: the "Get" function get's called for every ID in the database everytime the API endpoint is called.
The structs I'm encoding/decoding are the following:
type Comment struct {
    Date   int64  `json:"date"`
    Text   string `json:"text"`
    Author string `json:"author"`
}

type Bug struct {
    Id       int64     `json:"id"`
    Body     string    `json:"body"`
    Open     bool      `json:"open"`
    Tags     []string  `json:"tags"`
    Date     int64     `json:"date"`
    Comments []Comment `json:"comments"`
    Author   string    `json:"author"`
}
Also, when I try to use a new decoder and buffer each time the "Get" function gets called (as I've seen suggested in an answer to a similar question), the decode operation fails with the following error: gob: unknown type id or corrupted data.
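The per-call variant of Get looks roughly like this (a sketch; the exact code is in the repo linked below):
func (n *Nest) Get(id int64) (Bug, error) {
    var bg Bug
    b, err := n.db.Get(itosl(id))
    if err != nil {
        return bg, err
    }
    // Decode with a fresh buffer and decoder built around this record only.
    dec := gob.NewDecoder(bytes.NewReader(b))
    return bg, dec.Decode(&bg) // error: gob: unknown type id or corrupted data
}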
Refer to this link for the complete source code:
https://github.com/NicoNex/ladybug
I'm using range to loop through an array of structs to extract data which will be used as a URL parameter for my API calls. Within this loop, I'm trying to push response data from one struct to another.
I'm able to get everything working except for moving the data from one struct to the other, and I'm not entirely sure how to solve the errors I keep getting. I've tried multiple methods and seem to be stuck on something I didn't consider too hard until now. In my code I'm using append, but I'm not sure that's the correct way to proceed.
Here is my code:
models.go
// Here is my existing struct, with populated data that I get from a CSV
type TravelItenaries struct {
    Origin                string
    Destination           string
    Flight_num            string
    Origin_latitude       string
    Origin_longitude      string
    Destination_latitude  string
    Destination_longitude string
    Origin_weather        string
    Destination_weather   string
    Coordinates_ori       string
    Coordinates_dest      string
    Temp_c_ori            string
    Temp_f_ori            string
    Temp_c_dest           string
    Temp_f_dest           string
}
// Here is the response data that I expect to get from my API calls.
// I'm trying to "push" Temp_c_dest and Temp_f_dest data into TravelItenaries.Temp_c_dest and TravelItenaries.Temp_f_dest,
// while also changing the data types to fit the struct above.
type Response struct {
    Current struct {
        LastUpdatedEpoch int     `json:"last_updated_epoch"`
        LastUpdated      string  `json:"last_updated"`
        Temp_c_dest      float64 `json:"temp_c"`
        Temp_f_dest      float64 `json:"temp_f"`
        IsDay            int     `json:"is_day"`
    } `json:"current"`
}
weather.go
func (s *Server) getWeather(w http.ResponseWriter, r *http.Request) {
    // open file
    f, err := os.Open("challenge_dataset.csv")
    if err != nil {
        responses.ERROR(w, http.StatusBadRequest, fmt.Errorf("helpful error"))
        return
    }
    // remember to close the file at the end of the program
    defer f.Close()

    // read csv values using csv.Reader
    csvReader := csv.NewReader(f)
    data, err := csvReader.ReadAll()
    if err != nil {
        responses.ERROR(w, http.StatusBadRequest, fmt.Errorf("helpful error"))
        return
    }

    // convert records to array of structs
    travelItenaries := createTravelItenaries(data)
    // remove duplicate flight records
    cleanTravelItenaries := remDupKeys(travelItenaries)

    // set up params for API get request
    params := url.Values{
        "key": []string{"xxx"},
        "q":   []string{""},
    }

    // Construct URL for API request
    u := &url.URL{
        Scheme:   "https",
        Host:     "api.weatherapi.com",
        Path:     "/v1/current.json",
        RawQuery: params.Encode(),
    }

    client := &http.Client{}

    // Will need this to populate the params using a range over a struct
    values := u.Query()

    // loop through cleaned data set
    for _, service := range cleanTravelItenaries {
        // dynamically acquire data from struct to pass as parameter
        values.Set("q", service.Coordinates_dest)
        u.RawQuery = values.Encode()

        req, err := http.NewRequest("GET", u.String(), nil)
        if err != nil {
            responses.ERROR(w, http.StatusBadRequest, fmt.Errorf("helpful error"))
            return
        }
        resp, err := client.Do(req)
        if err != nil {
            responses.ERROR(w, http.StatusBadRequest, fmt.Errorf("helpful error"))
            return
        }

        body, err := ioutil.ReadAll(resp.Body)

        // create empty struct to parse response data into using Unmarshal
        var responseData models.Response
        json.Unmarshal(body, &responseData)

        // Here is the issue: I don't think append is the correct procedure here?
        // I simply need to pass this response data to my already existing struct.
        service.Temp_c_dest = append(responseData.Current.Temp_c_dest, cleanTravelItenaries)
        service.Temp_f_dest = append(responseData.Current.Temp_f_dest, cleanTravelItenaries)
    }
}
Both append statements at the end of the range loop produce the same error:
first argument to append must be slice; have float64
Also note that the TravelItenaries struct uses the string type for:
Temp_c_dest string
Temp_f_dest string
so I also need to convert those fields from float64 to string.
How can I copy the Temp_c_dest and Temp_f_dest fields from the API response struct into the corresponding TravelItenaries fields while changing the data types?
EDIT:
I've managed to get this somewhat working, but only inside the for loop. The data is not being saved outside the function.
service.Temp_f_dest = strconv.FormatFloat(responseData.Current.Temp_f_dest, 'g', -1, 64)
service.Temp_c_dest = strconv.FormatFloat(responseData.Current.Temp_c_dest, 'g', -1, 64)
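Presumably service is a per-iteration copy of the slice element, so one thing to try (a sketch, not tested, assuming cleanTravelItenaries is a slice of TravelItenaries) is to iterate by index and write back into the slice itself:
for i := range cleanTravelItenaries {
    // ... same request/response handling as above, using cleanTravelItenaries[i].Coordinates_dest ...

    // Write the converted values back into the slice element, not into a loop-local copy.
    cleanTravelItenaries[i].Temp_f_dest = strconv.FormatFloat(responseData.Current.Temp_f_dest, 'g', -1, 64)
    cleanTravelItenaries[i].Temp_c_dest = strconv.FormatFloat(responseData.Current.Temp_c_dest, 'g', -1, 64)
}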
I am a little bit confused here; although I have searched a lot on this, something is clearly missing from my knowledge, so I am asking for your help.
I have created a Hyperledger Fabric network and installed a chaincode on it. I want to write a function that retrieves the keys of all the world state entries. I have already done this with a bytes.Buffer and it worked, but now I want to do it with a struct.
So, I created the following struct that has only the key:
type WSKeys struct {
    Key       string `json:"key"`
    Namespace string `json:"Namespace"`
}
And this is my function:
func (s *SmartContract) getAllWsDataStruct(APIstub shim.ChaincodeStubInterface, args []string) sc.Response {
    var keyArray []WSKeys

    resultsIterator, err := APIstub.GetQueryResult("{\"selector\":{\"_id\":{\"$ne\": null }} }")
    if err != nil {
        return shim.Error("Error occured when trying to fetch data: " + err.Error())
    }

    for resultsIterator.HasNext() {
        // Get the next record
        queryResponse, err := resultsIterator.Next()
        if err != nil {
            return shim.Error(err.Error())
        }
        fmt.Println(queryResponse)

        var qry_key_json WSKeys
        json.Unmarshal([]byte(queryResponse), &qry_key_json)
        keyArray = append(keyArray, qry_key_json)
    }
    defer resultsIterator.Close()

    all_bytes, _ := json.Marshal(keyArray)
    fmt.Println(keyArray)
    return shim.Success(all_bytes)
}
When executing the above I get the following error:
cannot convert queryResponse (type *queryresult.KV) to type []byte
I can get the results correctly if, for example, I do this:
func (s *SmartContract) getAllWsDataStruct(APIstub shim.ChaincodeStubInterface, args []string) sc.Response {
    var keyArray []string

    resultsIterator, err := APIstub.GetQueryResult("{\"selector\":{\"_id\":{\"$ne\": null }} }")
    if err != nil {
        return shim.Error("Error occured when trying to fetch data: " + err.Error())
    }

    for resultsIterator.HasNext() {
        // Get the next record
        queryResponse, err := resultsIterator.Next()
        if err != nil {
            return shim.Error(err.Error())
        }
        fmt.Println(queryResponse)
        keyArray = append(keyArray, queryResponse.Key)
    }
    defer resultsIterator.Close()

    all_bytes, _ := json.Marshal(keyArray)
    fmt.Println(keyArray)
    return shim.Success(all_bytes)
}
But why do I get the above error when trying to add the queryResponse to a custom struct?
Do I need to add it to a struct of exactly its type?
Can someone please explain what I am missing here?
The error message is verbose enough to indicate that your []byte conversion failed because queryResponse has type *queryresult.KV, which with a bit of lookup turns out to be a struct type. In Go you cannot natively convert a struct instance to its constituent bytes without encoding it, using gob or other means.
Perhaps your intention was to use the Key field of the struct for unmarshalling:
json.Unmarshal([]byte(queryResponse.Key), &qry_key_json)
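Alternatively, since the iterator result already exposes the key and namespace as plain fields, you could skip unmarshalling and populate the struct directly, roughly like this (a sketch based on the queryresult.KV fields):
for resultsIterator.HasNext() {
    queryResponse, err := resultsIterator.Next()
    if err != nil {
        return shim.Error(err.Error())
    }
    // Build the struct from the fields the iterator already provides.
    keyArray = append(keyArray, WSKeys{
        Key:       queryResponse.Key,
        Namespace: queryResponse.Namespace,
    })
}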
I'm trying to build a generic function which will parse input (in JSON) into a specified structure. The structure may vary at run-time, based on parameters which are passed to the function. I'm currently trying to achieve this by passing an object of the right type and using reflect.New() to create a new output object of the same type.
I'm then parsing the JSON into this object, and scanning the fields.
If I create the object and specify the type in code, everything works. If I pass an object and try to create a replica, I get an "invalid indirect" error a few steps down (see code).
import (
    "encoding/json"
    "fmt"
    "reflect"
    "strings"
)

type Test struct {
    FirstName *string `json:"FirstName"`
    LastName  *string `json:"LastName"`
}
func genericParser(incomingData *strings.Reader, inputStructure interface{}) (interface{}, error) {
    //******* Use the line below and things work *******
    //parsedInput := new(Test)
    //******* Use vvv the line below and things don't work *******
    parsedInput := reflect.New(reflect.TypeOf(inputStructure))

    decoder := json.NewDecoder(incomingData)
    err := decoder.Decode(&parsedInput)
    if err != nil {
        //parsing error
        return nil, err
    }

    //******* This is the line that generates the error "invalid indirect of parsedInput (type reflect.Value)" *******
    contentValues := reflect.ValueOf(*parsedInput)
    for i := 0; i < contentValues.NumField(); i++ {
        //do stuff with each field
        fmt.Printf("Field name was: %s\n", reflect.TypeOf(parsedInput).Elem().Field(i).Name)
    }
    return parsedInput, nil
}
func main() {
    inputData := strings.NewReader("{\"FirstName\":\"John\", \"LastName\":\"Smith\"}")
    exampleObject := new(Test)

    processedData, err := genericParser(inputData, exampleObject)
    if err != nil {
        fmt.Println("Parsing error")
    } else {
        fmt.Printf("Success: %v", processedData)
    }
}
If I can't create a replica of the object, then a way of updating / returning the one supplied would be feasible. The key thing is that this function must be completely agnostic to the different structures available.
reflect.New isn't a direct analog to new: it can't return a specific type, it can only return a reflect.Value. This means that you are attempting to unmarshal into a *reflect.Value, which obviously isn't going to work (and even if it did, your code would have passed in a **Type, which isn't what you want either).
Use parsedInput.Interface() to get the underlying value after creating the new value to unmarshal into. You then don't need to reflect on the same value a second time, as that would be a reflect.Value of a reflect.Value, which again isn't going to do anything useful.
Finally, you need to use parsedInput.Interface() before you return, otherwise you are returning the reflect.Value rather than the value of the input type.
For example:
func genericParser(incomingData io.Reader, inputStructure interface{}) (interface{}, error) {
    parsedInput := reflect.New(reflect.TypeOf(inputStructure).Elem())

    decoder := json.NewDecoder(incomingData)
    err := decoder.Decode(parsedInput.Interface())
    if err != nil {
        return nil, err
    }

    for i := 0; i < parsedInput.Elem().NumField(); i++ {
        fmt.Printf("Field name was: %s\n", parsedInput.Type().Elem().Field(i).Name)
    }
    return parsedInput.Interface(), nil
}
https://play.golang.org/p/CzDrj6sgQNt
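On the caller side, the returned interface{} can then be type-asserted back to the concrete type, for example:
processedData, err := genericParser(inputData, exampleObject)
if err != nil {
    fmt.Println("Parsing error")
    return
}
// genericParser returned a *Test wrapped in an interface{}; assert it back to use the fields.
if t, ok := processedData.(*Test); ok {
    fmt.Printf("Success: %s %s\n", *t.FirstName, *t.LastName)
}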
Let's say I have the following JSON:
{
    "name": "John",
    "birth_date": "1996-10-07"
}
and I want to decode it into the following struct:
type Person struct {
    Name      string    `json:"name"`
    BirthDate time.Time `json:"birth_date"`
}
like this
person := Person{}
decoder := json.NewDecoder(req.Body)
if err := decoder.Decode(&person); err != nil {
    log.Println(err)
}
which gives me the error parsing time ""1996-10-07"" as ""2006-01-02T15:04:05Z07:00"": cannot parse """ as "T"
If I were to parse it manually, I would do it like this:
t, err := time.Parse("2006-01-02", "1996-10-07")
But when the time value comes from a JSON string, how do I get the decoder to parse it in the above format?
That's a case where you need to implement custom marshal and unmarshal functions:
UnmarshalJSON(b []byte) error { ... }
MarshalJSON() ([]byte, error) { ... }
By following the example in the Go documentation for the json package, you get something like this:
// First create a named type based on time.Time
type JsonBirthDate time.Time

// Add that to your struct
type Person struct {
    Name      string        `json:"name"`
    BirthDate JsonBirthDate `json:"birth_date"`
}

// Implement the Marshaler and Unmarshaler interfaces
func (j *JsonBirthDate) UnmarshalJSON(b []byte) error {
    s := strings.Trim(string(b), "\"")
    t, err := time.Parse("2006-01-02", s)
    if err != nil {
        return err
    }
    *j = JsonBirthDate(t)
    return nil
}

func (j JsonBirthDate) MarshalJSON() ([]byte, error) {
    return json.Marshal(time.Time(j))
}

// Maybe a Format function for printing your date
func (j JsonBirthDate) Format(s string) string {
    t := time.Time(j)
    return t.Format(s)
}
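A quick usage example with the Person struct above:
var p Person
data := []byte(`{"name":"John","birth_date":"1996-10-07"}`)
if err := json.Unmarshal(data, &p); err != nil {
    log.Println(err)
}
// Prints: John 1996-10-07
fmt.Println(p.Name, p.BirthDate.Format("2006-01-02"))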
If there are lots of structs, implementing custom marshal and unmarshal functions for every one of them is a lot of work. You can use a library instead, such as the json-iterator extension jsontime:
import "github.com/liamylian/jsontime"
var json = jsontime.ConfigWithCustomTimeFormat
type Book struct {
Id int `json:"id"`
UpdatedAt *time.Time `json:"updated_at" time_format:"sql_date" time_utc:"true"`
CreatedAt time.Time `json:"created_at" time_format:"sql_datetime" time_location:"UTC"`
}
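Marshalling then goes through the configured instance instead of the standard encoding/json package (a sketch, assuming jsontime's jsoniter-compatible Marshal/Unmarshal API):
now := time.Now()
book := Book{Id: 1, UpdatedAt: &now, CreatedAt: now}
// json here is the jsontime.ConfigWithCustomTimeFormat instance declared above.
data, err := json.Marshal(&book)
if err != nil {
    log.Println(err)
}
fmt.Println(string(data))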
I wrote a package for handling yyyy-MM-dd and yyyy-MM-ddThh:mm:ss dates at https://github.com/a-h/date
It uses the named-type approach from the answer above, then implements the MarshalJSON and UnmarshalJSON functions with a few alterations.
// MarshalJSON outputs JSON.
func (d YYYYMMDD) MarshalJSON() ([]byte, error) {
    return []byte("\"" + time.Time(d).Format(formatStringYYYYMMDD) + "\""), nil
}

// UnmarshalJSON handles incoming JSON.
func (d *YYYYMMDD) UnmarshalJSON(b []byte) (err error) {
    if err = checkJSONYYYYMMDD(string(b)); err != nil {
        return
    }
    t, err := time.ParseInLocation(parseJSONYYYYMMDD, string(b), time.UTC)
    if err != nil {
        return
    }
    *d = YYYYMMDD(t)
    return
}
It's important to parse in the correct timezone. My code assumes UTC, but you may wish to use the computer's timezone for some reason.
I also found that solutions which involved using the time.Parse function leaked Go's internal mechanisms as an error message which clients didn't find helpful, for example: cannot parse "sdfdf-01-01" as "2006". That's only useful if you know that the server is written in Go, and that 2006 is the example date format, so I put in more readable error messages.
I also implemented the Stringer interface so that it gets pretty printed in log or debug messages.
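A rough usage sketch; the import path comes from the link above, and the exported type name is assumed to match the YYYYMMDD type in the snippet:
import "github.com/a-h/date"

type Person struct {
    Name      string        `json:"name"`
    BirthDate date.YYYYMMDD `json:"birth_date"`
}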
I would like to keep track of the progress in an io.Pipe. I came up with the following struct ProgressPipeReader which wraps io.PipeReader, storing the progress in bytes inside ProgressPipeReader.progress:
type ProgressPipeReader struct {
    io.PipeReader
    progress int64
}

func (pr *ProgressPipeReader) Read(data []byte) (int, error) {
    n, err := pr.PipeReader.Read(data)
    if err == nil {
        pr.progress += int64(n)
    }
    return n, err
}
Unfortunately, I can't seem to use it to wrap an io.PipeReader. The following snippet
pr, pw := io.Pipe()
pr = &ProgressPipeReader{PipeReader: pr}
yields the error
cannot use pr (type *io.PipeReader) as type io.PipeReader in field value
Any hints?
As the error describes, you are trying to store a *io.PipeReader value in an io.PipeReader field. You can fix this by updating your struct definition to use the correct type:
type ProgressPipeReader struct {
    *io.PipeReader
    progress int64
}
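The wrapper then needs its own variable, since its type differs from *io.PipeReader. For example:
pr, pw := io.Pipe()
reader := &ProgressPipeReader{PipeReader: pr}

go func() {
    pw.Write([]byte("hello"))
    pw.Close()
}()

// Drain the pipe through the wrapper; progress ends up at 5.
io.Copy(io.Discard, reader)
fmt.Println(reader.progress)
Note that a Reader is allowed to return n > 0 together with an error, so you may want to add n to progress before checking err in the Read method.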