I have the Eclipse IDE set up and ready to go.
What I am looking for is to print, for debugging, the data returned by calling search.RetrieveFeeds().
In essence, I'd like to see a slice of Feed structs so I know it processed the JSON data file.
When I execute I get this error:
main\main.go:10:38: multiple-value search.RetrieveFeeds() in single-value context
I looked here: "Multiple values in single-value context", but it was a bit over my head.
Here is what I have so far as my architecture:
Data Json File Path
src/data/data.json
Data for Json File
[
{
"site": "npr",
"link": "http://www.npr.org/rss/rss.php?id=1001",
"type": "rss"
},
{
"site": "cnn",
"link": "http://rss.cnn.com/rss/cnn_world.rss",
"type": "rss"
},
{
"site": "foxnews",
"link": "http://feeds.foxnews.com/foxnews/world?format=xml",
"type": "rss"
},
{
"site": "nbcnews",
"link": "http://feeds.nbcnews.com/feeds/topstories",
"type": "rss"
}
]
Main Method File Path
src/main/main.go
Code for Main Method
package main
import (
"fmt"
"search"
)
func main() {
fmt.Println("HellAo")
var feeds = search.RetrieveFeeds()
fmt.Printf("%v",feeds)
}
Search File Path
src/search/feed.go
Search File Code
package search
import (
"encoding/json"
"os"
)
const dataFile = "data/data.json"
type Feed struct {
Name string `json:"site"`
URI string `json:"link"`
Type string `json:"type"`
}
func RetrieveFeeds() ([]*Feed, error){
file, err := os.Open(dataFile)
if err != nil {
return nil, err
}
defer file.Close()
var feeds []*Feed
err = json.NewDecoder(file).Decode(&feeds)
return feeds, err
}
UPDATE
I changed the data JSON file path to:
const dataFile = "src/data/data.json"
And now the debug dump says:
<nil>
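A minimal sketch, assuming the rest of the setup stays the same, of how main.go could receive both return values from RetrieveFeeds (the "multiple-value ... in single-value context" error comes from assigning a two-value return to a single variable). Checking the error here would also explain an empty or <nil> dump when the file path does not resolve from the working directory; log.Fatal and the dereferencing loop are just one way to do it:
package main

import (
	"fmt"
	"log"

	"search"
)

func main() {
	// RetrieveFeeds returns two values ([]*Feed, error), so both must be received.
	feeds, err := search.RetrieveFeeds()
	if err != nil {
		// Surfacing the error also explains an empty or <nil> dump,
		// e.g. when dataFile does not resolve from the working directory.
		log.Fatal(err)
	}
	// feeds is a slice of pointers; dereference to print the struct fields.
	for _, f := range feeds {
		fmt.Printf("%+v\n", *f)
	}
}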
EDIT
This is the working code in case someone finds it useful. The title of this question was originally "How to parse a list fo dicts in golang". That title was incorrect because I was referencing terms I'm familiar with from Python.
package main
import (
"encoding/json"
"fmt"
"io/ioutil"
"log"
"net/http"
)
// Region struct
type Region []struct {
Region string `json:"region"`
Description string `json:"Description"`
ID int `json:"Id"`
Name string `json:"Name"`
Status int `json:"Status"`
Nodes []struct {
NodeID int `json:"NodeId"`
Code string `json:"Code"`
Continent string `json:"Continent"`
City string `json:"City"`
} `json:"Nodes"`
}
//working request and response
func main() {
url := "https://api.geo.com"
// Create a Bearer string by appending string access token
var bearer = "TOK:" + "TOKEN"
// Create a new request using http
req, err := http.NewRequest("GET", url, nil)
// add authorization header to the req
req.Header.Add("Authorization", bearer)
//This is what the response from the API looks like
//regionJson := `[{"region":"GEO:ABC","Description":"ABCLand","Id":1,"Name":"ABCLand [GEO-ABC]","Status":1,"Nodes":[{"NodeId":17,"Code":"LAX","Continent":"North America","City":"Los Angeles"},{"NodeId":18,"Code":"LBC","Continent":"North America","City":"Long Beach"}]},{"region":"GEO:DEF","Description":"DEFLand","Id":2,"Name":"DEFLand","Status":1,"Nodes":[{"NodeId":15,"Code":"NRT","Continent":"Asia","City":"Narita"},{"NodeId":31,"Code":"TYO","Continent":"Asia","City":"Tokyo"}]}]`
//Send req using http Client
client := &http.Client{}
resp, err := client.Do(req)
if err != nil {
log.Println("Error on response.\n[ERROR] -", err)
}
defer resp.Body.Close()
body, err := ioutil.ReadAll(resp.Body)
if err != nil {
log.Println("Error while reading the response bytes:", err)
}
var regions Region
if err := json.Unmarshal(body, &regions); err != nil {
	log.Println("Error while unmarshaling the response:", err)
}
fmt.Printf("Regions: %+v", regions)
}
Have a look at this playground example for some pointers.
Here's the code:
package main
import (
"encoding/json"
"log"
)
func main() {
b := []byte(`
[
{"key": "value", "key2": "value2"},
{"key": "value", "key2": "value2"}
]`)
var mm []map[string]string
if err := json.Unmarshal(b, &mm); err != nil {
log.Fatal(err)
}
for _, m := range mm {
for k, v := range m {
log.Printf("%s [%s]", k, v)
}
}
}
I reformatted the API response you included because it is not valid JSON.
In Go it's necessary to define types to match the JSON schema.
I don't know why the API appends % to the end of the result so I've ignored that. If it is included, you will need to trim the results from the file before unmarshaling.
What you get from the unmarshaling is a slice of maps. Then, you can iterate over the slice to get each map and then iterate over each map to extract the keys and values.
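For the trimming mentioned above, here is a minimal sketch; the trailing % payload is an assumption based on the description, and bytes.TrimSuffix simply drops it if present:
package main

import (
	"bytes"
	"encoding/json"
	"log"
)

func main() {
	// Hypothetical raw output with the stray '%' described above.
	raw := []byte(`[{"key": "value", "key2": "value2"}]` + "%")

	// Drop trailing whitespace and the '%' suffix before unmarshaling.
	cleaned := bytes.TrimSuffix(bytes.TrimSpace(raw), []byte("%"))

	var mm []map[string]string
	if err := json.Unmarshal(cleaned, &mm); err != nil {
		log.Fatal(err)
	}
	log.Printf("%+v", mm)
}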
Update
In your updated question you include a different JSON schema, and this change must be reflected in the Go code by updating the types. There are some other errors in your code. Per my comment, I encourage you to spend some time learning the language.
package main
import (
"bytes"
"encoding/json"
"io/ioutil"
"log"
)
// Response is a type that represents the API response
type Response []Record
// Record is a type that represents the individual records
// The name Record is arbitrary as it is unnamed in the response
// Golang supports struct tags to map the JSON properties
// e.g. JSON "region" maps to a Golang field "Region"
type Record struct {
Region string `json:"region"`
Description string `json:"description"`
ID int `json:"id"`
Nodes []Node
}
type Node struct {
NodeID int    `json:"NodeId"`
Code string `json:"Code"`
}
func main() {
// A slice of byte representing your example response
b := []byte(`[{
"region": "GEO:ABC",
"Description": "ABCLand",
"Id": 1,
"Name": "ABCLand [GEO-ABC]",
"Status": 1,
"Nodes": [{
"NodeId": 17,
"Code": "LAX",
"Continent": "North America",
"City": "Los Angeles"
}, {
"NodeId": 18,
"Code": "LBC",
"Continent": "North America",
"City": "Long Beach"
}]
}, {
"region": "GEO:DEF",
"Description": "DEFLand",
"Id": 2,
"Name": "DEFLand",
"Status": 1,
"Nodes": [{
"NodeId": 15,
"Code": "NRT",
"Continent": "Asia",
"City": "Narita"
}, {
"NodeId": 31,
"Code": "TYO",
"Continent": "Asia",
"City": "Tokyo"
}]
}]`)
// To more closely match your code, create a Reader
rdr := bytes.NewReader(b)
// This matches your code, read from the Reader
body, err := ioutil.ReadAll(rdr)
if err != nil {
// Use Printf to format strings
log.Printf("Error while reading the response bytes\n%s", err)
}
// Initialize a variable of type Response
resp := &Response{}
// Try unmarshaling the body into it
if err := json.Unmarshal(body, resp); err != nil {
log.Fatal(err)
}
// Print the result
log.Printf("%+v", resp)
}
Similar to this question: How to extract schema for avro file in python
Is there a way to read in an avro file in golang without knowing the schema beforehand and extract a schema?
How about something like this (adapted code from https://github.com/hamba/avro/blob/master/ocf/ocf.go):
package main
import (
"github.com/hamba/avro"
"log"
"os"
)
// HeaderSchema is the Avro schema of a container file header.
var HeaderSchema = avro.MustParse(`{
"type": "record",
"name": "org.apache.avro.file.Header",
"fields": [
{"name": "magic", "type": {"type": "fixed", "name": "Magic", "size": 4}},
{"name": "meta", "type": {"type": "map", "values": "bytes"}},
{"name": "sync", "type": {"type": "fixed", "name": "Sync", "size": 16}}
]
}`)
var magicBytes = [4]byte{'O', 'b', 'j', 1}
const (
schemaKey = "avro.schema"
)
// Header represents an Avro container file header.
type Header struct {
Magic [4]byte `avro:"magic"`
Meta map[string][]byte `avro:"meta"`
Sync [16]byte `avro:"sync"`
}
func main() {
r, err := os.Open("path/my.avro")
if err != nil {
log.Fatal(err)
}
defer r.Close()
reader := avro.NewReader(r, 1024)
var h Header
reader.ReadVal(HeaderSchema, &h)
if reader.Error != nil {
log.Println("decoder: unexpected error: %v", reader.Error)
}
if h.Magic != magicBytes {
log.Println("decoder: invalid avro file")
}
schema, err := avro.Parse(string(h.Meta[schemaKey]))
if err != nil {
log.Println(err)
}
log.Println(schema)
}
Both https://github.com/hamba/avro and https://github.com/linkedin/goavro can decode Avro OCF files (which it sounds like is what you have) without an explicit schema file.
Once you've created a new reader/decoder, you can retrieve the metadata, which includes the schema at key avro.schema: https://pkg.go.dev/github.com/hamba/avro/ocf#Decoder.Metadata and https://pkg.go.dev/github.com/linkedin/goavro#OCFReader.MetaData
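As a rough sketch of the goavro route (assuming the github.com/linkedin/goavro/v2 import path and a local file path like the one used above; treat this as a starting point rather than a verified snippet):
package main

import (
	"log"
	"os"

	"github.com/linkedin/goavro/v2"
)

func main() {
	f, err := os.Open("path/my.avro") // same hypothetical path as in the example above
	if err != nil {
		log.Fatal(err)
	}
	defer f.Close()

	// NewOCFReader parses the container file header, including its metadata.
	ocf, err := goavro.NewOCFReader(f)
	if err != nil {
		log.Fatal(err)
	}

	// The writer schema lives in the header metadata under "avro.schema".
	log.Printf("schema: %s", ocf.MetaData()["avro.schema"])
}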
I am trying to unmarshal a JSON object which has an optional array. I am doing this without a struct, and this is what I have so far:
package main
import (
"encoding/json"
"fmt"
)
func main() {
jo := `
{
"given_name": "Akshay Raj",
"name": "Akshay",
"country": "New Zealand",
"family_name": "Gollahalli",
"emails": [
"name#example.com"
]
}
`
var raw map[string]interface{}
err := json.Unmarshal([]byte(jo), &raw)
if err != nil {
panic(err)
}
fmt.Println(raw["emails"][0])
}
The emails field may or may not be present. I know I could use a struct and unmarshal twice, once with and once without the array. When I try to get index 0 via raw["emails"][0] I get the following error:
invalid operation: raw["emails"][0] (type interface {} does not support indexing)
Is there a way to index into the emails field?
Update 1
I can do something like this fmt.Println(raw["emails"].([]interface{})[0]) and it works. Is this the only way?
The easiest way is with a struct. There's no need to unmarshal twice.
type MyStruct struct {
// ... other fields
Emails []string `json:"emails"`
}
This will work, regardless of whether the JSON input contains the emails field. When it is missing, your resulting struct will just have an uninitialized Emails field.
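A minimal sketch of that approach against the JSON from the question (the field names other than Emails are assumptions for illustration):
package main

import (
	"encoding/json"
	"fmt"
)

type MyStruct struct {
	GivenName string   `json:"given_name"`
	Name      string   `json:"name"`
	Emails    []string `json:"emails"`
}

func main() {
	jo := `{"given_name": "Akshay Raj", "name": "Akshay", "emails": ["name#example.com"]}`

	var s MyStruct
	if err := json.Unmarshal([]byte(jo), &s); err != nil {
		panic(err)
	}

	// If "emails" is absent from the JSON, s.Emails is simply nil.
	if len(s.Emails) > 0 {
		fmt.Println(s.Emails[0])
	} else {
		fmt.Println("no emails in payload")
	}
}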
You can use type assertions. The Go tutorial on type assertions is here.
A Go playground link applying type assertions to your problem is here. For ease of reading, that code is replicated below:
package main
import (
"encoding/json"
"fmt"
)
func main() {
jo := `
{
"given_name": "Akshay Raj",
"name": "Akshay",
"country": "New Zealand",
"family_name": "Gollahalli",
"emails": [
"name#example.com"
]
}
`
var raw map[string]interface{}
err := json.Unmarshal([]byte(jo), &raw)
if err != nil {
panic(err)
}
emails, ok := raw["emails"]
if !ok {
panic("do this when no 'emails' key")
}
emailsSlice, ok := emails.([]interface{})
if !ok {
panic("do this when 'emails' value is not a slice")
}
if len(emailsSlice) == 0 {
panic("do this when 'emails' slice is empty")
}
email, ok := (emailsSlice[0]).(string)
if !ok {
panic("do this when 'emails' slice contains non-string")
}
fmt.Println(email)
}
As always, you can use third-party libraries to work with your JSON data. For example, with the gojsonq package it would look like this:
package main
import (
"fmt"
"github.com/thedevsaddam/gojsonq"
)
func main() {
json := `
{
"given_name": "Akshay Raj",
"name": "Akshay",
"country": "New Zealand",
"family_name": "Gollahalli",
"emails": [
"name#example.com"
]
}
`
first := gojsonq.New().JSONString(json).Find("emails.[0]")
if first != nil {
fmt.Println(first.(string))
} else {
fmt.Println("There isn't emails")
}
}
I am trying to build an aggregation service for all of the third-party APIs that I use.
This aggregation service takes JSON values coming from my main system, puts each value under the key expected by the third-party API, and then sends a request to the third-party API in the new JSON format.
example-1:
package main
import (
"encoding/json"
"fmt"
"log"
"github.com/tidwall/gjson"
)
func main() {
// mapping JSON
mapB := []byte(`
{
"date": "createdAt",
"clientName": "data.user.name"
}
`)
// from my main system
dataB := []byte(`
{
"createdAt": "2017-05-17T08:52:36.024Z",
"data": {
"user": {
"name": "xxx"
}
}
}
`)
mapJSON := make(map[string]interface{})
dataJSON := make(map[string]interface{})
newJSON := make(map[string]interface{})
err := json.Unmarshal(mapB, &mapJSON)
if err != nil {
log.Panic(err)
}
err = json.Unmarshal(dataB, &dataJSON)
if err != nil {
log.Panic(err)
}
for i := range mapJSON {
r := gjson.GetBytes(dataB, mapJSON[i].(string))
newJSON[i] = r.Value()
}
newB, err := json.MarshalIndent(newJSON, "", " ")
if err != nil {
log.Println(err)
}
fmt.Println(string(newB))
}
output:
{
"clientName": "xxx",
"date": "2017-05-17T08:52:36.024Z"
}
I use the gjson package to get values from my main system's request in a simple way from a JSON document.
example-2:
package main
import (
"encoding/json"
"fmt"
"log"
"github.com/tidwall/gjson"
)
func main() {
// mapping JSON
mapB := []byte(`
{
"date": "createdAt",
"clientName": "data.user.name",
"server":{
"google":{
"date" :"createdAt"
}
}
}
`)
// from my main system
dataB := []byte(`
{
"createdAt": "2017-05-17T08:52:36.024Z",
"data": {
"user": {
"name": "xxx"
}
}
}
`)
mapJSON := make(map[string]interface{})
dataJSON := make(map[string]interface{})
newJSON := make(map[string]interface{})
err := json.Unmarshal(mapB, &mapJSON)
if err != nil {
log.Panic(err)
}
err = json.Unmarshal(dataB, &dataJSON)
if err != nil {
log.Panic(err)
}
for i := range mapJSON {
r := gjson.GetBytes(dataB, mapJSON[i].(string))
newJSON[i] = r.Value()
}
newB, err := json.MarshalIndent(newJSON, "", " ")
if err != nil {
log.Println(err)
}
fmt.Println(string(newB))
}
output:
panic: interface conversion: interface {} is map[string]interface {}, not string
I can handle this error by using type assertions (https://golang.org/ref/spec#Type_assertions), but what if the JSON object contains an array, and inside that array there are more JSON objects, and so on?
My problem is that I have different APIs, and every API has its own JSON schema. My way of mapping JSON only works if the third-party API expects flat key/value pairs, without nested objects or arrays of objects.
Is there a way to map a complex JSON schema, or a Go package to help me do that?
EDIT:
After the comment interaction and your updated question, before we move forward I would like to mention one thing: I just looked at your example-2. Remember that mapping goes from one form to another, basically from a known format to a targeted format, and each data type has to be handled. You cannot logically do generic-to-generic mapping (it is technically feasible, but it would take more time and effort; feel free to experiment with it).
I have created a sample working program of one approach; it maps a source format to a targeted format. Use this program as a starting point and adapt it to your own needs.
Playground link: https://play.golang.org/p/MEk_nGcPjZ
Explanation: the sample program maps two different source formats to one target format. The program consists of:
Targeted Mapping definition of Provider 1
Targeted Mapping definition of Provider 2
Provider 1 JSON
Provider 2 JSON
Mapping function
Targeted JSON marshal
Key elements from the program (refer to the playground link for the complete program):
type MappingInfo struct {
TargetKey string
SourceKeyPath string
DataType string
}
Map function:
func mapIt(mapping []*MappingInfo, parsedResult gjson.Result) map[string]interface{} {
mappedData := make(map[string]interface{})
for _, m := range mapping {
switch m.DataType {
case "time":
mappedData[m.TargetKey] = parsedResult.Get(m.SourceKeyPath).Time()
case "string":
mappedData[m.TargetKey] = parsedResult.Get(m.SourceKeyPath).String()
}
}
return mappedData
}
Output:
Provider 1 Result: map[date:2017-05-17 08:52:36.024 +0000 UTC clientName:provider1 username]
Provider 1 JSON: {
"clientName": "provider1 username",
"date": "2017-05-17T08:52:36.024Z"
}
Provider 2 Result: map[date:2017-05-12 06:32:46.014 +0000 UTC clientName:provider2 username]
Provider 2 JSON: {
"clientName": "provider2 username",
"date": "2017-05-12T06:32:46.014Z"
}
Good luck, happy coding!
Typically, converting/transforming one structure into another is something you will have to handle with application logic.
As you mentioned in the question:
my problem is I have different apis, every api have own json schema
This is true for every aggregation system.
One approach to handle this requirement effectively is to keep a mapping of keys between each provider's JSON structure and the targeted JSON structure.
For example (this is just one approach; go with whatever design you see fit):
JSON structures from various provider:
// Provider 1 : JSON structure
{
"createdAt": "2017-05-17T08:52:36.024Z",
"data": {
"user": {
"name": "xxx"
}
}
}
// Provider 2 : JSON structure
{
"username": "yyy",
"since": "2017-05-17T08:52:36.024Z"
}
Mapping for target JSON structure:
jsonMappingByProvider := make(map[string]string)
// Targeted Mapping for Provider 1
jsonMappingByProvider["provider1"] = `
{
"date": "createdAt",
"clientName": "data.user.name"
}
`
// Targeted Mapping for Provider 2
jsonMappingByProvider["provider2"] = `
{
"date": "since",
"clientName": "username"
}
`
Now, based on the provider you're handling, get the mapping and map the response JSON into the targeted structure.
// get the mapping info by provider
mapping := jsonMappingByProvider["provider1"]
// Parse the response JSON
// Do the mapping
This way you can control each provider and its mapping effectively.
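To make the "parse and map" step above concrete, here is one possible sketch reusing gjson from the question. The flat string-to-string mapping is an assumption; nested mapping values (like the "server" object in example-2) would need the typed approach from the other answer:
package main

import (
	"encoding/json"
	"fmt"
	"log"

	"github.com/tidwall/gjson"
)

func main() {
	jsonMappingByProvider := map[string]string{
		"provider1": `{"date": "createdAt", "clientName": "data.user.name"}`,
		"provider2": `{"date": "since", "clientName": "username"}`,
	}

	// Response from provider 1, as in the question.
	body := []byte(`{"createdAt": "2017-05-17T08:52:36.024Z", "data": {"user": {"name": "xxx"}}}`)

	// Get the mapping info by provider.
	var mapping map[string]string
	if err := json.Unmarshal([]byte(jsonMappingByProvider["provider1"]), &mapping); err != nil {
		log.Panic(err)
	}

	// Do the mapping: each target key gets the value found at the gjson path.
	target := make(map[string]interface{})
	for targetKey, sourcePath := range mapping {
		target[targetKey] = gjson.GetBytes(body, sourcePath).Value()
	}

	out, err := json.MarshalIndent(target, "", "  ")
	if err != nil {
		log.Println(err)
	}
	fmt.Println(string(out))
}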
Given the following string from a MediaWiki API request:
str = ` {
"query": {
"pages": {
"66984": {
"pageid": 66984,
"ns": 0,
"title": "Main Page",
"touched": "2012-11-23T06:44:22Z",
"lastrevid": 1347044,
"counter": "",
"length": 28,
"redirect": "",
"starttimestamp": "2012-12-15T05:21:21Z",
"edittoken": "bd7d4a61cc4ce6489e68c21259e6e416+\\"
}
}
}
}`
What can be done to get the edittoken, using Go's json package (keep in mind the 66984 number will continually change)?
When you have a changing key like this, the best way to deal with it is with a map. In the example below I've used structs up until the point we reach the changing key, and switched to a map after that. A working example is linked as well.
http://play.golang.org/p/ny0kyafgYO
package main
import (
"fmt"
"encoding/json"
)
type query struct {
Query struct {
Pages map[string]interface{}
}
}
func main() {
str := `{"query":{"pages":{"66984":{"pageid":66984,"ns":0,"title":"Main Page","touched":"2012-11-23T06:44:22Z","lastrevid":1347044,"counter":"","length":28,"redirect":"","starttimestamp":"2012-12-15T05:21:21Z","edittoken":"bd7d4a61cc4ce6489e68c21259e6e416+\\"}}}}`
q := query{}
err := json.Unmarshal([]byte(str), &q)
if err!=nil {
panic(err)
}
for _, p := range q.Query.Pages {
fmt.Printf("edittoken = %s\n", p.(map[string]interface{})["edittoken"].(string))
}
}
Note that if you use the &indexpageids=true parameter in the API request URL, the result will contain a "pageids" array, like so:
str = ` {
"query": {
"pageids": [
"66984"
],
"pages": {
"66984": {
"pageid": 66984,
"ns": 0,
"title": "Main Page",
"touched": "2012-11-23T06:44:22Z",
"lastrevid": 1347044,
"counter": "",
"length": 28,
"redirect": "",
"starttimestamp": "2012-12-15T05:21:21Z",
"edittoken": "bd7d4a61cc4ce6489e68c21259e6e416+\\"
}
}
}
}`
so you can use pageids[0] to access the continually changing number, which will likely make things easier.
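For instance, a sketch along these lines (the struct shape below is an assumption matching the response above, not MediaWiki's documented schema):
package main

import (
	"encoding/json"
	"fmt"
)

type queryWithIDs struct {
	Query struct {
		PageIDs []string                          `json:"pageids"`
		Pages   map[string]map[string]interface{} `json:"pages"`
	} `json:"query"`
}

func main() {
	str := `{"query":{"pageids":["66984"],"pages":{"66984":{"pageid":66984,"edittoken":"bd7d4a61cc4ce6489e68c21259e6e416+\\"}}}}`

	var q queryWithIDs
	if err := json.Unmarshal([]byte(str), &q); err != nil {
		panic(err)
	}

	// pageids[0] tells us which key to look up in the pages map.
	page := q.Query.Pages[q.Query.PageIDs[0]]
	fmt.Println("edittoken =", page["edittoken"])
}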