Decode JSON to map[string]map[string]string in Go

I have a map[string]map[string]string that I'd like to be able to convert to JSON and write to a file, and be able to read the data back in from the file.
I've been able to successfully write to the file using the following:
func (l *Locker) Save(filename string) error {
    file, err := os.Create(filename)
    if err != nil {
        return err
    }
    defer file.Close()

    encoder := json.NewEncoder(file)
    // l.data is of type map[string]map[string]string
    return encoder.Encode(l.data)
}
I'm having trouble loading the JSON back into the map. I've tried the following:
func (l *Locker) Load(filename string) error {
    file, err := os.Open(filename)
    if err != nil {
        return err
    }
    defer file.Close()

    decoder := json.NewDecoder(file)
    return decoder.Decode(l.data)
}
Loading a JSON file with contents {"bar":{"hello":"world"},"foo":{"bar":"new","baz":"extra"}} and running the above leaves l.data as just map[]. How can I successfully decode this JSON into l.data?

If you use json.Unmarshal() instead, you can pass it a data structure to populate:
package main

import (
    "encoding/json"
    "fmt"
)

func main() {
    srcJSON := []byte(`{"bar":{"hello":"world"},"foo":{"bar":"new","baz":"extra"}}`)
    d := map[string]map[string]string{}
    if err := json.Unmarshal(srcJSON, &d); err != nil {
        panic(err)
    }

    // Print out the structure.
    for k, v := range d {
        fmt.Printf("%s\n", k)
        for k2, v2 := range v {
            fmt.Printf("\t%s: %s\n", k2, v2)
        }
    }
}
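For completeness: the original Load is one character away from working. Decode, like Unmarshal, must be given a pointer to populate; passing the map value itself makes it return an InvalidUnmarshalError and leaves the map untouched. A minimal fix to the Load method from the question:
func (l *Locker) Load(filename string) error {
    file, err := os.Open(filename)
    if err != nil {
        return err
    }
    defer file.Close()

    decoder := json.NewDecoder(file)
    // Pass a pointer so Decode can populate the map.
    return decoder.Decode(&l.data)
}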

Related

Zip a slice of byte into another slice in Golang

I want to achieve exactly the opposite of the solution given here, zipping a slice of bytes into another slice of bytes -
Convert zipped []byte to unzip []byte golang code
Something like -
func ZipBytes(unzippedBytes []byte) ([]byte, error) {
    // ...
}
[I am going to upload that zipped file as multipart form data for a POST request]
You can compress directly into memory using a bytes.Buffer.
The following example uses compress/zlib since it is the opposite of the example given in the question. Depending on your use case you could easily change it to compress/gzip as well (very similar APIs).
package data_test

import (
    "bytes"
    "compress/zlib"
    "io"
    "testing"
)

func compress(buf []byte) ([]byte, error) {
    var out bytes.Buffer
    w := zlib.NewWriter(&out)
    if _, err := w.Write(buf); err != nil {
        return nil, err
    }
    // Close flushes any pending compressed data and writes the checksum.
    if err := w.Close(); err != nil {
        return nil, err
    }
    return out.Bytes(), nil
}

func decompress(buf []byte) (_ []byte, e error) {
    r, err := zlib.NewReader(bytes.NewReader(buf))
    if err != nil {
        return nil, err
    }
    defer func() {
        // Propagate the Close error unless something else already failed.
        if err := r.Close(); err != nil && e == nil {
            e = err
        }
    }()
    return io.ReadAll(r)
}

func TestRoundtrip(t *testing.T) {
    want := []byte("test data")

    zdata, err := compress(want)
    if err != nil {
        t.Fatalf("compress: %v", err)
    }

    got, err := decompress(zdata)
    if err != nil {
        t.Fatalf("decompress: %v", err)
    }

    if !bytes.Equal(want, got) {
        t.Errorf("roundtrip: got = %q; want = %q", got, want)
    }
}
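Since the question mentions uploading the result as multipart form data, here is a hedged sketch of the gzip variant; compressGzip is my name for it, and only the writer constructor differs from compress above:
package main

import (
    "bytes"
    "compress/gzip"
    "fmt"
)

// compressGzip mirrors compress above, swapping zlib for gzip.
func compressGzip(buf []byte) ([]byte, error) {
    var out bytes.Buffer
    w := gzip.NewWriter(&out)
    if _, err := w.Write(buf); err != nil {
        return nil, err
    }
    // Close flushes remaining data and writes the gzip footer.
    if err := w.Close(); err != nil {
        return nil, err
    }
    return out.Bytes(), nil
}

func main() {
    zdata, err := compressGzip([]byte("test data"))
    if err != nil {
        panic(err)
    }
    fmt.Printf("compressed to %d bytes\n", len(zdata))
}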

Generate .pb file without using protoc in golang

I'm trying to generate a .pb.go file in Go, using service.proto as the input file.
Is there a way to do it without using protoc binary (like directly using package github.com/golang/protobuf/protoc-gen-go)?
If you have a detail.proto like this:
message AppDetails {
    optional string version = 4;
}
You can parse it into a message like this:
package main

import (
    "fmt"

    "github.com/golang/protobuf/proto"
    "github.com/jhump/protoreflect/desc/protoparse"
    "github.com/jhump/protoreflect/dynamic"
)

func parse(file, msg string) (*dynamic.Message, error) {
    var p protoparse.Parser
    fd, err := p.ParseFiles(file)
    if err != nil {
        return nil, err
    }
    md := fd[0].FindMessage(msg)
    return dynamic.NewMessage(md), nil
}

func main() {
    // Raw wire format: 0x22 ('"') is the tag for field 4 with wire type 2
    // (length-delimited), 0x0b ('\v') is the length 11, then the payload.
    b := []byte("\"\vhello world")

    m, err := parse("detail.proto", "AppDetails")
    if err != nil {
        panic(err)
    }
    if err := proto.Unmarshal(b, m); err != nil {
        panic(err)
    }
    fmt.Println(m) // version:"hello world"
}
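Going the other direction (building a message and serializing it) uses the same packages. A hedged sketch, assuming protoreflect's dynamic API (TrySetFieldByName is its error-returning setter) and an example value of my choosing:
func main() {
    m, err := parse("detail.proto", "AppDetails")
    if err != nil {
        panic(err)
    }
    // Set the field by name, then serialize to raw wire bytes;
    // no generated .pb.go is involved at any point.
    if err := m.TrySetFieldByName("version", "v1.2.3"); err != nil {
        panic(err)
    }
    out, err := proto.Marshal(m)
    if err != nil {
        panic(err)
    }
    fmt.Printf("% x\n", out) // 22 06 76 31 2e 32 2e 33
}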
However, you may notice that this package still uses the old Protobuf V1 API. I did find a pull request for V2 support:
https://github.com/jhump/protoreflect/pull/354

How to convert go type Cookie to type string

In Go, I have a function:
func UrlGET(url string, headers string) string { // inputs are the url and headers for an HTTP request
    ...
    req, err := http.NewRequest("GET", url, nil)
    ...
    resp, err := client.Do(req)
    defer resp.Body.Close()

    if rc := resp.Cookies(); len(rc) > 0 {
        return string(rc)
    }
    return ""
}
However, you cannot convert []*http.Cookie to string; the compiler reports cannot convert rc (type []*http.Cookie) to type string. What would be an alternative or another way to convert it to a string? Ideally I would still return type string. I'm relatively new to Go, so I'm at a bit of a wall as to what else to try.
Ideally, it would return something like cookie=some_cookie_value as a string.
If you just want one big string, you can read the Set-Cookie header directly (note that Header.Get returns only the first Set-Cookie header if the response carries several):
package main

import "net/http"

func main() {
    r, e := http.Get("https://stackoverflow.com")
    if e != nil {
        panic(e)
    }
    defer r.Body.Close()

    s := r.Header.Get("Set-Cookie")
    println(s)
}
Or you could build a map:
package main

import (
    "fmt"
    "net/http"
)

func main() {
    r, e := http.Get("https://stackoverflow.com")
    if e != nil {
        panic(e)
    }
    defer r.Body.Close()

    m := make(map[string]string)
    for _, c := range r.Cookies() {
        m[c.Name] = c.Value
    }
    fmt.Println(m)
}
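And since the question asks for a cookie=some_cookie_value style string specifically, you can join the name/value pairs yourself. A small sketch (the "; " separator matches how a Cookie request header joins pairs):
package main

import (
    "fmt"
    "net/http"
    "strings"
)

func main() {
    r, e := http.Get("https://stackoverflow.com")
    if e != nil {
        panic(e)
    }
    defer r.Body.Close()

    // Collect every cookie as name=value, then join: "a=1; b=2".
    var parts []string
    for _, c := range r.Cookies() {
        parts = append(parts, c.Name+"="+c.Value)
    }
    fmt.Println(strings.Join(parts, "; "))
}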
https://golang.org/pkg/net/http#Response.Cookies
https://golang.org/pkg/net/http#Response.Header

Read and merge two Yaml files in go language

Assuming we have two YAML files:
master.yaml
someProperty: "someValue"
anotherProperty: "anotherValue"
override.yaml
someProperty: "overriddenValue"
Is it possible to unmarshal, merge, and then write those changes to a file without having to define a struct for every property in the YAML file?
The master file has over 500 properties that are not at all important to the service at this point of execution, so ideally I'd be able to just unmarshal into a map, do a merge, and write it out as YAML again, but I'm relatively new to Go and wanted some opinions.
I've got some code to read the YAML into an interface, but I'm unsure of the best approach to merging the two.
var masterYaml interface{}
yamlBytes, _ := ioutil.ReadFile("master.yaml")
yaml.Unmarshal(yamlBytes, &masterYaml)
var overrideYaml interface{}
yamlBytes, _ = ioutil.ReadFile("override.yaml")
yaml.Unmarshal(yamlBytes, &overrideYaml)
I've looked into libraries like mergo, but I'm not sure if that's the right approach.
I'm hoping that after the merge I would be able to write out a file with the properties
someProperty: "overriddenValue"
anotherProperty: "anotherValue"
Assuming that you just want to merge at the top level, you can unmarshal into maps of type map[string]interface{}, as follows:
package main

import (
    "io/ioutil"

    "gopkg.in/yaml.v2"
)

func main() {
    var master map[string]interface{}
    bs, err := ioutil.ReadFile("master.yaml")
    if err != nil {
        panic(err)
    }
    if err := yaml.Unmarshal(bs, &master); err != nil {
        panic(err)
    }

    var override map[string]interface{}
    bs, err = ioutil.ReadFile("override.yaml")
    if err != nil {
        panic(err)
    }
    if err := yaml.Unmarshal(bs, &override); err != nil {
        panic(err)
    }

    // Merge: values from override win at the top level.
    for k, v := range override {
        master[k] = v
    }

    bs, err = yaml.Marshal(master)
    if err != nil {
        panic(err)
    }
    if err := ioutil.WriteFile("merged.yaml", bs, 0644); err != nil {
        panic(err)
    }
}
For a broader solution (with n input files), you can use the following function, based on @robox's answer:
func ReadValues(filenames ...string) (string, error) {
    if len(filenames) == 0 {
        return "", errors.New("you must provide at least one filename for reading values")
    }
    var resultValues map[string]interface{}
    for _, filename := range filenames {
        var override map[string]interface{}
        bs, err := ioutil.ReadFile(filename)
        if err != nil {
            log.Info(err)
            continue
        }
        if err := yaml.Unmarshal(bs, &override); err != nil {
            log.Info(err)
            continue
        }

        // Nil check: this only triggers for the first successfully read file.
        if resultValues == nil {
            resultValues = override
        } else {
            for k, v := range override {
                resultValues[k] = v
            }
        }
    }
    bs, err := yaml.Marshal(resultValues)
    if err != nil {
        log.Info(err)
        return "", err
    }
    return string(bs), nil
}
So for this example you would call it in this order:
result, _ := ReadValues("master.yaml", "override.yaml")
If you had an extra file newFile.yaml, you could call:
result, _ := ReadValues("master.yaml", "override.yaml", "newFile.yaml")
DEEP MERGE TWO YAML FILES
package main

import (
    "fmt"
    "io/ioutil"

    "sigs.k8s.io/yaml"
)

func main() {
    // Declare two maps to hold the YAML content.
    base := map[string]interface{}{}
    currentMap := map[string]interface{}{}

    // Read one YAML file.
    data, err := ioutil.ReadFile("conf.yaml")
    if err != nil {
        panic(err)
    }
    if err := yaml.Unmarshal(data, &base); err != nil {
        panic(err)
    }

    // Read another YAML file.
    data1, err := ioutil.ReadFile("conf1.yaml")
    if err != nil {
        panic(err)
    }
    if err := yaml.Unmarshal(data1, &currentMap); err != nil {
        panic(err)
    }

    // Merge both YAML documents recursively.
    base = mergeMaps(base, currentMap)

    // Print the merged map.
    fmt.Println(base)
}

func mergeMaps(a, b map[string]interface{}) map[string]interface{} {
    out := make(map[string]interface{}, len(a))
    for k, v := range a {
        out[k] = v
    }
    for k, v := range b {
        // If both sides hold a map under the same key, merge them recursively.
        if v, ok := v.(map[string]interface{}); ok {
            if bv, ok := out[k]; ok {
                if bv, ok := bv.(map[string]interface{}); ok {
                    out[k] = mergeMaps(bv, v)
                    continue
                }
            }
        }
        // Otherwise the value from b wins.
        out[k] = v
    }
    return out
}
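To see why the recursive merge matters, here is a small sketch using mergeMaps from above on hypothetical nested config data; the flat top-level merge from the first answer would have dropped "port":
// Hypothetical nested data: both files define the "db" section.
a := map[string]interface{}{
    "db": map[string]interface{}{"host": "localhost", "port": 5432},
}
b := map[string]interface{}{
    "db": map[string]interface{}{"host": "prod.example.com"},
}
fmt.Println(mergeMaps(a, b))
// map[db:map[host:prod.example.com port:5432]]; "port" survives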

Read text file into string array (and write)

The ability to read (and write) a text file into and out of a string array is, I believe, a fairly common requirement. It is also quite useful when starting with a language, since it removes the initial need to access a database. Does one exist in Go?
e.g.
func ReadLines(sFileName string, iMinLines int) ([]string, bool) {
and
func WriteLines(saBuff []string, sFilename string) bool {
I would prefer to use an existing one rather than duplicate.
As of the Go 1.1 release, there is a bufio.Scanner API that can easily read lines from a file. Consider the following example, which uses Scanner (it is a rewrite of the older ReadLine-based answers shown further down):
package main

import (
    "bufio"
    "fmt"
    "log"
    "os"
)

// readLines reads a whole file into memory
// and returns a slice of its lines.
func readLines(path string) ([]string, error) {
    file, err := os.Open(path)
    if err != nil {
        return nil, err
    }
    defer file.Close()

    var lines []string
    scanner := bufio.NewScanner(file)
    for scanner.Scan() {
        lines = append(lines, scanner.Text())
    }
    return lines, scanner.Err()
}

// writeLines writes the lines to the given file.
func writeLines(lines []string, path string) error {
    file, err := os.Create(path)
    if err != nil {
        return err
    }
    defer file.Close()

    w := bufio.NewWriter(file)
    for _, line := range lines {
        fmt.Fprintln(w, line)
    }
    return w.Flush()
}

func main() {
    lines, err := readLines("foo.in.txt")
    if err != nil {
        log.Fatalf("readLines: %s", err)
    }
    for i, line := range lines {
        fmt.Println(i, line)
    }
    if err := writeLines(lines, "foo.out.txt"); err != nil {
        log.Fatalf("writeLines: %s", err)
    }
}
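One caveat worth knowing: bufio.Scanner fails on lines longer than its maximum token size (bufio.MaxScanTokenSize, 64 KiB, by default). If your input may contain such lines, raise the limit before scanning:
scanner := bufio.NewScanner(file)
// Allow lines up to 1 MiB instead of the 64 KiB default.
scanner.Buffer(make([]byte, 0, 64*1024), 1024*1024)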
Note: ioutil is deprecated as of Go 1.16.
If the file isn't too large, this can be done with the ioutil.ReadFile and strings.Split functions like so:
content, err := ioutil.ReadFile(filename)
if err != nil {
    // Do something with the error
}
lines := strings.Split(string(content), "\n")
You can read the documentation on ioutil and strings packages.
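Given the deprecation note above: from Go 1.16 onward, os.ReadFile is the drop-in replacement, so the same approach becomes:
content, err := os.ReadFile(filename)
if err != nil {
    // Do something with the error
}
lines := strings.Split(string(content), "\n")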
I cannot update the first answer. Anyway, after the Go 1 release there were some breaking changes, so I updated the code as shown below:
package main

import (
    "bufio"
    "bytes"
    "fmt"
    "io"
    "os"
    "strings"
)

// readLines reads a whole file into memory and returns its lines.
func readLines(path string) (lines []string, err error) {
    var (
        file   *os.File
        part   []byte
        prefix bool
    )
    if file, err = os.Open(path); err != nil {
        return
    }
    defer file.Close()

    reader := bufio.NewReader(file)
    buffer := bytes.NewBuffer(make([]byte, 0))
    for {
        if part, prefix, err = reader.ReadLine(); err != nil {
            break
        }
        buffer.Write(part)
        // prefix is true while the line is longer than the reader's buffer;
        // only append once the full line has been collected.
        if !prefix {
            lines = append(lines, buffer.String())
            buffer.Reset()
        }
    }
    if err == io.EOF {
        err = nil
    }
    return
}

// writeLines writes the lines to the given file, one per line.
func writeLines(lines []string, path string) (err error) {
    var file *os.File
    if file, err = os.Create(path); err != nil {
        return
    }
    defer file.Close()

    for _, item := range lines {
        // Assign to the named return so the caller sees a write error.
        if _, err = file.WriteString(strings.TrimSpace(item) + "\n"); err != nil {
            fmt.Println(err)
            break
        }
    }
    return
}

func main() {
    lines, err := readLines("foo.txt")
    if err != nil {
        fmt.Printf("Error: %s\n", err)
        return
    }
    for _, line := range lines {
        fmt.Println(line)
    }
    err = writeLines(lines, "foo2.txt")
    fmt.Println(err)
}
You can use os.File (which implements the io.Reader interface) with the bufio package for that. Those packages are built with fixed memory usage in mind (no matter how large the file is) and are quite fast.
Unfortunately this makes reading the whole file into memory a bit more complicated. You can use a bytes.Buffer to join the parts of a line if they exceed the line limit. Anyway, I recommend you try to use the line reader directly in your project (especially if you do not know how large the text file is!). But if the file is small, the following example might be sufficient:
package main

import (
    "bufio"
    "bytes"
    "fmt"
    "io"
    "os"
)

// Read a whole file into memory and return it as a slice of lines.
func readLines(path string) (lines []string, err error) {
    var (
        file   *os.File
        part   []byte
        prefix bool
    )
    if file, err = os.Open(path); err != nil {
        return
    }
    defer file.Close()

    reader := bufio.NewReader(file)
    // Start with an empty buffer of capacity 1024; make([]byte, 1024)
    // would prepend 1024 zero bytes to the first line.
    buffer := bytes.NewBuffer(make([]byte, 0, 1024))
    for {
        if part, prefix, err = reader.ReadLine(); err != nil {
            break
        }
        buffer.Write(part)
        if !prefix {
            lines = append(lines, buffer.String())
            buffer.Reset()
        }
    }
    if err == io.EOF {
        err = nil
    }
    return
}

func main() {
    lines, err := readLines("foo.txt")
    if err != nil {
        fmt.Printf("Error: %s\n", err)
        return
    }
    for _, line := range lines {
        fmt.Println(line)
    }
}
Another alternative might be to use ioutil.ReadAll to read in the complete file at once and do the slicing by line afterwards. I don't give you an explicit example of how to write the lines back to the file, but that's basically an os.Create() followed by a loop similar to the one in the example (see main()).
func readToDisplayUsingFile1(f *os.File) []string {
    defer f.Close()
    reader := bufio.NewReader(f)
    contents, _ := ioutil.ReadAll(reader)
    // strings.Split takes a string separator ("\n"), not a rune ('\n').
    return strings.Split(string(contents), "\n")
}
or
func readToDisplayUsingFile2(f *os.File) []string {
    defer f.Close()
    slice := make([]string, 0)
    reader := bufio.NewReader(f)
    for {
        str, err := reader.ReadString('\n')
        // Keep a final line that has no trailing newline.
        if len(str) > 0 {
            slice = append(slice, str)
        }
        if err != nil { // io.EOF, or a genuine read error
            break
        }
    }
    return slice
}
