How to bundle an SQLite database in a Go binary?

I am trying to use go-bindata and packr, but those packages do not show how to pack an SQLite database file into a binary.
I don't need to update the database in any way, I just want to read the data from it on startup.
How can I embed an SQLite database file in a Go binary file?

The SQLite driver can't read a database file from memory (e.g. from a byte slice). But you can write the data to a temporary file, and open that:
//go:generate go run gen.go

package main

import (
	"database/sql"
	"fmt"
	"io/ioutil"
	"log"
	"os"

	_ "github.com/mattn/go-sqlite3"
)

func main() {
	// Create a temporary file for the database.
	tmpDB, err := ioutil.TempFile("", "db*.sqlite3")
	if err != nil {
		log.Fatal(err)
	}

	// Remove the file on exit.
	defer func() {
		err := os.Remove(tmpDB.Name())
		if err != nil {
			log.Print(err)
		}
	}()

	// Write the database to the file.
	_, err = tmpDB.Write(sqlDB)
	if err != nil {
		log.Print(err)
	}
	err = tmpDB.Close()
	if err != nil {
		log.Print(err)
	}

	// Open the DB.
	db, err := sql.Open("sqlite3", tmpDB.Name()+"?mode=ro")
	if err != nil {
		log.Fatal(err)
	}

	// Make sure it loaded correctly.
	rows, err := db.Query("select * from test")
	if err != nil {
		log.Fatal(err)
	}
	for rows.Next() {
		var c string
		err := rows.Scan(&c)
		if err != nil {
			log.Fatal(err)
		}
		fmt.Println(c)
	}
}
And you can write the database to db.go with something like:
// +build generate

package main

import (
	"fmt"
	"io/ioutil"
	"log"
	"os"
	"strings"
)

func main() {
	// Read source database file.
	d, err := ioutil.ReadFile("source.sqlite3")
	if err != nil {
		log.Fatal(err)
	}

	fp, err := os.Create("db.go")
	if err != nil {
		log.Fatal(err)
	}
	_, err = fmt.Fprintf(fp, "// Code generated by gen.go; DO NOT EDIT.\n\n"+
		"package main\n\n"+
		"var sqlDB = %s\n", asbyte(d))
	if err != nil {
		log.Fatal(err)
	}
}

// Write any data as a byte array.
func asbyte(s []byte) string {
	var b strings.Builder
	for i, c := range s {
		if i%19 == 0 {
			b.WriteString("\n\t\t")
		}
		b.WriteString(fmt.Sprintf("%#x, ", c))
	}
	return "[]byte{" + b.String() + "}"
}
You can also use go-bindata or packr for that if you prefer, but I don't really see an advantage.
An alternative is to use an in-memory database, which may be faster depending on what you want to do.
Embed the SQL schema and the rows you want in your Go binary as strings.
Open a new in-memory database when your program starts (sql.Open("sqlite3", ":memory:")), then create the schema and insert the rows.
There is no disk access with this method, so querying it will probably be a bit faster, at the expense of slower startup times (benchmark to be sure!).
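For illustration, here is a minimal sketch of that in-memory approach; the table name, schema, and rows are invented for the example:

package main

import (
	"database/sql"
	"fmt"
	"log"

	_ "github.com/mattn/go-sqlite3"
)

func main() {
	// Open an in-memory database; it lives only as long as this connection.
	db, err := sql.Open("sqlite3", ":memory:")
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	// Schema and rows embedded in the binary as string literals.
	if _, err := db.Exec(`create table test (col text)`); err != nil {
		log.Fatal(err)
	}
	if _, err := db.Exec(`insert into test (col) values ('hello'), ('world')`); err != nil {
		log.Fatal(err)
	}

	// Query it as usual.
	rows, err := db.Query(`select col from test`)
	if err != nil {
		log.Fatal(err)
	}
	defer rows.Close()
	for rows.Next() {
		var c string
		if err := rows.Scan(&c); err != nil {
			log.Fatal(err)
		}
		fmt.Println(c)
	}
}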

Related

Sending binaries or strings by a client socket

I'm studying networks, and I'm writing a TCP server in Go. One of the challenges I'm working on is to send binaries or strings from a client socket to a server, save the server's output to a txt file, and compare it to the original data that was sent.
The problem is that the binaries do not arrive completely on the server.
Server
package main

import (
	"fmt"
	"io"
	"log"
	"net"
)

func main() {
	l, err := net.Listen("tcp", ":8000")
	if nil != err {
		log.Println(err)
	}
	defer l.Close()

	for {
		conn, err := l.Accept()
		if nil != err {
			log.Println(err)
			continue
		}
		defer conn.Close()
		go ConnHandler(conn)
	}
}

func ConnHandler(conn net.Conn) {
	recvBuf := make([]byte, 4096)
	for {
		n, err := conn.Read(recvBuf)
		if nil != err {
			if io.EOF == err {
				log.Println(err)
				return
			}
			log.Println(err)
			return
		}
		if 0 < n {
			data := recvBuf[:n]
			fmt.Println(string(data))
		}
	}
}
Client
package main

import (
	"fmt"
	"log"
	"net"
)

func main() {
	conn, err := net.Dial("tcp", ":8000")
	if nil != err {
		log.Println(err)
	}

	var s string
	fmt.Scanln(&s)
	conn.Write([]byte(s))
	conn.Close()
}
I'm generating the binary file with this command on Linux:
head -c100000 /dev/urandom > binary_message.txt
I run the server:
./server > result.txt
And I send this data by the client using:
./client < binary_data.txt
In the end the file binary_data.txt has 98KB, but result.txt only has 0KB.
The problem is with scanning the binary data from standard input. You didn't see it because the errors were ignored rather than printed or otherwise handled: fmt.Scanln returns an error (and so does conn.Write), and you should always check them. fmt.Scanln reads a single whitespace-delimited token, and random binary data is full of space and newline bytes, so only a small chunk of the file is ever read and sent.
I rewrote the client to load the file from disk itself, as I don't think stdin is a good fit for binary data.
package main

import (
	"flag"
	"io"
	"log"
	"net"
	"os"
)

var fileName = flag.String("file", "", "file to send")

func main() {
	flag.Parse()

	conn, err := net.Dial("tcp", ":8000")
	if nil != err {
		log.Println(err)
	}
	defer conn.Close()

	f, err := os.Open(*fileName)
	if err != nil {
		log.Println(err)
		return
	}
	defer f.Close()

	b := make([]byte, 1024)
	for {
		n, err := f.Read(b)
		if err != nil {
			if err == io.EOF {
				log.Println("Done sending")
				return
			}
			log.Println(err)
			return
		}
		if _, err := conn.Write(b[:n]); err != nil {
			log.Println(err)
			return
		}
	}
}
You can use it with:
go run . -file=binary_message.txt
or if you have built the binary:
./client -file=binary_message.txt
I suggest you do the same for the server: open a file for writing and write the received binary data into it, using a flag to pass in the filename to write to. That will be cleaner than piping stdout to a file.
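As a minimal sketch of that idea (assuming the same -file flag convention as the client and, for simplicity, handling a single connection without the goroutine from the original server):

package main

import (
	"flag"
	"io"
	"log"
	"net"
	"os"
)

var outName = flag.String("file", "received.bin", "file to write received data to")

func main() {
	flag.Parse()

	l, err := net.Listen("tcp", ":8000")
	if err != nil {
		log.Fatal(err)
	}
	defer l.Close()

	conn, err := l.Accept()
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()

	// Create the output file named by the -file flag.
	out, err := os.Create(*outName)
	if err != nil {
		log.Fatal(err)
	}
	defer out.Close()

	// Copy everything the client sends into the file.
	if _, err := io.Copy(out, conn); err != nil {
		log.Println(err)
	}
	log.Println("done receiving")
}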

Can I read Google Sheet as CSV file

Using Go, is there a way that I can read the data saved in Google Sheets as a CSV file, without downloading an offline copy of the file?
Yes, this is possible, with the steps below.
In Google Sheets:
Publish the sheet under consideration as a CSV file via File -> Publish to the web, and make sure to select the option "Automatically republish when changes are made".
Copy the link Google Sheets provides for the CSV output.
In Go:
Use the code below:
// file main.go
package main

import (
	"encoding/csv"
	"fmt"
	"net/http"
)

func readCSVFromURL(url string) ([][]string, error) {
	resp, err := http.Get(url)
	if err != nil {
		return nil, err
	}
	defer resp.Body.Close()

	reader := csv.NewReader(resp.Body)
	reader.Comma = ','
	data, err := reader.ReadAll()
	if err != nil {
		return nil, err
	}

	return data, nil
}

func main() {
	url := "https://docs.google.com/spreadsheets/d/e/xxxxxsingle=true&output=csv"
	data, err := readCSVFromURL(url)
	if err != nil {
		panic(err)
	}

	for idx, row := range data {
		// skip header
		if idx == 0 {
			continue
		}
		if idx == 6 {
			break
		}
		fmt.Println(row[2])
	}
}

Can't find a public file from url in go

I am trying to get the content of a publicly available file using ioutil.ReadFile() but it doesn't find the file: panic: open http://www.pdf995.com/samples/pdf.pdf: No such file or directory
Here's my code:
// Reading and writing files are basic tasks needed for
// many Go programs. First we'll look at some examples of
// reading files.
package main

import (
	"fmt"
	"io/ioutil"
)

// Reading files requires checking most calls for errors.
// This helper will streamline our error checks below.
func check(e error) {
	if e != nil {
		panic(e)
	}
}

func main() {
	fileInUrl, err := ioutil.ReadFile("http://www.pdf995.com/samples/pdf.pdf")
	if err != nil {
		panic(err)
	}
	fmt.Printf("HERE --- fileInUrl: %+v", fileInUrl)
}
Here's a go playground example
ioutil.ReadFile() does not support HTTP.
If you look at the source code (https://golang.org/src/io/ioutil/ioutil.go?s=1503:1549#L42), you can see that it opens the file with os.Open, so it only works with local files.
You can do it like this instead:
package main

import (
	"io"
	"net/http"
	"os"
)

func main() {
	fileUrl := "http://www.pdf995.com/samples/pdf.pdf"
	if err := DownloadFile("example.pdf", fileUrl); err != nil {
		panic(err)
	}
}

func DownloadFile(filepath string, url string) error {
	// Get the data
	resp, err := http.Get(url)
	if err != nil {
		return err
	}
	defer resp.Body.Close()

	// Create the file
	out, err := os.Create(filepath)
	if err != nil {
		return err
	}
	defer out.Close()

	// Write the body to file
	_, err = io.Copy(out, resp.Body)
	return err
}
Note that the Go Playground does not allow network access (you get an error like dial tcp: Protocol not available), so you have to run this on your own PC.
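If what you actually want is the file contents in a byte slice (as in the original ioutil.ReadFile attempt) rather than a file on disk, a minimal sketch under that assumption is:

package main

import (
	"fmt"
	"io/ioutil"
	"net/http"
)

func main() {
	resp, err := http.Get("http://www.pdf995.com/samples/pdf.pdf")
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	// Read the whole response body into memory.
	fileInUrl, err := ioutil.ReadAll(resp.Body)
	if err != nil {
		panic(err)
	}
	fmt.Printf("downloaded %d bytes\n", len(fileInUrl))
}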

How to read a text file? [duplicate]

This question already has answers here: How can I read a whole file into a string variable (7 answers). Closed 4 years ago.
I'm trying to read "file.txt" and put the contents into a variable using Golang. Here is what I've tried...
package main

import (
	"fmt"
	"os"
	"log"
)

func main() {
	file, err := os.Open("file.txt")
	if err != nil {
		log.Fatal(err)
	}
	fmt.Print(file)
}
The file opens successfully, and os.Open returns a value of type *os.File.
It depends on what you are trying to do.
file, err := os.Open("file.txt")
fmt.Print(file)
The reason it outputs something like &{0xc082016240} is that you are printing the pointer value of a file descriptor (*os.File), not the file content. To obtain the file content, you have to read from that file descriptor.
To read all of the file content (as bytes) into memory, use ioutil.ReadAll:
package main

import (
	"fmt"
	"io/ioutil"
	"log"
	"os"
)

func main() {
	file, err := os.Open("file.txt")
	if err != nil {
		log.Fatal(err)
	}
	defer func() {
		if err = file.Close(); err != nil {
			log.Fatal(err)
		}
	}()

	b, err := ioutil.ReadAll(file)
	if err != nil {
		log.Fatal(err)
	}
	fmt.Print(b)
}
But sometimes, if the file is big, it may be more memory-efficient to read it in fixed-size chunks using a buffer, via the io.Reader.Read implementation of *os.File:
package main

import (
	"fmt"
	"io"
	"log"
	"os"
)

func main() {
	file, err := os.Open("file.txt")
	if err != nil {
		log.Fatal(err)
	}
	defer func() {
		if err = file.Close(); err != nil {
			log.Fatal(err)
		}
	}()

	buf := make([]byte, 32*1024) // define your buffer size here.
	for {
		n, err := file.Read(buf)
		if n > 0 {
			fmt.Print(buf[:n]) // your read buffer.
		}
		if err == io.EOF {
			break
		}
		if err != nil {
			log.Printf("read %d bytes: %v", n, err)
			break
		}
	}
}
Otherwise, you can use the standard bufio package and its Scanner type. A Scanner reads your file token by token, where tokens are delimited by a separator.
By default, the scanner advances token by token on newlines, i.e. line by line. You can customise how the scanner tokenises your file (see the bufio tests to learn how, and the short word-splitting sketch after the snippet below).
package main

import (
	"bufio"
	"fmt"
	"log"
	"os"
)

func main() {
	file, err := os.Open("file.txt")
	if err != nil {
		log.Fatal(err)
	}
	defer func() {
		if err = file.Close(); err != nil {
			log.Fatal(err)
		}
	}()

	scanner := bufio.NewScanner(file)
	for scanner.Scan() { // internally, it advances the token based on the separator
		fmt.Println(scanner.Text())  // token as a string
		fmt.Println(scanner.Bytes()) // token as raw bytes
	}
}
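As a small illustration of the customisation mentioned above, this variant splits the same file into whitespace-separated words instead of lines (file.txt is assumed, as before):

package main

import (
	"bufio"
	"fmt"
	"log"
	"os"
)

func main() {
	file, err := os.Open("file.txt")
	if err != nil {
		log.Fatal(err)
	}
	defer file.Close()

	scanner := bufio.NewScanner(file)
	scanner.Split(bufio.ScanWords) // tokenise on whitespace instead of newlines
	for scanner.Scan() {
		fmt.Println(scanner.Text()) // one word per token
	}
	if err := scanner.Err(); err != nil {
		log.Fatal(err)
	}
}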
Lastly, I would also like to point you to this awesome site: the go-lang file cheatsheet. It covers pretty much everything related to working with files in Go; I hope you'll find it useful.

Is it possible to extract a tar.xz package in golang?

Is it possible to extract a tar.xz package in Go? My understanding is that it should be possible to use the archive/tar library and feed it the output of an xz Go library.
I recently created an XZ decompression package so it is now
possible to extract a tar.xz using only Go code.
The following code extracts the file myfile.tar.xz to the current
directory:
package main

import (
	"archive/tar"
	"fmt"
	"io"
	"log"
	"os"

	"github.com/xi2/xz"
)

func main() {
	// Open a file
	f, err := os.Open("myfile.tar.xz")
	if err != nil {
		log.Fatal(err)
	}
	// Create an xz Reader
	r, err := xz.NewReader(f, 0)
	if err != nil {
		log.Fatal(err)
	}
	// Create a tar Reader
	tr := tar.NewReader(r)
	// Iterate through the files in the archive.
	for {
		hdr, err := tr.Next()
		if err == io.EOF {
			// end of tar archive
			break
		}
		if err != nil {
			log.Fatal(err)
		}
		switch hdr.Typeflag {
		case tar.TypeDir:
			// create a directory
			fmt.Println("creating: " + hdr.Name)
			err = os.MkdirAll(hdr.Name, 0777)
			if err != nil {
				log.Fatal(err)
			}
		case tar.TypeReg, tar.TypeRegA:
			// write a file
			fmt.Println("extracting: " + hdr.Name)
			w, err := os.Create(hdr.Name)
			if err != nil {
				log.Fatal(err)
			}
			_, err = io.Copy(w, tr)
			if err != nil {
				log.Fatal(err)
			}
			w.Close()
		}
	}
	f.Close()
}
http://golang.org/pkg/archive/tar/#example_
Also, you can simply shell out to the tar command:
import "os/exec"

cmd := exec.Command("tar", "-xf", "/your/archive.tar.xz")
err := cmd.Run()
There is no Lempel-Ziv-Markov (LZMA/xz) encoder or decoder in the Go standard library. If you are allowed to assume that the platform your code runs on provides the xz utility, you could use stub functions like these:
import "os/exec"
// decompress xz compressed data stream r.
func UnxzReader(r io.Reader) (io.ReadCloser, error) {
unxz := exec.Command("xz", "-d")
unxz.Stdin = r
out, err := unxz.StdoutPipe()
if err != nil {
return nil, err
}
err = unxz.Start()
if err != nil {
return nil, err
}
// we are not interested in the exit status, but we should really collect
// that zombie process
go unxz.Wait()
return out, nil
}
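For completeness, a hypothetical caller in the same package could combine UnxzReader with archive/tar like this; the file name is made up, and the example only lists the entries rather than extracting them:

package main

import (
	"archive/tar"
	"fmt"
	"io"
	"log"
	"os"
)

func main() {
	f, err := os.Open("myfile.tar.xz")
	if err != nil {
		log.Fatal(err)
	}
	defer f.Close()

	// Decompress through the external xz utility.
	xzr, err := UnxzReader(f)
	if err != nil {
		log.Fatal(err)
	}
	defer xzr.Close()

	// List the archive entries.
	tr := tar.NewReader(xzr)
	for {
		hdr, err := tr.Next()
		if err == io.EOF {
			break
		}
		if err != nil {
			log.Fatal(err)
		}
		fmt.Println(hdr.Name)
	}
}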
