Golang AES encryption

I am new to Go and I am trying out the crypto package.
My code looks like:
package main

import "fmt"
import . "crypto/aes"

func main() {
    block, _ := NewCipher([]byte("randomkey"))
    var dst = []byte{}
    var src = []byte("senstive")
    block.Encrypt(dst, src)
    fmt.Println(string(src))
}
I get the following error:
panic: runtime error: invalid memory address or nil pointer dereference.
What am I doing wrong?
My code can be found at the Go playground here.

I fixed it:
package main

import "fmt"
import "crypto/aes"

func main() {
    bc, err := aes.NewCipher([]byte("key3456789012345"))
    if err != nil {
        fmt.Println(err)
    }
    fmt.Printf("The block size is %d\n", bc.BlockSize())
    var dst = make([]byte, 16)
    var src = []byte("sensitive1234567")
    bc.Encrypt(dst, src)
    fmt.Println(dst)
}
In general, you should check the returned errors and carefully read the documentation of every function you call. Also, AES is a block cipher: the key must be 16, 24, or 32 bytes long, and Encrypt operates on exactly one 16-byte block at a time, so both src and dst must be at least that size.
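If the real goal is to encrypt data that isn't exactly one block long, the usual approach is to wrap the block cipher in a mode from crypto/cipher such as GCM, which handles arbitrary-length plaintexts and authenticates the result. Here is a minimal sketch along those lines (not part of the original answer; the 16-byte key and the plaintext are just placeholders):

package main

import (
    "crypto/aes"
    "crypto/cipher"
    "crypto/rand"
    "fmt"
    "io"
)

func main() {
    key := []byte("key3456789012345") // 16 bytes selects AES-128
    block, err := aes.NewCipher(key)
    if err != nil {
        panic(err)
    }

    // GCM wraps the block cipher so plaintext of any length can be sealed.
    gcm, err := cipher.NewGCM(block)
    if err != nil {
        panic(err)
    }

    // A fresh random nonce is required for every message.
    nonce := make([]byte, gcm.NonceSize())
    if _, err := io.ReadFull(rand.Reader, nonce); err != nil {
        panic(err)
    }

    ciphertext := gcm.Seal(nil, nonce, []byte("sensitive data of any length"), nil)
    fmt.Printf("%x\n", ciphertext)

    plaintext, err := gcm.Open(nil, nonce, ciphertext, nil)
    if err != nil {
        panic(err)
    }
    fmt.Println(string(plaintext))
}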

Related

How do I check the size of a Go project?

Is there an easy way to check the size of a Golang project? It's not an executable, it's a package that I'm importing in my own project.
You can see how big the compiled library binaries are by looking in the $GOPATH/pkg directory (if $GOPATH is not exported, Go defaults to $HOME/go).
For example, to check the size of some of the gorilla HTTP packages, install them first:
$ go get -u github.com/gorilla/mux
$ go get -u github.com/gorilla/securecookie
$ go get -u github.com/gorilla/sessions
The binary sizes in KB on my 64-bit macOS machine (darwin_amd64):
$ cd $GOPATH/pkg/darwin_amd64/github.com/gorilla/
$ du -k *
284 mux.a
128 securecookie.a
128 sessions.a
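(These figures assume the older GOPATH-based workflow; in module mode the compiled archives live in the build cache rather than under $GOPATH/pkg, so the paths above may not exist on newer setups.)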
EDIT:
Library (package) size is one thing, but how much space that takes up in your executable after the link stage can vary wildly. This is because packages have their own dependencies and with that comes extra baggage, but that baggage may be shared by other packages you import.
An example demonstrates this best:
empty.go:
package main
func main() {}
http.go:
package main
import "net/http"
var _ = http.Serve
func main() {}
mux.go:
package main
import "github.com/gorilla/mux"
var _ = mux.NewRouter
func main() {}
All 3 programs are functionally identical - executing zero user code - but their dependencies differ. The resulting binary sizes in KB:
$ du -k *
1028 empty
5812 http
5832 mux
What does this tell us? The core Go package net/http adds significant size to our executable. The mux package is not large by itself, but it has an import dependency on the net/http package, hence the significant file size for it too. Yet the delta between mux and http is only 20KB, whereas the listed file size of the mux.a library is 284KB. So we can't simply add the library package sizes to determine their true footprint.
Conclusion:
The Go linker strips out a lot of baggage from individual libraries during the build process, but to get a true sense of how much extra weight importing a given package adds, you have to look at all of that package's sub-dependencies as well.
Here is another solution that makes use of https://pkg.go.dev/golang.org/x/tools/go/packages
I took the example provided by the author and updated it slightly; the demonstration binary is available here.
package main

import (
    "flag"
    "fmt"
    "log"
    "os"
    "sort"

    "golang.org/x/tools/go/packages"
)

func main() {
    flag.Parse()

    // Many tools pass their command-line arguments (after any flags)
    // uninterpreted to packages.Load so that it can interpret them
    // according to the conventions of the underlying build system.
    cfg := &packages.Config{
        Mode: packages.NeedFiles |
            packages.NeedSyntax |
            packages.NeedImports,
    }
    pkgs, err := packages.Load(cfg, flag.Args()...)
    if err != nil {
        fmt.Fprintf(os.Stderr, "load: %v\n", err)
        os.Exit(1)
    }
    if packages.PrintErrors(pkgs) > 0 {
        os.Exit(1)
    }

    // Sum the sizes of the source files
    // for each package listed on the command line.
    var size int64
    for _, pkg := range pkgs {
        for _, file := range pkg.GoFiles {
            s, err := os.Stat(file)
            if err != nil {
                log.Println(err)
                continue
            }
            size += s.Size()
        }
    }
    fmt.Printf("size of %v is %v b\n", pkgs[0].ID, size)

    // Do the same again, this time over the whole import graph.
    size = 0
    for _, pkg := range allPkgs(pkgs) {
        for _, file := range pkg.GoFiles {
            s, err := os.Stat(file)
            if err != nil {
                log.Println(err)
                continue
            }
            size += s.Size()
        }
    }
    fmt.Printf("size of %v and deps is %v b\n", pkgs[0].ID, size)
}

func allPkgs(lpkgs []*packages.Package) []*packages.Package {
    var all []*packages.Package // postorder
    seen := make(map[*packages.Package]bool)
    var visit func(*packages.Package)
    visit = func(lpkg *packages.Package) {
        if !seen[lpkg] {
            seen[lpkg] = true
            // visit imports
            var importPaths []string
            for path := range lpkg.Imports {
                importPaths = append(importPaths, path)
            }
            sort.Strings(importPaths) // for determinism
            for _, path := range importPaths {
                visit(lpkg.Imports[path])
            }
            all = append(all, lpkg)
        }
    }
    for _, lpkg := range lpkgs {
        visit(lpkg)
    }
    return all
}
You can download all the imported modules with go mod vendor, then count the lines of all the .go files that aren't test files:
package main

import (
    "bytes"
    "fmt"
    "io/fs"
    "os"
    "os/exec"
    "path/filepath"
    "strings"
)

func count(mod string) int {
    imp := fmt.Sprintf("package main\nimport _ %q", mod)
    os.WriteFile("size.go", []byte(imp), os.ModePerm)
    exec.Command("go", "mod", "init", "size").Run()
    exec.Command("go", "mod", "vendor").Run()

    var count int
    filepath.WalkDir("vendor", func(s string, d fs.DirEntry, err error) error {
        if strings.HasSuffix(s, ".go") && !strings.HasSuffix(s, "_test.go") {
            data, err := os.ReadFile(s)
            if err != nil {
                return err
            }
            count += bytes.Count(data, []byte{'\n'})
        }
        return nil
    })
    return count
}

func main() {
    println(count("github.com/klauspost/compress/zstd"))
}
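A note on usage: this is a rough sketch that should be run from an empty scratch directory, since it overwrites size.go and creates its own go.mod and vendor/ tree, and it counts source lines rather than compiled bytes, so treat the number as an approximation of a dependency's weight.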

How to read packed binary data in Go?

I'm trying to figure out the best way to read a packed binary file in Go that was produced by Python like the following:
import struct

f = open('tst.bin', 'wb')
fmt = 'iih'  # please note this is packed binary: 4-byte int, 4-byte int, 2-byte int
f.write(struct.pack(fmt, 4, 185765, 1020))
f.write(struct.pack(fmt, 4, 185765, 1022))
f.close()
I have been tinkering with some of the examples I've seen on GitHub and a few other sources, but I can't seem to get anything working correctly (the update below shows a working method). What is the idiomatic way to do this sort of thing in Go?
UPDATE and WORKING
package main

import (
    "encoding/binary"
    "fmt"
    "io"
    "os"
)

func main() {
    fp, err := os.Open("tst.bin")
    if err != nil {
        panic(err)
    }
    defer fp.Close()

    lineBuf := make([]byte, 10) // 4-byte int, 4-byte int, 2-byte int per record
    for {
        _, err := fp.Read(lineBuf)
        if err == io.EOF {
            break
        }
        aVal := int32(binary.LittleEndian.Uint32(lineBuf[0:4])) // same as: int32(uint32(b[0]) | uint32(b[1])<<8 | uint32(b[2])<<16 | uint32(b[3])<<24)
        bVal := int32(binary.LittleEndian.Uint32(lineBuf[4:8]))
        cVal := int16(binary.LittleEndian.Uint16(lineBuf[8:10])) // same as: int16(uint16(b[8]) | uint16(b[9])<<8)
        fmt.Println(aVal, bVal, cVal)
    }
}
A quite portable and rather easy way to handle the problem is Google's Protocol Buffers. Though this comes too late now that you have it working, I took some effort in explaining and coding it, so I am posting it anyway.
You can find the code on https://github.com/mwmahlberg/ProtoBufDemo
You need to install the protocol buffer tooling for Python, using your preferred method (pip, OS package management, source), and for Go.
The .proto file
The .proto file is rather simple for our example. I called it data.proto
syntax = "proto2";

package main;

message Demo {
    required uint32 A = 1;
    required uint32 B = 2;
    // A shortcoming: no 16-bit ints.
    // We need to enforce this in the applications.
    required uint32 C = 3;
}
Now you need to call protoc on the file and have it provide the code for both Python and Go:
protoc --go_out=. --python_out=. data.proto
which generates the files data_pb2.py and data.pb.go. Those files provide the language-specific access to the protocol buffer data.
When using the code from GitHub, all you need to do is run
go generate
in the source directory.
The Python code
import data_pb2

def main():
    # We create an instance of the message type "Demo"...
    data = data_pb2.Demo()
    # ...and fill it with data
    data.A = long(5)
    data.B = long(5)
    data.C = long(2015)

    print "* Python writing to file"
    f = open('tst.bin', 'wb')
    # Note that "data.SerializeToString()", despite its name,
    # returns binary data
    f.write(data.SerializeToString())
    f.close()

    f = open('tst.bin', 'rb')
    read = data_pb2.Demo()
    read.ParseFromString(f.read())
    f.close()

    print "* Python reading from file"
    print "\tDemo.A: %d, Demo.B: %d, Demo.C: %d" % (read.A, read.B, read.C)

if __name__ == '__main__':
    main()
We import the file generated by protoc and use it. Not much magic here.
The Go File
package main

//go:generate protoc --python_out=. data.proto
//go:generate protoc --go_out=. data.proto

import (
    "fmt"
    "os"

    "github.com/golang/protobuf/proto"
)

func main() {
    // Note that we do not handle any errors for the sake of brevity
    d := Demo{}
    f, _ := os.Open("tst.bin")
    fi, _ := f.Stat()

    // We create a buffer which is big enough to hold the entire message
    b := make([]byte, fi.Size())
    f.Read(b)
    proto.Unmarshal(b, &d)

    fmt.Println("* Go reading from file")
    // Note the explicit pointer dereferences, as the proto2 fields are pointers
    fmt.Printf("\tDemo.A: %d, Demo.B: %d, Demo.C: %d\n", *d.A, *d.B, *d.C)
}
Note that we do not need an explicit import for the generated code, as the package declared in data.proto is main.
The result
After generating the required files and compiling the sources, when you issue
$ python writer.py && ./ProtoBufDemo
the result is
* Python writing to file
* Python reading from file
Demo.A: 5, Demo.B: 5, Demo.C: 2015
* Go reading from file
Demo.A: 5, Demo.B: 5, Demo.C: 2015
Note that the Makefile in the repository offers a shortcut for generating the code, compiling the .go files and running both programs:
make run
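One loose end from the comment in data.proto: proto2 has no 16-bit integer type, so keeping C within two bytes is up to the applications. A rough sketch of such a check on the Go side might look like the following (checkDemo is a hypothetical helper, not part of the repository; GetC() is the accessor the Go protobuf generator emits for the field):

import (
    "fmt"
    "math"
)

// checkDemo rejects messages whose C field would not fit into the
// original 2-byte slot of the packed binary format.
func checkDemo(d *Demo) error {
    if d.GetC() > math.MaxUint16 {
        return fmt.Errorf("Demo.C out of 16-bit range: %d", d.GetC())
    }
    return nil
}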
The Python format string is iih, meaning two 32-bit signed integers and one 16-bit signed integer (see the docs). You can simply use your first example but change the struct to:
type binData struct {
    A int32
    B int32
    C int16
}

func main() {
    fp, err := os.Open("tst.bin")
    if err != nil {
        panic(err)
    }
    defer fp.Close()

    for {
        thing := binData{}
        err := binary.Read(fp, binary.LittleEndian, &thing)
        if err == io.EOF {
            break
        }
        fmt.Println(thing.A, thing.B, thing.C)
    }
}
Note that the Python packing didn't specify the endianness explicitly, but if you're sure the system that ran it generated little-endian binary, this should work.
Edit: Added main() function to explain what I mean.
Edit 2: Capitalized struct fields so binary.Read could write into them.
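If you also need to produce the packed file from Go instead of Python, the same struct works in the other direction with binary.Write, which lays the fields out back to back with no padding. A minimal sketch (not from the original answers; the sample values mirror the Python snippet in the question):

package main

import (
    "encoding/binary"
    "os"
)

type binData struct {
    A int32
    B int32
    C int16
}

func main() {
    fp, err := os.Create("tst.bin")
    if err != nil {
        panic(err)
    }
    defer fp.Close()

    records := []binData{
        {4, 185765, 1020},
        {4, 185765, 1022},
    }
    for _, r := range records {
        // Matches Python's struct.pack('iih', ...) on a little-endian system.
        if err := binary.Write(fp, binary.LittleEndian, &r); err != nil {
            panic(err)
        }
    }
}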
As I mentioned in my post, I'm not sure this is THE idiomatic way to do this in Go, but this is the solution I came up with after a fair bit of tinkering and adapting several different examples. Note again that it unpacks the 4-byte and 2-byte ints into Go int32 and int16 respectively. I'm posting it so that there is a valid answer in case someone comes looking; hopefully someone will post a more idiomatic way of accomplishing this, but for now, this works.
package main

import (
    "encoding/binary"
    "fmt"
    "io"
    "os"
)

func main() {
    fp, err := os.Open("tst.bin")
    if err != nil {
        panic(err)
    }
    defer fp.Close()

    lineBuf := make([]byte, 10) // 4-byte int, 4-byte int, 2-byte int per record
    for {
        _, err := fp.Read(lineBuf)
        if err == io.EOF {
            break
        }
        aVal := int32(binary.LittleEndian.Uint32(lineBuf[0:4])) // same as: int32(uint32(b[0]) | uint32(b[1])<<8 | uint32(b[2])<<16 | uint32(b[3])<<24)
        bVal := int32(binary.LittleEndian.Uint32(lineBuf[4:8]))
        cVal := int16(binary.LittleEndian.Uint16(lineBuf[8:10])) // same as: int16(uint16(b[8]) | uint16(b[9])<<8)
        fmt.Println(aVal, bVal, cVal)
    }
}
Try the binpacker library.
Example:
Example data:
buffer := new(bytes.Buffer)
packer := binpacker.NewPacker(buffer)
unpacker := binpacker.NewUnpacker(buffer)
packer.PushByte(0x01)
packer.PushUint16(math.MaxUint16)
Unpack:
var val1 byte
var val2 uint16
var err error
val1, err = unpacker.ShiftByte()
val2, err = unpacker.ShiftUint16()
Or:
var val1 byte
var val2 uint16
var err error
unpacker.FetchByte(&val1).FetchUint16(&val2)
unpacker.Error() // Make sure error is nil

Difference between []uint8 && []byte (Golang Slices)

One of the functions I am running: image.Decode()
The image.Decode function takes an io.Reader, and the reader is constructed from a []byte.
When I pass in a []uint8, it gives me this error:
panic: image: unknown format
How do I convert the []uint8 to []byte?
UPDATE
The error is happening at the starred area because image.Decode can't read the variable xxx.
package main

import (
    "bytes"
    "encoding/json"
    "fmt"
    "image"
    "image/jpeg"
    "io/ioutil"
    "os"
    "reflect"

    "github.com/nfnt/resize"
    "launchpad.net/goamz/aws"
    "launchpad.net/goamz/s3"
)

type Data struct {
    Key string
}

func main() {
    useast := aws.USEast
    connection := s3.New(auth, useast)
    mybucket := connection.Bucket("bucketName")
    image_data, err := mybucket.Get("1637563605030")
    if err != nil {
        panic(err.Error())
    } else {
        fmt.Println("success")
    }

    xxx := []byte(image_data)

    // ******* THIS IS WHERE THE ERROR OCCURS *******
    original_image, _, err := image.Decode(bytes.NewReader(xxx))
    // ******* THIS IS WHERE THE ERROR OCCURS END *******
    if err != nil {
        fmt.Println("Shit")
        panic(err.Error())
    } else {
        fmt.Println("Another success")
    }

    new_image := resize.Resize(160, 0, original_image, resize.Lanczos3)
    if new_image != nil {
        fmt.Println("YAY")
    }
}
The Go Programming Language Specification
Numeric types
uint8 the set of all unsigned 8-bit integers (0 to 255)
byte alias for uint8
package main

import "fmt"

func ByteSlice(b []byte) []byte { return b }

func main() {
    b := []byte{0, 1}
    u8 := []uint8{2, 3}
    fmt.Printf("%T %T\n", b, u8)
    fmt.Println(ByteSlice(b))
    fmt.Println(ByteSlice(u8))
}
Output:
[]uint8 []uint8
[0 1]
[2 3]
You have misdiagnosed your problem.
As the other answers have explained, there's no problem passing a []uint8 where a []byte is required. If this was your problem, you'd be getting a compile time error. You aren't. A panic is a runtime error, and it's being thrown by the image library when it reads the data in the slice.
In fact, the image library is only partially your problem. See http://golang.org/src/pkg/image/format.go. It returns an error because it doesn't recognize the image format of the data in the slice. Your code, which calls image.Decode(), then panics when image.Decode() returns that error.
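A related thing worth checking (not something the original answer covers): image.Decode only recognizes formats whose decoders have been registered, which happens in the init functions of packages like image/jpeg and image/png, so those packages must be imported (blank imports are enough). A small sketch for probing whether a byte slice is in a registered format, with placeholder bytes standing in for the real data:

package main

import (
    "bytes"
    "fmt"
    "image"

    // Importing the format packages registers their decoders with image.
    _ "image/gif"
    _ "image/jpeg"
    _ "image/png"
)

func main() {
    data := []byte{0xff, 0xd8, 0xff} // placeholder; use the real image bytes here
    // DecodeConfig reads only the header, a cheap way to check the format
    // before doing a full decode.
    cfg, format, err := image.DecodeConfig(bytes.NewReader(data))
    if err != nil {
        fmt.Println("not a recognized image:", err)
        return
    }
    fmt.Println(format, cfg.Width, cfg.Height)
}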
If you have a variable imageData of type []uint8, you may pass []byte(imageData).
See http://golang.org/ref/spec#Conversions

How to scan a big.Int from standard input in Go

Is there a way to scan a big.Int directly from the standard input in Go? Right now I'm doing this:
package main

import (
    "fmt"
    "math/big"
)

func main() {
    w := new(big.Int)
    var s string
    fmt.Scan(&s)
    fmt.Sscan(s, w)
    fmt.Println(w)
}
I also could have used .SetString. But, is there a way to Scan the big.Int directly from the standard input without scanning a string or an integer first?
For example,
package main

import (
    "fmt"
    "math/big"
)

func main() {
    w := new(big.Int)
    n, err := fmt.Scan(w)
    fmt.Println(n, err)
    fmt.Println(w.String())
}
Input (stdin):
295147905179352825857
Output (stdout):
1 <nil>
295147905179352825857
As far as I know - no, there's no other way. In fact, what you've got is the default example they have for scanning big.Int in the documentation.
package main

import (
    "fmt"
    "log"
    "math/big"
)

func main() {
    // The Scan function is rarely used directly;
    // the fmt package recognizes it as an implementation of fmt.Scanner.
    i := new(big.Int)
    _, err := fmt.Sscan("18446744073709551617", i)
    if err != nil {
        log.Println("error scanning value:", err)
    } else {
        fmt.Println(i)
    }
}
You can see the relevant section here - http://golang.org/pkg/math/big/#Int.Scan

How to update old printer.Fprint code to run on the latest version of Go

str := new(bytes.Buffer)          // old code
printer.Fprint(str, c)            // old code

str := new(token.FileSet)         // new code
printer.Fprint(os.Stdout, str, c) // new code

source += "\t" + str.String() + ";\n"
In this code I tried to change str from new(bytes.Buffer) to new(token.FileSet), because the newer Fprint requires:
func Fprint(output io.Writer, fset *token.FileSet, node interface{}) os.Error // latest version
Now I'm stuck on an error at str.String(), because *token.FileSet doesn't have a String() method.
I can't update my code to run on the latest version of Go because of the change to printer.Fprint(). How do I solve this?
Here's a sample program.
package main

import (
    "bytes"
    "fmt"
    "go/parser"
    "go/printer"
    "go/token"
)

func main() {
    const src = `package main
func main() {}
`
    fset := token.NewFileSet()
    ast, err := parser.ParseFile(fset, "", src, parser.ParseComments)
    if err != nil {
        panic(err)
    }
    var buf bytes.Buffer
    printer.Fprint(&buf, fset, ast)
    fmt.Print(buf.String())
}
Output:
package main
func main() {}
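For completeness: the sample uses a bytes.Buffer as the io.Writer so that the printed source can still be retrieved as a string via buf.String(), the way the old code used str.String(); the *token.FileSet is an additional argument to the new Fprint, not a replacement for the output writer.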
