Convert os.Stdin to []byte

I'm trying to implement a small chat server in Go with end-to-end encryption. Starting from the server example at https://github.com/adonovan/gopl.io/tree/master/ch8/chat and the client example at https://github.com/adonovan/gopl.io/blob/master/ch8/netcat3/netcat.go, I stumbled upon https://www.thepolyglotdeveloper.com/2018/02/encrypt-decrypt-data-golang-application-crypto-packages/ for encrypting and decrypting in Go.
The function to encrypt:
func encrypt(data []byte, passphrase string) []byte {
    block, _ := aes.NewCipher([]byte(createHash(passphrase)))
    gcm, err := cipher.NewGCM(block)
    if err != nil {
        panic(err.Error())
    }
    nonce := make([]byte, gcm.NonceSize())
    if _, err = io.ReadFull(rand.Reader, nonce); err != nil {
        panic(err.Error())
    }
    ciphertext := gcm.Seal(nonce, nonce, data, nil)
    return ciphertext
}
in func main():
ciphertext := encrypt([]byte(os.Stdin), "password")
mustCopy(conn, ciphertext)
conn.Close()
os.Stdin is an *os.File, but the function needs a []byte. The solution should go through an io.Reader or a buffer, but I can't find a working one.
I tried
bytes.NewBuffer([]byte(os.Stdin))
and
reader := bytes.NewReader(os.Stdin)
Any input is more than welcome. Sorry if I'm not seeing the obvious problem/solution here, as I'm fairly new.

os.Stdin is an io.Reader. You can't convert it to a []byte, but you can read from it, and the data you read can be stored in a []byte.
Since most terminals deliver data from os.Stdin line by line, you should read a complete line; the read may block until a full line is available.
There are many ways to do that; one of them is bufio.Scanner.
This is how you can do it:
scanner := bufio.NewScanner(os.Stdin)
if !scanner.Scan() {
    log.Printf("Failed to read: %v", scanner.Err())
    return
}
line := scanner.Bytes() // line is of type []byte, exactly what you need
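If you then want to send each encrypted line to the server, a rough sketch building on the snippets above (conn, encrypt and the "password" passphrase are assumed from your code) could look like this:
for scanner.Scan() {
    // scanner.Bytes() returns the current line as []byte, ready for encrypt.
    ciphertext := encrypt(scanner.Bytes(), "password")
    if _, err := conn.Write(ciphertext); err != nil {
        log.Fatal(err)
    }
}
if err := scanner.Err(); err != nil {
    log.Fatal(err)
}
conn.Close()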

Related

Processing data in chunks with io.ReadFull results in corrupted file?

I'm trying to download and decrypt HLS streams by using io.ReadFull to process the data in chunks to conserve memory:
Irrelevant parts of the code have been left out for simplicity.
func main() {
    f, _ := os.Create("out.ts")
    for _, v := range mediaPlaylist {
        resp, _ := http.Get(v.URI)
        for {
            r, err := decryptHLS(key, iv, resp.Body)
            if err != nil && err == io.EOF {
                break
            } else if err != nil && err != io.ErrUnexpectedEOF {
                panic(err)
            }
            io.Copy(f, r)
        }
    }
}
func decryptHLS(key []byte, iv []byte, r io.Reader) (io.Reader, error) {
    block, _ := aes.NewCipher(key)
    buf := make([]byte, 8192)
    mode := cipher.NewCBCDecrypter(block, iv)
    n, err := io.ReadFull(r, buf)
    if err != nil && err != io.ErrUnexpectedEOF {
        return nil, err
    }
    mode.CryptBlocks(buf, buf)
    return bytes.NewReader(buf[:n]), err
}
At first this seems to work: the file size is correct and there are no errors during download, but the video is corrupted. Not completely, as the file is still recognized as a video, but both image and sound are distorted.
If I change the code to use ioutil.ReadAll instead, the final video files will no longer be corrupted:
func main() {
    f, _ := os.Create("out.ts")
    for _, v := range mediaPlaylist {
        resp, _ := http.Get(v.URI)
        segment, _ := ioutil.ReadAll(resp.Body)
        r := decryptHLS(key, iv, &segment)
        io.Copy(f, r)
    }
}
func decryptHLS(key []byte, iv []byte, s *[]byte) io.Reader {
    block, _ := aes.NewCipher(key)
    mode := cipher.NewCBCDecrypter(block, iv)
    mode.CryptBlocks(*s, *s)
    return bytes.NewReader(*s)
}
Any ideas why it works correctly when reading the entire segment into memory, and not when using io.ReadFull and processing it in chunks?
Internally, CBCDecrypter makes its own copy of the IV and updates that copy as it decrypts. Because a new decrypter is created for every chunk, each chunk is decrypted starting from the initial IV instead of from the chaining state (the previous ciphertext block) left behind by the chunk before it, which corrupts the output.
Create the decrypter once, and you should be able to keep re-using it to decrypt chunk by chunk (assuming each chunk's size is a multiple of the cipher's block size).
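A rough sketch of that fix, assuming key, iv, resp.Body and f from the question (and, as in the original, that the stream length is a multiple of aes.BlockSize):
block, err := aes.NewCipher(key)
if err != nil {
    panic(err)
}
mode := cipher.NewCBCDecrypter(block, iv) // created once, so the CBC chain survives across chunks
buf := make([]byte, 8192)                 // 8192 is a multiple of aes.BlockSize (16)
for {
    n, err := io.ReadFull(resp.Body, buf)
    if n > 0 {
        mode.CryptBlocks(buf[:n], buf[:n])
        f.Write(buf[:n])
    }
    if err == io.EOF || err == io.ErrUnexpectedEOF {
        break
    } else if err != nil {
        panic(err)
    }
}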

Best pattern to create sha256 from file and store file

I am writing a webserver that receives a file upload as multipart/form-data. I am generating the file's SHA-256 from the request, but due to the nature of the Reader interface, I can't reuse the data to also upload the file to a filer. These files can be a few hundred MBs. What is the best way to store the content? I could duplicate the contents, but I am worried that would be wasteful of memory.
EDIT
func uploadFile(w http.ResponseWriter, r *http.Request) {
    f, err := r.MultipartForm.File["capture"][0].Open()
    if err != nil {
        http.Error(w, err.Error(), http.StatusInternalServerError)
        return
    }
    defer f.Close()
    hash, err := createSha(f)
    if err != nil {
        fmt.Println(err.Error())
        return
    }
}
func createSha(image multipart.File) (hash.Hash, error) {
    sha := sha256.New()
    // This causes the contents of image to no longer be available to be read again, to be stored on the filer
    if _, err := io.Copy(sha, image); err != nil {
        return nil, err
    }
    return sha, nil
}
You might use io.MultiWriter(...) to duplicate the data to multiple output streams in a single pass, such as a hash and some remote writer.
For example (roughly):
sha := sha256.New()
filer := filer.New(...) // Some Writer that stores the bytes for you?
_, err := io.Copy(io.MultiWriter(sha, filer), r)
// TODO: handle error
// Now sha.Sum(nil) has the file digest and "filer" got sent all the bytes.
Note that io.MultiWriter can take as many writers as you want, so you could compute additional hashes at the same time (e.g. md5, sha1, etc.) or even send the file to multiple locations, e.g.:
md5, sha1, sha256, sha512 := md5.New(), sha1.New(), sha256.New(), sha512.New()
s3Writer, gcsWriter := filer.NewS3Writer(), filer.NewGCSWriter()
mw := io.MultiWriter(s3Writer, gcsWriter, md5, sha1, sha256, sha512)
_, err := io.Copy(mw, r)
// TODO: handle error
// Now you've got all the hashes for the file and it's stored in the cloud.
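Wired into the handler from the question, a rough sketch (using a local file as a hypothetical stand-in for the filer):
func uploadFile(w http.ResponseWriter, r *http.Request) {
    f, err := r.MultipartForm.File["capture"][0].Open()
    if err != nil {
        http.Error(w, err.Error(), http.StatusInternalServerError)
        return
    }
    defer f.Close()
    dst, err := os.Create("upload.bin") // hypothetical local stand-in for the filer
    if err != nil {
        http.Error(w, err.Error(), http.StatusInternalServerError)
        return
    }
    defer dst.Close()
    sha := sha256.New()
    // One pass over the upload: every byte goes to both the hash and the file.
    if _, err := io.Copy(io.MultiWriter(sha, dst), f); err != nil {
        http.Error(w, err.Error(), http.StatusInternalServerError)
        return
    }
    fmt.Printf("sha256: %x\n", sha.Sum(nil))
}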

Base64 encode/decode results in corrupted output

I'm trying to write some convenience wrapper funcs that base64-encode and -decode byte slices. (I can't understand why this isn't provided more conveniently in the stdlib.)
However this code (in playground):
func b64encode(b []byte) []byte {
    encodedData := &bytes.Buffer{}
    encoder := base64.NewEncoder(base64.URLEncoding, encodedData)
    defer encoder.Close()
    encoder.Write(b)
    return encodedData.Bytes()
}
func b64decode(b []byte) ([]byte, error) {
    dec := base64.NewDecoder(base64.URLEncoding, bytes.NewReader(b))
    buf := &bytes.Buffer{}
    _, err := io.Copy(buf, dec)
    if err != nil {
        return nil, err
    }
    return buf.Bytes(), nil
}
func main() {
    b := []byte("hello")
    e := b64encode(b)
    d, err := b64decode(e)
    if err != nil {
        log.Fatalf("could not decode: %s", err)
    }
    fmt.Println(string(d))
}
func main() {
b := []byte("hello")
e := b64encode(b)
d, err := b64decode(e)
if err != nil {
log.Fatalf("could not decode: %s", err)
}
fmt.Println(string(d))
}
generates truncated output when I try to print it:
hel
What's going on?
The defer executes when the function ends, which is AFTER the return value has been evaluated. Since base64.NewEncoder buffers the final partial block until Close is called, encodedData.Bytes() is captured before those last bytes are flushed.
The following works: https://play.golang.org/p/sYn-W6fZh1
func b64encode(b []byte) []byte {
    encodedData := &bytes.Buffer{}
    encoder := base64.NewEncoder(base64.URLEncoding, encodedData)
    encoder.Write(b)
    encoder.Close()
    return encodedData.Bytes()
}
That being said, if it really is all in memory, you can avoid creating an encoder entirely. Instead, you can do something like:
func b64encode(b []byte) []byte {
    ret := make([]byte, base64.URLEncoding.EncodedLen(len(b)))
    base64.URLEncoding.Encode(ret, b)
    return ret
}
An added benefit of doing it this way is that it is more efficient, since it only needs to allocate once. It also lets you stop ignoring the errors returned by the Write and Close methods.
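A matching decode wrapper can be sketched the same way with DecodedLen and Decode (my helper, but standard encoding/base64 calls):
func b64decode(b []byte) ([]byte, error) {
    ret := make([]byte, base64.URLEncoding.DecodedLen(len(b)))
    n, err := base64.URLEncoding.Decode(ret, b)
    if err != nil {
        return nil, err
    }
    return ret[:n], nil // Decode reports how many bytes it actually wrote
}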

Read contents from what io.Writer writes

There's a library that exports a file, but I'd like to capture the contents of that file. I'd like to pass a writer to the library and be able to read what the writer wrote. Eventually I want to augment the library to skip writing this file.
Is this possible with io.Copy or io.Pipe?
The library code creates a *File and uses this handle as an io.Writer.
I tried using io.Copy but only 0 bytes were read.
func TestFileCopy(t *testing.T) {
    codeFile, err := os.Create("test.txt")
    if err != nil {
        t.Error(err)
    }
    defer codeFile.Close()
    codeFile.WriteString("Hello World")
    n, err := io.Copy(os.Stdout, codeFile)
    if err != nil {
        t.Error(err)
    }
    log.Println(n, "bytes")
}
If you want to capture the bytes as they are written, use an io.MultiWriter with a bytes.Buffer as the second writer.
var buf bytes.Buffer
w := io.MultiWriter(codeFile, &buf)
or to see the file on stdout as it's written:
w := io.MultiWriter(codeFile, os.Stdout)
Otherwise, if you want to re-read the same file, you need to seek back to the start after writing:
codeFile.Seek(0, io.SeekStart)
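Applied to the test from the question, a rough sketch of the MultiWriter approach:
func TestFileCopy(t *testing.T) {
    codeFile, err := os.Create("test.txt")
    if err != nil {
        t.Fatal(err)
    }
    defer codeFile.Close()
    var buf bytes.Buffer
    w := io.MultiWriter(codeFile, &buf)
    io.WriteString(w, "Hello World") // written to the file and captured in buf
    log.Println(buf.Len(), "bytes:", buf.String())
}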

Download a zip file using io.Pipe() read/write golang

I am trying to stream out the bytes of a zip file using the io.Pipe() function in Go. I am using the pipe reader to read the bytes of each file in the zip, then streaming those out, and using the pipe writer to write the bytes to the response object.
func main() {
    r, w := io.Pipe()
    // goroutine to make the write/read non-blocking
    go func() {
        defer w.Close()
        bytes, err := ReadBytesforEachFileFromTheZip()
        err = json.NewEncoder(w).Encode(bytes)
        handleErr(err)
    }()
This is not a working implementation but a structure of what I am trying to achieve. I don't want to use ioutil.ReadAll since the file is going to be very large and Pipe() will help me avoid bringing all the data into memory. Can someone help with a working implementation using io.Pipe() ?
I made it work using Go's io.Pipe(). The PipeWriter writes bytes to the pipe in chunks and the PipeReader reads them from the other end. The reason for using a goroutine is to have a non-blocking write operation while simultaneous reads happen from the pipe.
Note: it's important to close the pipe writer (w.Close()) to send EOF on the stream, otherwise the reader will never see the end of the stream.
func DownloadZip() ([]byte, error) {
    r, w := io.Pipe()
    defer r.Close()
    zip, err := os.Stat("temp.zip")
    if err != nil {
        return nil, err
    }
    go func() {
        // Closing the writer is what delivers EOF to the reading side.
        defer w.Close()
        f, err := os.Open(zip.Name())
        if err != nil {
            return
        }
        defer f.Close()
        buf := make([]byte, 1024)
        for {
            chunk, err := f.Read(buf)
            if err != nil && err != io.EOF {
                panic(err)
            }
            if chunk == 0 {
                break
            }
            if _, err := w.Write(buf[:chunk]); err != nil {
                return
            }
        }
    }()
    body, err := ioutil.ReadAll(r)
    if err != nil {
        return nil, err
    }
    return body, nil
}
Please let me know if someone has another way of doing it.
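One possible simplification (a sketch under the same temp.zip assumption): the manual read loop can be replaced with io.Copy, and CloseWithError propagates failures to the reading side:
go func() {
    defer w.Close()
    f, err := os.Open("temp.zip")
    if err != nil {
        w.CloseWithError(err) // the reader's ReadAll will return this error
        return
    }
    defer f.Close()
    if _, err := io.Copy(w, f); err != nil {
        w.CloseWithError(err)
    }
}()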
