I'm trying to convert an image to grayscale using Go.
I've found the code below; however, I'm struggling to understand it.
It would be extremely helpful if you could explain what each function is doing and where to define the incoming and outgoing file.
package main

import (
	"image"
	"image/color"
	_ "image/jpeg" // Register JPEG format
	"image/png"    // Register PNG format
	"log"
	"os"
)

// Converted implements image.Image, so you can
// pretend that it is the converted image.
type Converted struct {
	Img image.Image
	Mod color.Model
}

// We return the new color model...
func (c *Converted) ColorModel() color.Model {
	return c.Mod
}

// ... but the original bounds
func (c *Converted) Bounds() image.Rectangle {
	return c.Img.Bounds()
}

// At forwards the call to the original image and
// then asks the color model to convert it.
func (c *Converted) At(x, y int) color.Color {
	return c.Mod.Convert(c.Img.At(x, y))
}

func main() {
	if len(os.Args) != 3 {
		log.Fatalln("Needs two arguments")
	}

	infile, err := os.Open(os.Args[1])
	if err != nil {
		log.Fatalln(err)
	}
	defer infile.Close()

	img, _, err := image.Decode(infile)
	if err != nil {
		log.Fatalln(err)
	}

	// Since Converted implements image.Image, this is now a grayscale image
	gr := &Converted{img, color.GrayModel}
	// Or do something like this to convert it into a black and
	// white image.
	// bw := []color.Color{color.Black, color.White}
	// gr := &Converted{img, color.Palette(bw)}

	outfile, err := os.Create(os.Args[2])
	if err != nil {
		log.Fatalln(err)
	}
	defer outfile.Close()

	png.Encode(outfile, gr)
}
I'm quite new to Go so any suggestions or help would be appreciated.
So as Atomic_alarm pointed out, https://maxhalford.github.io/blog/halftoning-1/ explains how to do this succinctly.
But your question, if I understand correctly, is about the file opening and creation?
The first step is to use the image package to Decode the opened file into an image.Image value:
infile, err := os.Open("fullcolor.png")
if err != nil {
	return nil, err
}
defer infile.Close()

img, _, err := image.Decode(infile) // img -> image.Image
if err != nil {
	return nil, err
}
With this image.Image value, you can convert it to a grayscale image (image.Gray) and then, finally, write or encode the image to an outgoing file on disk:
outfile, _ := os.Create("grayscaled.png")
defer outfile.Close()
png.Encode(outfile, grayscaledImage) // grayscaledImage -> image.Gray
In between the infile opening and outfile creation, you have to, of course, convert the image to grayscale. Again, try the link above, and you'll find this function, which takes an image.Image and returns a pointer to an image.Gray:
func rgbaToGray(img image.Image) *image.Gray {
	var (
		bounds = img.Bounds()
		gray   = image.NewGray(bounds)
	)
	for x := 0; x < bounds.Max.X; x++ {
		for y := 0; y < bounds.Max.Y; y++ {
			var rgba = img.At(x, y)
			gray.Set(x, y, rgba)
		}
	}
	return gray
}
Concerning the code you provided (and your comment): you were opening the file named by os.Args[1] and creating the file named by os.Args[2]. os.Args is a slice of the arguments passed when running the program; index 0 is always the program itself, and whatever follows comes in at indices 1, 2, etc. The docs state:
Args hold the command-line arguments, starting with the program name.
var Args []string
so you would run your code above like this:
$ go run main.go infile.png outfile.png
infile.png must be a file on disk (inside the directory you are running the code from, or given as the complete path to the file).
What I have provided above doesn't use os.Args but rather hard-codes the file names into the program.
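Putting those pieces together, a complete program with the file names hard-coded would look something like the sketch below (it just reuses rgbaToGray from above; swap "fullcolor.png" and "grayscaled.png" for your own paths):

package main

import (
	"image"
	_ "image/jpeg" // registered so image.Decode can also read JPEG input
	"image/png"
	"log"
	"os"
)

func rgbaToGray(img image.Image) *image.Gray {
	bounds := img.Bounds()
	gray := image.NewGray(bounds)
	for x := 0; x < bounds.Max.X; x++ {
		for y := 0; y < bounds.Max.Y; y++ {
			gray.Set(x, y, img.At(x, y))
		}
	}
	return gray
}

func main() {
	infile, err := os.Open("fullcolor.png") // hard-coded input file
	if err != nil {
		log.Fatalln(err)
	}
	defer infile.Close()

	img, _, err := image.Decode(infile)
	if err != nil {
		log.Fatalln(err)
	}

	outfile, err := os.Create("grayscaled.png") // hard-coded output file
	if err != nil {
		log.Fatalln(err)
	}
	defer outfile.Close()

	if err := png.Encode(outfile, rgbaToGray(img)); err != nil {
		log.Fatalln(err)
	}
}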
Related
For some JPEG images, the EOI (End Of Image) marker is not the expected \xff\xd9; in my example I see \xff\x00, so I am trying to fix this using Go.
f, _ := os.Open("bad.jpeg")
img, _, err := image.Decode(f)
if err != nil {
	fmt.Println(err)
}
fmt.Println("successfully decoded")

opt := jpeg.Options{
	Quality: 100,
}
f1, _ := os.Create("good.jpeg")
jpeg.Encode(f1, img, &opt)
However, image.Decode(f) fails due to an unexpected EOF. I would like to know how to fix the ending problem for a badly formatted JPEG file.
With Python, I can simply do the following; open and save will automatically fix the EOI for me. Is there an equivalent way in Go?
from PIL import Image
im = Image.open("bad.jpeg")
im.save("good.jpeg", quality=100)
here is the image I am testing
Here is a fairly naive solution that only works for this very specific case:
Read the file and try to decode it. If decoding fails, check the last two bytes and overwrite the last one if it matches the known bad pattern. Try to decode again; if that succeeds, write the fixed bytes to the new file.
package main

import (
	"bytes"
	"image"
	_ "image/jpeg"
	"io/ioutil"
)

func main() {
	contents, err := ioutil.ReadFile("bad.jpeg")
	if err != nil {
		panic(err)
	}

	buffer := bytes.NewBuffer(contents)
	_, _, err = image.Decode(buffer)
	if err == nil {
		return
	}
	if err.Error() != "unexpected EOF" {
		panic(err)
	}

	// Maybe wrong End-Of-Image.
	if contents[len(contents)-1] == '\x00' && contents[len(contents)-2] == '\xff' {
		contents[len(contents)-1] = '\xd9'
	} else {
		panic("don't know what to do")
	}

	// Reset buffer and decode again.
	buffer = bytes.NewBuffer(contents)
	_, _, err = image.Decode(buffer)
	if err != nil {
		panic(err)
	}

	// Write fixed buffer to the new file.
	err = ioutil.WriteFile("good.jpeg", contents, 0644)
	if err != nil {
		panic(err)
	}
}
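Once the bytes decode, re-encoding them with jpeg.Encode is roughly the Go counterpart of the PIL open/save round trip from the question. A minimal sketch under that assumption (the file names are just examples, and it still fails if the file is truly truncated rather than mis-terminated):

package main

import (
	"image"
	"image/jpeg" // provides jpeg.Encode and registers JPEG for image.Decode
	"log"
	"os"
)

func main() {
	f, err := os.Open("bad.jpeg")
	if err != nil {
		log.Fatal(err)
	}
	defer f.Close()

	// This only succeeds once the bytes are decodable (e.g. after the EOI fix above).
	img, _, err := image.Decode(f)
	if err != nil {
		log.Fatal(err)
	}

	out, err := os.Create("good.jpeg")
	if err != nil {
		log.Fatal(err)
	}
	defer out.Close()

	if err := jpeg.Encode(out, img, &jpeg.Options{Quality: 100}); err != nil {
		log.Fatal(err)
	}
}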
I am trying to use the go-skeltrack library with some depth images I have (not using freenect). For that I need to modify the provided example by replacing the Kinect images with my own, which means I have to read an image and convert it to a []uint16 variable. The code I tried is:
file, err := os.Open("./images/4.png")
if err != nil {
	fmt.Println("4.png file not found!")
	os.Exit(1)
}
defer file.Close()

fileInfo, _ := file.Stat()
var size int64 = fileInfo.Size()
bytes := make([]byte, size)

// read file into bytes
buffer := bufio.NewReader(file)
_, err = buffer.Read(bytes)

integerImage := binary.BigEndian.Uint16(bytes)

onDepthFrame(integerImage)
Where onDepthFrame is a function which has the form
func onDepthFrame(depth []uint16).
But I am getting the following error while compiling:
./skeltrackOfflineImage.go:155: cannot use integerImage (type uint16) as type []uint16 in argument to onDepthFrame
Which of course refers to the fact that I generated a single integer instead of an array. I am quite confused about the way Go data type conversions work. Please help!
Thanks in advance for your help.
Luis
binary.BigEndian.Uint16 converts two bytes (in a slice) to a 16-bit value using big endian byte order. If you want to convert bytes to a slice of uint16, you should use binary.Read:
// This reads 10 uint16s from file.
slice := make([]uint16, 10)
err := binary.Read(file, binary.BigEndian, slice)
It sounds like you're looking to get raw pixels. If that's the case, I don't recommend reading the file as binary directly. It means you would need to parse the file format yourself since image files contain more information than just the raw pixel values. There are already tools in the image package to deal with that.
This code should get you on the right track. It reads RGBA values, so it ends up with a 1D array of uint8's of length width * height * 4, since there are four values per pixel.
https://play.golang.org/p/WUgHQ3pRla
import (
	"bufio"
	"fmt"
	"image"
	"os"

	// for decoding png files
	_ "image/png"
)

// RGBA attempts to load an image from file and return the raw RGBA pixel values.
func RGBA(path string) ([]uint8, error) {
	file, err := os.Open(path)
	if err != nil {
		return nil, err
	}
	defer file.Close()

	img, _, err := image.Decode(bufio.NewReader(file))
	if err != nil {
		return nil, err
	}

	switch trueim := img.(type) {
	case *image.RGBA:
		return trueim.Pix, nil
	case *image.NRGBA:
		return trueim.Pix, nil
	}
	return nil, fmt.Errorf("unhandled image format")
}
I'm not entirely sure where the uint16 values you need should come from, but presumably it's data per pixel, so the code should be very similar to this except the switch on trueim should likely check for something other than image.RGBA. Take a look at the other image types in https://golang.org/pkg/image
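If the values really are one uint16 per pixel (depth-style data), one possible route, assuming a 16-bit grayscale interpretation fits your PNGs, is to draw the decoded image into an image.Gray16, whose Pix field stores each pixel as two big-endian bytes, and unpack that. Gray16ToUint16 below is just an illustrative helper, not part of any library:

import (
	"encoding/binary"
	"image"
	"image/draw"
)

// Gray16ToUint16 flattens a decoded image into one uint16 per pixel by
// drawing it into an image.Gray16 first. Whether 16-bit grayscale is the
// right interpretation for your depth data is an assumption.
func Gray16ToUint16(img image.Image) []uint16 {
	bounds := img.Bounds()
	gray := image.NewGray16(bounds)
	draw.Draw(gray, bounds, img, bounds.Min, draw.Src)

	// Gray16.Pix holds two big-endian bytes per pixel.
	out := make([]uint16, len(gray.Pix)/2)
	for i := range out {
		out[i] = binary.BigEndian.Uint16(gray.Pix[2*i : 2*i+2])
	}
	return out
}

The resulting slice is in row-major order, so it could then be handed to onDepthFrame if that is the layout the library expects.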
I'm struggling to handle nested zip files in Go (where a zip file contains another zip file). I'm trying to recurse a zip file and list all of the files it contains.
archive/zip gives you two methods for handling a zip file:
zip.NewReader
zip.OpenReader
OpenReader opens a file on disk. NewReader accepts an io.ReaderAt and a file size. As you iterate through the zipped files with either of these, you get out a zip.File for each file inside the zip. To get the file contents of file f, you call f.Open which gives you a zip.ReadCloser. To open a nested zip file, I'd need to use NewReader, but zip.File and zip.ReadCloser do not satisfy the io.ReaderAt interface.
zip.File has a private field zipr which is an io.ReaderAt and zip.ReadCloser has a private field f which is an os.File which should satisfy the requirements for NewReader.
My question: is there any way to open a nested zip file without first writing the contents to a file on disk, or reading the whole thing into memory.
It looks like everything that is needed is available in zip.File, but isn't exported. I'm hoping I missed something.
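To make the setup concrete, the outer level looks roughly like this (just a sketch; "outer.zip" is a placeholder, and it assumes archive/zip, fmt, and log are imported):

r, err := zip.OpenReader("outer.zip")
if err != nil {
	log.Fatal(err)
}
defer r.Close()

for _, f := range r.File {
	fmt.Println(f.Name)

	rc, err := f.Open()
	if err != nil {
		log.Fatal(err)
	}
	// rc is only an io.ReadCloser; zip.NewReader wants an io.ReaderAt
	// plus a size, which is exactly where the nested case gets stuck.
	rc.Close()
}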
How about an io.ReaderAt built from an io.Reader that reinitializes if you decide to go backwards (this code is largely untested, but hopefully you get the idea):
package main

import (
	"io"
	"io/ioutil"
	"os"
	"strings"
)

type inefficientReaderAt struct {
	rdr    io.ReadCloser
	cur    int64
	initer func() (io.ReadCloser, error)
}

func newInefficentReaderAt(initer func() (io.ReadCloser, error)) *inefficientReaderAt {
	return &inefficientReaderAt{
		initer: initer,
	}
}

func (r *inefficientReaderAt) Read(p []byte) (n int, err error) {
	n, err = r.rdr.Read(p)
	r.cur += int64(n)
	return n, err
}

func (r *inefficientReaderAt) ReadAt(p []byte, off int64) (n int, err error) {
	// reset on rewind
	if off < r.cur || r.rdr == nil {
		r.cur = 0
		r.rdr, err = r.initer()
		if err != nil {
			return 0, err
		}
	}

	if off > r.cur {
		// skip forward, keeping track of how far we have consumed
		sz, err := io.CopyN(ioutil.Discard, r.rdr, off-r.cur)
		r.cur += sz
		n = int(sz)
		if err != nil {
			return n, err
		}
	}

	return r.Read(p)
}

func main() {
	r := newInefficentReaderAt(func() (io.ReadCloser, error) {
		return ioutil.NopCloser(strings.NewReader("ABCDEFG")), nil
	})
	io.Copy(os.Stdout, io.NewSectionReader(r, 0, 3))
	io.Copy(os.Stdout, io.NewSectionReader(r, 1, 3))
}
If you mostly move forwards, this probably works OK, especially if you use a buffered reader.
I should note that this violates the io.ReaderAt guarantees (https://godoc.org/io#ReaderAt): namely, it doesn't allow parallel calls to ReadAt and doesn't block on full reads, so this may not even work properly.
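As a usage sketch for the nested-zip case (untested, and subject to the same full-read caveat): listNestedZip is an invented helper name, inner stands for the *zip.File inside the outer archive that is itself a zip, and archive/zip and fmt would need to be imported as well. The wrapper is passed straight to zip.NewReader, since it satisfies io.ReaderAt:

func listNestedZip(inner *zip.File) error {
	size := inner.FileInfo().Size() // uncompressed size of the nested archive
	ra := newInefficentReaderAt(func() (io.ReadCloser, error) {
		return inner.Open()
	})

	zr, err := zip.NewReader(ra, size)
	if err != nil {
		return err
	}
	for _, f := range zr.File {
		fmt.Println(f.Name)
	}
	return nil
}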
I ran into the exact same need and came up with the following approach; not sure if it's any help to you:
// NewZipFromReader ...
func NewZipFromReader(file io.ReadCloser, size int64) (*zip.Reader, error) {
	in := file.(io.Reader)
	if _, ok := in.(io.ReaderAt); !ok {
		buffer, err := ioutil.ReadAll(in)
		if err != nil {
			return nil, err
		}
		in = bytes.NewReader(buffer)
		size = int64(len(buffer))
	}

	reader, err := zip.NewReader(in.(io.ReaderAt), size)
	if err != nil {
		return nil, err
	}

	return reader, nil
}
So if file doesn't implement io.ReaderAt, it reads the whole contents into a buffer.
It's probably not safe against ZIP bombs, and it will definitely fail with OOM for files larger than RAM.
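For illustration, recursion on top of NewZipFromReader could look like the sketch below; the helper name and the ".zip" suffix check are just assumptions, and archive/zip, fmt, and strings would need to be imported:

func listRecursive(r *zip.Reader, prefix string) error {
	for _, f := range r.File {
		fmt.Println(prefix + f.Name)

		if strings.HasSuffix(f.Name, ".zip") {
			rc, err := f.Open()
			if err != nil {
				return err
			}
			// NewZipFromReader buffers the whole entry, since rc is not an io.ReaderAt.
			inner, err := NewZipFromReader(rc, int64(f.UncompressedSize64))
			rc.Close()
			if err != nil {
				return err
			}
			if err := listRecursive(inner, prefix+f.Name+"/"); err != nil {
				return err
			}
		}
	}
	return nil
}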
How can I split a GIF into images in Go?
image/gif's DecodeAll returns a GIF, which contains a slice of paletted frames. But I don't know how to convert each of those frames into an image.
Consider the following:
Frames can contain transparent pixels or areas; a good example is this image on Wikipedia, which (I guess) has one of these full-color blocks per frame and leaves the rest of the frame transparent.
This introduces a problem for you: especially with animated GIFs that do not use multiple frames to build up a true-colored static image, the frames that DecodeAll returns are not what you actually see if you, for example, open the image in your browser.
You'll have to process the image the same way your browser would, i.e. leave the old frames on a kind of canvas and overpaint them with the new frame. BUT this is not always true: GIF frames can, AFAIK, contain a disposal method specifying how (or whether) you should dispose of the previous frame.
Anyway, to get to your point, the simplest approach, which will also work in most cases, is something like this:
import (
	"fmt"
	"image"
	"image/draw"
	"image/gif"
	"image/png"
	"io"
	"os"
)

// SplitAnimatedGIF reads and analyzes the given reader as a GIF image
func SplitAnimatedGIF(reader io.Reader) (err error) {
	defer func() {
		if r := recover(); r != nil {
			err = fmt.Errorf("Error while decoding: %s", r)
		}
	}()

	gif, err := gif.DecodeAll(reader)
	if err != nil {
		return err
	}

	imgWidth, imgHeight := getGifDimensions(gif)

	overpaintImage := image.NewRGBA(image.Rect(0, 0, imgWidth, imgHeight))
	draw.Draw(overpaintImage, overpaintImage.Bounds(), gif.Image[0], image.ZP, draw.Src)

	for i, srcImg := range gif.Image {
		draw.Draw(overpaintImage, overpaintImage.Bounds(), srcImg, image.ZP, draw.Over)

		// save current frame "stack". This will overwrite an existing file with that name
		file, err := os.Create(fmt.Sprintf("%s%d%s", "<some path>", i, ".png"))
		if err != nil {
			return err
		}

		err = png.Encode(file, overpaintImage)
		if err != nil {
			return err
		}

		file.Close()
	}

	return nil
}

func getGifDimensions(gif *gif.GIF) (x, y int) {
	var lowestX int
	var lowestY int
	var highestX int
	var highestY int

	for _, img := range gif.Image {
		if img.Rect.Min.X < lowestX {
			lowestX = img.Rect.Min.X
		}
		if img.Rect.Min.Y < lowestY {
			lowestY = img.Rect.Min.Y
		}
		if img.Rect.Max.X > highestX {
			highestX = img.Rect.Max.X
		}
		if img.Rect.Max.Y > highestY {
			highestY = img.Rect.Max.Y
		}
	}

	return highestX - lowestX, highestY - lowestY
}
(untested, but should work)
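Calling it is straightforward; for example (the file name is a placeholder, and "log" would need to be added to the imports):

f, err := os.Open("animation.gif")
if err != nil {
	log.Fatal(err)
}
defer f.Close()

if err := SplitAnimatedGIF(f); err != nil {
	log.Fatal(err)
}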
Note that gif.DecodeAll can and will panic frequently, because a lot of the GIF images on the internet are somewhat broken. Your browser tries to decode them and will, for example, replace missing colors with black. image/gif will not do that, but panic instead. That's why we defer the recover.
Also, I used getGifDimensions for a similar reason as stated above: single frames need not be what you see in your browser. In this case, the frames are just smaller than the complete image, which is why we have to iterate over all frames and get the "true" dimensions of the image.
If you really want to do it right, you should probably read the GIF specs (GIF87a, GIF89a) and something like this article, which is a lot easier to understand. From that, you can decide how to dispose of the frames and what to do with transparency while overpainting.
EDIT: Some of the effects mentioned earlier can be observed easily if you split some GIFs online, for example this or this - play around with "Ignore optimizations" and "Redraw every frame with details from previous frames" to see what I mean.
image.Image is an interface, and *image.Paletted implements the interface, so for example if you want to save every frame of a GIF into a PNG file, you can just encode every image:
for i, frame := range img.Image {
	frameFile, err := os.OpenFile(fmt.Sprintf("%d.png", i+1), os.O_CREATE|os.O_TRUNC|os.O_WRONLY, 0666)
	if err != nil {
		log.Fatal(err)
	}

	err = png.Encode(frameFile, frame)
	if err != nil {
		log.Fatal(err)
	}

	// Not using defer here because we're in a loop, not a function.
	frameFile.Close()
}
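For context, img in that loop is the *gif.GIF returned by gif.DecodeAll; a minimal sketch of the surrounding setup (the file name is a placeholder, with image/gif, log, and os imported):

f, err := os.Open("animation.gif")
if err != nil {
	log.Fatal(err)
}
defer f.Close()

img, err := gif.DecodeAll(f)
if err != nil {
	log.Fatal(err)
}
// img.Image is a []*image.Paletted; range over it as shown above.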
I am trying to parse a file that annoyingly consists of many separately zipped segments. I have parsed these segments one at a time into a slice of bytes, and I want to uncompress them as I go.
Here is my current code that does the decompressing, which doesn't work. from and to are just set at the top as an example; in reality they are set by the code. data is the byte slice containing the entire file. I don't want to seek it while it's on disk because it's located on another server, so it's only realistic for me to load the entire file into a []byte first and then parse it.
from, to := 0, 1000
b := bytes.NewReader(data[from : from+to])
z, err := zlib.NewReader(b)
CheckErr(err)
defer z.Close()

p := make([]byte, 0, 1024)
z.Read(p)
fmt.Println(string(p))
So how is it so massively difficult just to unzip a slice of bytes? Anyway...
The problem appears to be with how I am reading it out. Where it says z.Read, that doesn't seem to do anything.
How can I read the entire thing in one go into a slice of bytes?
Here's an outline for you. Note: In Go, CHECK FOR ERRORS!
package main

import (
	"bytes"
	"compress/zlib"
	"fmt"
	"io/ioutil"
)

func readSegment(data []byte, from, to int) ([]byte, error) {
	b := bytes.NewReader(data[from : from+to])
	z, err := zlib.NewReader(b)
	if err != nil {
		return nil, err
	}
	defer z.Close()

	p, err := ioutil.ReadAll(z)
	if err != nil {
		return nil, err
	}
	return p, nil
}

func main() {
	from, to := 0, 1000
	data := make([]byte, from+to)
	// ** parse input segments into data **

	p, err := readSegment(data, from, to)
	if err != nil {
		fmt.Println(err)
		return
	}
	fmt.Println(string(p))
}
Use ReadAll(r io.Reader) ([]byte, error) from the io/ioutil package.
p, err := ioutil.ReadAll(z)
fmt.Println(string(p))
Read only reads up to the length of the given slice, and in your case the slice has length 0 (with capacity 1024), so nothing is read at all.
To read in chunks of 1024 bytes:
p := make([]byte, 1024)
for {
	numBytes, err := z.Read(p)
	if err == io.EOF {
		// you are done, numBytes might be less than len(p)
		break
	}
	// (real code should also handle err != nil && err != io.EOF)
	// do what you want with p[:numBytes]
}
If you are getting the data from a webserver, you might even do
import (
	"compress/zlib"
	"io/ioutil"
	"net/http"
)

...

resp, errGet := http.Get("http://example.com/somefile")
// do error handling
defer resp.Body.Close()

z, errZ := zlib.NewReader(resp.Body)
// do error handling

p, err := ioutil.ReadAll(z)
// do error handling

since resp.Body happens to be an io.Reader, like most io-related types.