Multi-line buffered read in Go

I am trying to read a file in a buffered manner because I have very large files, and I want to apply some text replacement to each file. Suppose for each read I search for the word 'foo' and replace it with 'bar'. If I read using a buffer of, say, 5 MB, then 'foo' may be split across two reads: one read ends with 'fo' and the next begins with 'o', so I won't find the word. Is there a way to do a buffered read up to the last newline, or to read multiple lines into the buffer?
I tried the code below, but it does not read up to the next or previous line boundary:
file, err := os.Open(filename)
if err != nil {
    panic(err)
}
defer file.Close()
bufferSize := 5 * 1024 * 1024 // read 5 MB at a time
byteSlice := make([]byte, bufferSize)
bufioReader := bufio.NewReaderSize(file, bufferSize)
for {
    n, err := bufioReader.Read(byteSlice)
    if n > 0 {
        fmt.Println(byteSlice[:n])
    } else if err == io.EOF {
        break
    } else {
        panic(err)
    }
}

Since you're using a bufio reader, you shouldn't try to align the input with buffer boundaries yourself. Use one of the high-level read functions, such as `bufioReader.ReadString('\n')`, which reads a whole line using the underlying buffer, so you never have to deal with line boundaries yourself.
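For example, a minimal sketch of line-by-line replacement with ReadString (the file name and the foo/bar words are placeholders from the question):

package main

import (
    "bufio"
    "fmt"
    "io"
    "os"
    "strings"
)

func main() {
    file, err := os.Open("input.txt") // hypothetical file name
    if err != nil {
        panic(err)
    }
    defer file.Close()

    reader := bufio.NewReaderSize(file, 5*1024*1024)
    for {
        // ReadString returns everything up to and including the next '\n',
        // so a word can never be split across two calls.
        line, err := reader.ReadString('\n')
        if len(line) > 0 {
            fmt.Print(strings.ReplaceAll(line, "foo", "bar"))
        }
        if err == io.EOF {
            break
        }
        if err != nil {
            panic(err)
        }
    }
}

Note that ReadString accumulates until it sees the delimiter, so a single line longer than the internal buffer is still returned whole.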

You don't need a bufio reader if you have your own buffer; with your code you get a useless copy of data from bufio's internal buffer into the byte slice.
Regarding the split "foo" problem, the solution is to move the last two characters of the buffer to the front before the next read.
More precisely, if the word to replace has length m, copy the last m-1 bytes of the buffer to the front of the buffer, fill the remainder of the buffer from the file, and then search for the word in the filled portion.
// assume we want to find word
file, err := os.Open(filename)
if err != nil {
    panic(err)
}
defer file.Close()
trailingLen := len(word) - 1
data := make([]byte, 5*1024*1024+trailingLen) // 5 MB chunk plus room for the overlap
carried := 0                                  // bytes carried over from the previous read
for {
    n, err := file.Read(data[carried:])
    if n > 0 {
        end := carried + n
        // search for and replace word in data[:end],
        // then carry the last trailingLen bytes to the front so a word
        // split across two reads is still found on the next pass
        carried = trailingLen
        if end < carried {
            carried = end
        }
        copy(data, data[end-carried:end])
    }
    if err == io.EOF {
        break
    }
    if err != nil {
        panic(err)
    }
}

Related

How to read arbitrary amounts of data directly from a file in Go?

Without reading the contents of a file into memory, how can I read "x" bytes from the file so that I can specify what x is for every separate read operation?
I see that the Read method of various Readers takes a byte slice of a certain length and I can read from a file into that slice. But in that case the size of the slice is fixed, whereas what I would like to do, ideally, is something like:
func main() {
    f, err := os.Open("./file.txt")
    if err != nil {
        panic(err)
    }
    someBytes := f.Read(2)
    someMoreBytes := f.Read(4)
}
bytes.Buffer has a Next method which behaves very much like what I want, but it requires an existing buffer to work, whereas I'm hoping to read an arbitrary number of bytes from a file without reading the whole thing into memory.
What is the best way to accomplish this?
Thank you for your time.
Use this function:
// readN reads and returns n bytes from the reader.
// On error, readN returns the partial bytes read and
// a non-nil error.
func readN(r io.Reader, n int) ([]byte, error) {
    // Allocate buffer for result
    b := make([]byte, n)
    // ReadFull ensures buffer is filled or error is returned.
    n, err := io.ReadFull(r, b)
    return b[:n], err
}
Call like this:
someBytes, err := readN(f, 2)
if err != nil {
    // handle error here
}
someMoreBytes, err := readN(f, 4)
if err != nil {
    // handle error here
}
You can do something like this:
f, err := os.Open("/tmp/dat")
check(err) // check is a small helper that panics on a non-nil error
b1 := make([]byte, 5)
n1, err := f.Read(b1)
check(err)
fmt.Printf("%d bytes: %s\n", n1, string(b1[:n1]))
For more reading, see Go by Example's page on reading files (https://gobyexample.com/reading-files).

io.Reader and Line Break issue involving a CSV file

I have an application which deals with CSVs being delivered via RabbitMQ from many different upstream applications, typically 5,000-15,000 rows per file. Most of the time it works great. However, a couple of these upstream applications are old (12-15 years) and the people who wrote them are long gone.
I'm unable to read CSV files from these older applications due to the line breaks. I find this a bit weird, as the line breaks seem to map to UTF-8 carriage returns (http://www.fileformat.info/info/unicode/char/000d/index.htm). Typically the app reads in only the headers from those older files and nothing else.
If I open one of these files in a text editor and save it with UTF-8 encoding, overwriting the existing file, then it works with no issues at all.
Things I've tried that I expected to work:
-Using a Reader:
ba := make([]byte, 262144000)
if _, err := file.Read(ba); err != nil {
    return nil, err
}
ba = bytes.Trim(ba, "\x00")
bb := bytes.NewBuffer(ba)
reader := csv.NewReader(bb)
records, err := reader.ReadAll()
if err != nil {
    return nil, err
}
-Using a Scanner to read line by line (gets a "bufio.Scanner: token too long" error):
scanner := bufio.NewScanner(file)
var bb bytes.Buffer
for scanner.Scan() {
    bb.WriteString(fmt.Sprintf("%s\n", scanner.Text()))
}
// check for errors
if err = scanner.Err(); err != nil {
    return nil, err
}
reader := csv.NewReader(&bb)
records, err := reader.ReadAll()
if err != nil {
    return nil, err
}
Things I tried that I expected not to work (and didn't):
-Writing the file contents to a new .txt file and reading the file back in (including running dos2unix against the created txt file)
-Reading the file into a standard string (hoping Go's UTF-8 handling would magically kick in, which of course it doesn't)
-Reading the file into a rune slice, then transforming it into a string via a byte slice
I'm aware of the https://godoc.org/golang.org/x/text/transform package, but I'm not sure of a viable approach; it looks like the source encoding needs to be known for a transform.
Am I stupidly overlooking something? Are there any suggestions for transforming these files into UTF-8, or for updating the line endings without knowing the file encoding, whilst keeping the application working for all the other valid CSV files being delivered? Are there any options that don't involve going byte by byte and doing a bytes.Replace that I've not considered?
I'm hoping there's something really obvious I've overlooked.
Apologies - I can't share the CSV files for obvious reasons.
For anyone who's stumbled on this and wants an answer that doesn't involve strings.Replace, here's a method that wraps an io.Reader to replace solo carriage returns. It could probably be more efficient, but it works better with huge files than a strings.Replace-based solution.
https://gist.github.com/b5/78edaae9e6a4248ea06b45d089c277d6
// ReplaceSoloCarriageReturns wraps an io.Reader; on every call of Read it
// checks for instances of lonely \r, replacing them with \r\n before returning
// to the end customer. Lots of files in the wild come without "proper" line
// breaks, which irritates Go's standard csv package. This fixes that by
// wrapping the reader passed to csv.NewReader:
// rdr := csv.NewReader(ReplaceSoloCarriageReturns(r))
func ReplaceSoloCarriageReturns(data io.Reader) io.Reader {
    return &crlfReplaceReader{
        rdr: bufio.NewReader(data),
    }
}

// crlfReplaceReader wraps a reader
type crlfReplaceReader struct {
    rdr     *bufio.Reader
    pending bool // an inserted '\n' still needs to be emitted
}

// Read implements io.Reader for crlfReplaceReader
func (c *crlfReplaceReader) Read(p []byte) (n int, err error) {
    for n < len(p) {
        // emit a '\n' queued from a lonely '\r' at the end of the last call
        if c.pending {
            p[n] = '\n'
            c.pending = false
            n++
            continue
        }
        p[n], err = c.rdr.ReadByte()
        if err != nil {
            return
        }
        // any time we encounter \r, check to see if \n follows;
        // if the next char is not \n (or we hit EOF), queue a \n to insert
        if p[n] == '\r' {
            if pk, perr := c.rdr.Peek(1); (perr == nil && pk[0] != '\n') || perr == io.EOF {
                c.pending = true
            }
        }
        n++
    }
    return
}
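A quick usage sketch (the file name is hypothetical; imports of encoding/csv, fmt, log and os are assumed):

f, err := os.Open("legacy.csv") // hypothetical file name
if err != nil {
    log.Fatal(err)
}
defer f.Close()

records, err := csv.NewReader(ReplaceSoloCarriageReturns(f)).ReadAll()
if err != nil {
    log.Fatal(err)
}
fmt.Println("rows:", len(records))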
Have you tried replacing all line endings, \r\n or \r, with \n?
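For small enough files, a minimal sketch of that idea on an in-memory byte slice (normalizeNewlines is a hypothetical helper; for huge files the streaming wrapper above is the better fit):

// normalizeNewlines rewrites \r\n and lone \r line endings as \n.
func normalizeNewlines(d []byte) []byte {
    // replace \r\n first so the second pass only sees lonely \r
    d = bytes.ReplaceAll(d, []byte("\r\n"), []byte("\n"))
    d = bytes.ReplaceAll(d, []byte("\r"), []byte("\n"))
    return d
}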

Dynamic FlatBuffers delimiter

I'm using FlatBuffers to send binary data over a Unix socket. The FlatBuffers I send are of dynamic length. The problem I'm facing is how to know how many bytes I have to read for one table.
Is there something like a delimiter that can be appended while sending, which I can use to determine the end of the FlatBuffer?
When I try with a small fixed-size buffer:
buf := make([]byte, 512)
nr, err := c.Read(buf)
if err != nil {
    fmt.Println("exit echo")
    return
}
then reading a FlatBuffer bigger than 512 bytes fails.
When I read by growing my buffer, I'm not able to find the end of the read:
var n, nr int
var err error
buf := make([]byte, 0, 4096) // big buffer
tmp := make([]byte, 512)
for {
    n, err = c.Read(tmp)
    if err != nil {
        break
    }
    nr += n
    if nr >= 4096 {
        err = errOverrun
        break
    }
    buf = append(buf, tmp[:n]...)
}
if err != nil {
    fmt.Println("read error:", err)
    break
}
FlatBuffers does not include a length field by design, since in most contexts the length is an implicit part of the storage or transfer of a buffer.
If you have no way to know the size of a buffer, or you are streaming buffers, the best option is to simply prefix each buffer with a 32-bit length field, and use that to read the rest of the data.
In the C++ API this is even built in (see the SizePrefixed functions), but this hasn't been ported to Go yet, so you'd have to do it manually.
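A minimal sketch of such manual length-prefix framing (the writeFrame/readFrame names are hypothetical, and little-endian is an arbitrary choice; both ends just have to agree):

// writeFrame prefixes the payload with its length as a little-endian uint32.
func writeFrame(w io.Writer, payload []byte) error {
    var hdr [4]byte
    binary.LittleEndian.PutUint32(hdr[:], uint32(len(payload)))
    if _, err := w.Write(hdr[:]); err != nil {
        return err
    }
    _, err := w.Write(payload)
    return err
}

// readFrame reads one length-prefixed payload, e.g. a serialized FlatBuffer.
func readFrame(r io.Reader) ([]byte, error) {
    var hdr [4]byte
    if _, err := io.ReadFull(r, hdr[:]); err != nil {
        return nil, err
    }
    payload := make([]byte, binary.LittleEndian.Uint32(hdr[:]))
    _, err := io.ReadFull(r, payload)
    return payload, err
}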

How to read a text file line-by-line in Go when some lines are long enough to cause "bufio.Scanner: token too long" errors?

I have a text file where each line represents a JSON object. I am processing this file in Go with a simple for loop like this:
scanner := bufio.NewScanner(file)
for scanner.Scan() {
    jsonBytes := scanner.Bytes()
    var jsonObject interface{}
    err := json.Unmarshal(jsonBytes, &jsonObject)
    // do stuff with "jsonObject"...
}
if err := scanner.Err(); err != nil {
    log.Fatal(err)
}
When this code reaches a line with a particularly large JSON string (~67 KB), I get the error message "bufio.Scanner: token too long".
Is there an easy way to increase the maximum line size readable by NewScanner? Or is there another approach you can take altogether when you need to read lines that are too large for NewScanner but are known not to be of unsafe size generally?
You can also do:
scanner := bufio.NewScanner(file)
buf := make([]byte, 0, 64*1024)
scanner.Buffer(buf, 1024*1024)
for scanner.Scan() {
    // do your stuff
}
The second argument to scanner.Buffer() sets the maximum token size. In the above example you will be able to scan the file as long as none of the lines is larger than 1MB.
From the package docs:
Programs that need more control over error handling or large tokens, or must run sequential scans on a reader, should use bufio.Reader instead.
It looks like the preferred solution is bufio.Reader.ReadLine.
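A minimal sketch of that approach, assembling the fragments ReadLine returns when a line exceeds the buffer (readLongLine is a hypothetical helper name):

// readLongLine assembles a complete line from the fragments that
// bufio.Reader.ReadLine returns for lines longer than the buffer.
func readLongLine(r *bufio.Reader) ([]byte, error) {
    var line []byte
    for {
        fragment, isPrefix, err := r.ReadLine()
        if err != nil {
            return line, err
        }
        // fragment is only valid until the next read, so append a copy
        line = append(line, fragment...)
        if !isPrefix {
            return line, nil
        }
    }
}

Used in place of the scanner loop:

reader := bufio.NewReader(file)
for {
    jsonBytes, err := readLongLine(reader)
    if err == io.EOF {
        break
    }
    if err != nil {
        log.Fatal(err)
    }
    var jsonObject interface{}
    if err := json.Unmarshal(jsonBytes, &jsonObject); err != nil {
        log.Fatal(err)
    }
    // do stuff with "jsonObject"...
}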
You surely don't want to be reading line-by-line in the first place. Why don't you just do this:
d := json.NewDecoder(file)
for {
    var ob whateverType
    err := d.Decode(&ob)
    if err == io.EOF {
        break
    }
    if err != nil {
        log.Fatalf("Error decoding: %v", err)
    }
    // do stuff with "ob"...
}

How to find EOF while reading from a file

I am using the following code to read a file in Go:
spoon, err := ioutil.ReadFile(os.Args[1])
if err != nil {
    panic("File reading error")
}
Now I check each byte to see what character it is. For example:
spoon[i] == ' ' // checking for a space
Likewise I read the whole file (I know there may be other ways of reading it), but keeping this approach intact, how can I know that I have reached the EOF of the file and should stop reading further?
Please don't suggest finding the length of spoon and looping over it. I want a sure-fire way of detecting EOF.
Use io.EOF to test for end-of-file. For example, to count spaces in a file:
package main

import (
    "fmt"
    "io"
    "os"
)

func main() {
    if len(os.Args) <= 1 {
        fmt.Println("Missing file name argument")
        return
    }
    f, err := os.Open(os.Args[1])
    if err != nil {
        fmt.Println(err)
        return
    }
    defer f.Close()
    data := make([]byte, 100)
    spaces := 0
    for {
        data = data[:cap(data)]
        n, err := f.Read(data)
        if err != nil {
            if err == io.EOF {
                break
            }
            fmt.Println(err)
            return
        }
        data = data[:n]
        for _, b := range data {
            if b == ' ' {
                spaces++
            }
        }
    }
    fmt.Println(spaces)
}
ioutil.ReadFile() reads the entire contents of the file into a byte slice. You don't need to be concerned with EOF. EOF is a construct that is needed when you read a file one chunk at a time. You need to know which chunk has reached the end of the file when you're reading one chunk at a time.
The length of the byte slice returned by ioutil.ReadFile() is all you need.
data, err := ioutil.ReadFile(os.Args[1])
if err != nil {
    panic(err)
}
// Do we need to know the data size?
sliceSize := len(data)
// Do we need to look at each byte?
for _, b := range data {
    // do something with each byte
}
This is what you need to look for to find the end of file (EOF):
if err != nil {
    if errors.Is(err, io.EOF) { // preferred way per the Go documentation
        fmt.Println("Reading file finished...")
    }
    break
}
When you use ioutil.ReadFile(), you don't ever see io.EOF, by design, because ReadFile will read the whole file until EOF is reached. So the slice it returns is the whole file. From the doc:
ReadFile reads the file named by filename and returns the contents. A successful call returns err == nil, not err == EOF. Because ReadFile reads the whole file, it does not treat an EOF from Read as an error to be reported.
From your question, you explicitly mention that you are aware there are other ways to read the file, and some of those ways require you to test the error for io.EOF, but not ReadFile.
Then, with the slice you have, you can read the file using the for...range construct, as others have mentioned. This is a sure way to read the whole file and nothing more (again, ReadFile takes care of that). Or iterating from 0 to len(spoon) - 1 would work too, but range is more idiomatic and basically does the same.
In other words: when you reach the end of the slice, you reach the end of the file (provided ReadFile did not return an error).
A slice has no concept of end of file. The slice returned by ioutil.ReadFile has a specific length, which reflects the size of the file it was read from. A common idiom, though only one of several possible here, is to range over the slice, effectively "consuming" all of the bytes originally sitting in the file:
for i, b := range spoon {
    // At index 'i' is byte 'b'
    // At file offset 'i', 'b' was read
    // ... do something useful here
}
