Creating a byte array of an HTTP request and then trying to read it back into an http.Request doesn't seem to work when the request includes a body.
req, _ := http.NewRequest(http.MethodPost, "/Bar", strings.NewReader("Foo"))
rReq, _ := httputil.DumpRequest(req, true)
req2, _ := http.ReadRequest(bufio.NewReader(bytes.NewReader(rReq)))
b, _ := ioutil.ReadAll(req2.Body)
fmt.Println(b)
b is an empty slice.
Two things are wrong in your code:
You must handle the errors. This would have helped you see that you never construct a valid outgoing request ("/Bar" is not a valid absolute URL; it has no scheme or host).
Use httputil.DumpRequestOut for an outgoing request; httputil.DumpRequest is meant for incoming (server-side) requests.
Takeaways: always handle all errors, and always read the whole package doc.
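For illustration, a minimal sketch with both fixes applied; the absolute URL http://example.com/Bar is just a placeholder:

package main

import (
    "bufio"
    "bytes"
    "fmt"
    "io/ioutil"
    "log"
    "net/http"
    "net/http/httputil"
    "strings"
)

func main() {
    req, err := http.NewRequest(http.MethodPost, "http://example.com/Bar", strings.NewReader("Foo"))
    if err != nil {
        log.Fatal(err)
    }

    // DumpRequestOut writes the request as a client would send it on the wire,
    // including the Content-Length header, so the body survives the round trip.
    dump, err := httputil.DumpRequestOut(req, true)
    if err != nil {
        log.Fatal(err)
    }

    req2, err := http.ReadRequest(bufio.NewReader(bytes.NewReader(dump)))
    if err != nil {
        log.Fatal(err)
    }

    body, err := ioutil.ReadAll(req2.Body)
    if err != nil {
        log.Fatal(err)
    }
    fmt.Println(string(body)) // Foo
}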
Does anyone know how to fix this error?
I use Go to insert data into Elasticsearch, but it seems that no data is inserted because of this error.
{"error":"Content-Type header [] is not supported","status":406}
I already set the content type. Note that I use Elasticsearch 6.4.3.
request, err := http.NewRequest("POST", urlSearch, bytes.NewBuffer(query))
request.Close = true
request.Header.Set("Content-Type", "application/json")
Last but not least, I use the elastigo package to make requests to Elasticsearch.
That's a strange response, as it suggests that this line:
request.Header.Set("Content-Type", "application/json")
failed to add the value to that key's value slice. In modern Go that does not happen, e.g.:
data := []byte(`{"a":1}`)
req, err := http.NewRequest("POST", "", bytes.NewBuffer(data))
if err != nil {
    fmt.Println(err)
    return
}
req.Header.Set("Foo", "Bar")
fmt.Printf("%v\n", req.Header)
Prints
map[Foo:[Bar]]
See go playground.
Are you using an older version of Go that doesn't match that behavior? (I'm on 1.11.2 locally.)
Five suggestions:
(1) Handle the err return value from NewRequest to verify there's no problem there (see example above).
(2) Print the request Header value before sending to verify it looks right at that point (see example above).
(3) Try the Add method for the Content-Type header instead of Set as an alternative (a combined sketch of suggestions (1)-(3) follows this list):
func (h Header) Add(key, value string)
(4) Verify that you're not going through a proxy that strips header values.
(5) Verify that "application/json" is an acceptable content type for the endpoint you're hitting, as the empty value in the error response could be erroneous itself.
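For illustration, a minimal sketch combining suggestions (1)-(3); the endpoint URL, index name, and query body are placeholders rather than values from your setup:

package main

import (
    "bytes"
    "fmt"
    "net/http"
)

func main() {
    data := []byte(`{"query":{"match_all":{}}}`) // placeholder body
    // Placeholder URL; substitute your real Elasticsearch endpoint.
    req, err := http.NewRequest("POST", "http://localhost:9200/myindex/_search", bytes.NewBuffer(data))
    if err != nil {
        fmt.Println(err) // (1) handle the error from NewRequest
        return
    }
    req.Header.Add("Content-Type", "application/json") // (3) Add instead of Set
    fmt.Printf("%v\n", req.Header)                     // (2) verify the header before sending

    resp, err := http.DefaultClient.Do(req)
    if err != nil {
        fmt.Println(err)
        return
    }
    defer resp.Body.Close()
    fmt.Println(resp.Status)
}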
Good luck!
I'm writing a webhook in Go that parses a JSON payload. I'm attempting to log the raw payload and then decode it immediately after but it fails when I try. If I perform the actions separately, they both work fine independently.
Can someone explain why I can't use ioutil.ReadAll and json.NewDecoder together?
func webhook(w http.ResponseWriter, r *http.Request) {
    body, _ := ioutil.ReadAll(r.Body)
    log.Printf("incoming message - %s", body)
    var p payload
    decoder := json.NewDecoder(r.Body)
    err := decoder.Decode(&p)
    if err != nil {
        // Returns EOF
        log.Printf("invalid payload - %s", err)
    }
    defer r.Body.Close()
}
Can someone explain why I can't use ioutil.ReadAll and json.NewDecoder
together?
The request body is an io.ReadCloser that reads bytes, more or less, directly from a network connection. The contents of the Body aren't stored in memory by default. That's why, after the first time you've read the Body, the next time you try to read it you'll get EOF.
So if you need to process the request Body more than once, you yourself will have to store the contents into memory, which is what you are already doing with:
body, _ := ioutil.ReadAll(r.Body)
You can then reuse body as many times as you like, and since you have the Body contents at your disposal as a []byte value, you can use json.Unmarshal instead of json.NewDecoder(...).Decode.
This is unrelated to your question, but please do not ignore the error returned from ioutil.ReadAll.
Also you can drop the defer r.Body.Close() line, because you do not have to close the request body in your server handlers. (emphasis mine)
For server requests the Request Body is always non-nil but will return
EOF immediately when no body is present. The Server will close the
request body. The ServeHTTP Handler does not need to.
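Putting those points together, here is one possible sketch of the handler, reusing the payload type and imports from your question:

func webhook(w http.ResponseWriter, r *http.Request) {
    body, err := ioutil.ReadAll(r.Body)
    if err != nil {
        log.Printf("cannot read body - %s", err)
        http.Error(w, "cannot read body", http.StatusBadRequest)
        return
    }
    log.Printf("incoming message - %s", body)

    var p payload
    if err := json.Unmarshal(body, &p); err != nil {
        log.Printf("invalid payload - %s", err)
        http.Error(w, "invalid payload", http.StatusBadRequest)
        return
    }
    // ... use p ...
}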
r.Body is meant to be read exactly once.
When you use the ioutil.ReadAll function you do read all the data from the body. That's why the decoder, which also reads from r.Body, in fact gets nothing to decode.
Minor additional point about json.Decoder and json.Unmarshal: at first glance it looks like the only difference between the two is just that the former operates on a stream and the latter on a []byte, but they actually have different semantics.
json.Unmarshal will return an error if the data contains more than one json object. So, for example, it will parse {}, but it will not parse {}{}.
json.Decoder parses one complete object per call to Decode, so if you give it {}{}, it will parse those two objects and then the third call will return io.EOF and its More method will return false.
In a normal HTTP body, you probably only want a single object, so you'd want to use Unmarshal if you're not worried about loading all the data into memory at once. You can also use Decoder and manually check that there is only one object if you care to do so.
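A small runnable sketch of that difference:

package main

import (
    "encoding/json"
    "fmt"
    "strings"
)

func main() {
    var v interface{}

    // json.Decoder: one object per Decode call, then io.EOF.
    dec := json.NewDecoder(strings.NewReader(`{}{}`))
    fmt.Println(dec.Decode(&v)) // <nil>
    fmt.Println(dec.Decode(&v)) // <nil>
    fmt.Println(dec.Decode(&v)) // EOF

    // json.Unmarshal: rejects trailing data after the first object.
    fmt.Println(json.Unmarshal([]byte(`{}{}`), &v)) // invalid character '{' after top-level value
}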
I have an HTTP request whose body I need to inspect. But when I do, the request fails. I'm assuming this has to do with the Reader needing to be reset, but googling along the lines of go ioutil reset ReadCloser hasn't turned up anything promising.
c is a *middleware.Context,
c.Req.Request is an http.Request, and
c.Req.Request.Body is an io.ReadCloser
contents, _ := ioutil.ReadAll(c.Req.Request.Body)
log.Info("Request: %s", string(contents))
proxy.ServeHTTP(c.RW(), c.Req.Request)
Specifically the error I get is http: proxy error: http: ContentLength=133 with Body length 0
You can't reset it, because you've already read from it and there's nothing left in the stream.
What you can do is take the buffered bytes you already have and replace the Body with a new io.ReadCloser:
contents, _ := ioutil.ReadAll(c.Req.Request.Body)
log.Info("Request: %s", string(contents))
c.Req.Request.Body = ioutil.NopCloser(bytes.NewReader(contents))
proxy.ServeHTTP(c.RW(), c.Req.Request)
I'm reading a big file and sending it by HTTP POST.
I use bufio.
Now I want to modify one of the first lines of this file. How can I do that?
f := bufio.NewReaderSize(os.Stdin, 65536)
bufPart, err := f.Peek(65536)
//how to modify bufPart(f) ?
...
req, err := http.NewRequest("POST", url, f)
Two ideas for how to do it:
Create your own Reader implementation that wraps a bufio.Reader and implements the replacing logic (you will have to count the number of bytes read).
Call io.Pipe, pass the returned PipeReader to NewRequest, and start a separate goroutine that reads data from the file, modifies it, and writes it to the returned PipeWriter (see the sketch below).
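A rough sketch of the second idea, assuming only the first line needs changing; modifyFirstLine and the target URL are placeholders, not part of your code:

package main

import (
    "bufio"
    "io"
    "log"
    "net/http"
    "os"
    "strings"
)

// modifyFirstLine is a hypothetical placeholder for whatever change
// you need to make to the first line.
func modifyFirstLine(line string) string {
    return strings.ToUpper(line)
}

func main() {
    pr, pw := io.Pipe()

    go func() {
        defer pw.Close()

        br := bufio.NewReaderSize(os.Stdin, 65536)

        // Read the first line, rewrite it, then stream the rest unchanged.
        first, err := br.ReadString('\n')
        if err != nil && err != io.EOF {
            pw.CloseWithError(err)
            return
        }
        if _, err := io.WriteString(pw, modifyFirstLine(first)); err != nil {
            pw.CloseWithError(err)
            return
        }
        if _, err := io.Copy(pw, br); err != nil {
            pw.CloseWithError(err)
            return
        }
    }()

    // Placeholder URL; the body is streamed from the pipe as the request is sent.
    req, err := http.NewRequest("POST", "http://example.com/upload", pr)
    if err != nil {
        log.Fatal(err)
    }
    resp, err := http.DefaultClient.Do(req)
    if err != nil {
        log.Fatal(err)
    }
    resp.Body.Close()
}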
Hope this makes sense.
I wrote a cookie getter and setter. Now I want to test it, and I wrote the following test function.
func TestAuthorizationReader(t *testing.T) {
    tw := httptest.NewServer(testWriter())
    tr := httptest.NewServer(Use(testReader()))
    defer tw.Close()
    defer tr.Close()
    c := &http.Client{}
    rs, err := c.Get(tw.URL)
    assert.NoError(t, err, "Should not contain any error")
    // Assign cookie to client
    url, err := rs.Location()
    fmt.Print(url)
    assert.NoError(t, err, "Should not contain any error")
    //c.Jar.SetCookies(url, rs.Cookies())
}
The test fails at the second part; the output message I get is:
--- FAIL: TestAuthorizationReader (0.05s)
Location: logged_test.go:64
Error: No error is expected but got http: no Location header in response
Messages: Should not contain any error
I cannot get the URL location pointer. What am I doing wrong here?
The Response.Location method returns the value of the Location response header. You would usually only expect to see this header for redirect responses, so it isn't surprising you got this error.
If you want to know the URL used to retrieve a particular response, try rs.Request.URL.String(). Even if the HTTP library followed a redirect to retrieve the document, this will look at the request used for this particular response, which is what you'd be after when determining a cookie's origin.
If you just want the client to keep track of cookies set by requests it processes though, all you should need to do is set the Jar attribute on your client. Something like this:
import "net/http/cookiejar"
...
c := &http.Client{
Jar: cookiejar.New(nil),
}
Now cookies set in earlier responses should be set in future requests to the same origin.
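For instance, a sketch of the client setup inside your test, reusing the tw server and the assert helper from your question:

jar, err := cookiejar.New(nil)
assert.NoError(t, err, "Should not contain any error")
c := &http.Client{Jar: jar}

rs, err := c.Get(tw.URL)
assert.NoError(t, err, "Should not contain any error")

// Cookies set by tw's response are now stored in the jar and will be sent
// automatically on later requests to the same origin.
fmt.Println(jar.Cookies(rs.Request.URL))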