Why are request.URL.Host and Scheme blank in the development server? - go

I'm very new to Go. I tried the first hello-world example from the documentation and wanted to read the Host and Scheme from the request:
package hello

import (
	"fmt"
	"net/http"
)

func init() {
	http.HandleFunc("/", handler)
}

func handler(w http.ResponseWriter, r *http.Request) {
	fmt.Fprint(w, "Host: " + r.URL.Host + " Scheme: " + r.URL.Scheme)
}
But their values are both blank. Why?

Basically, since you're accessing the HTTP server directly rather than through an HTTP proxy, the browser issues a relative HTTP request, like so:
GET / HTTP/1.1
Host: localhost:8080
(Given that, of course, the server is listening on localhost port 8080).
Now, if you were accessing said server using a proxy, the proxy may use an absolute URL:
GET http://localhost:8080/ HTTP/1.1
Host: localhost:8080
In both cases, what you get in Go's http.Request.URL is the request URL as parsed by the library. In your case the browser sent a relative path, hence the empty Host and Scheme in the URL object.
If you do want to get the HTTP host, you may want to access the Host attribute of the http.Request struct. See http://golang.org/pkg/http/#Request
You can verify this with netcat and an appropriately formatted HTTP request: copy one of the blocks above into a file and make sure it ends with a trailing blank line. To try it out:
cat my-http-request-file | nc localhost 8080
Additionally, you can check in the server/handler whether the request carried a relative or an absolute URL by calling the IsAbs() method:
isAbsoluteURL := r.URL.IsAbs()

Related

pac file return a proxy without port number

Yesterday I received a PAC file that defines only a proxy but no port number, like this one:
function FindProxyForURL(url, host) {
	if (shExpMatch(host, "vpn.domain.com"))
		return "PROXY proxy.mydomain.com";
}
Then I used the
	ipconfig /displaydns
	netstat -n
commands to test this PAC file, and found that proxy.mydomain.com always uses port 80, for both HTTP and HTTPS.
My question is: why does it use port 80? Is that some kind of default?

Why does http.Get("http://[::]:1234") work?

I was writing a test where I wanted an HTTP server to listen on a random port and then connect to this port. I wrote:
mux := http.NewServeMux()
mux.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
	fmt.Println("foo")
})

listener, err := net.Listen("tcp", ":0")
fmt.Println(err)

httpServer := &http.Server{Handler: mux}
go httpServer.Serve(listener)

fmt.Println("http://" + listener.Addr().String())
r, err := http.Get("http://" + listener.Addr().String())
fmt.Println(r)
fmt.Println(err)
I meant to write net.Listen("tcp", "127.0.0.1:0") but accidentally wrote net.Listen("tcp", ":0").
For "http://" + listener.Addr().String() it prints:
http://[::]:1709
where, as far as I understand, "[::]" means "all interfaces".
To my surprise, the http.Get("http://[::]:1709") works, it connects to the webserver and "foo" is printed.
How is "http://[::]:1709" a valid address?
At least on a Linux system, this results in a connection being made to localhost.
The address :: is IN6ADDR_ANY, typically used when listening for connections to any IPv6 address on the system. It's analogous to INADDR_ANY, also known as 0.0.0.0, in IPv4.
Occasionally someone attempts to use one of these addresses as the destination address for an outgoing connection. When this happens:
When making an outgoing connection to 0.0.0.0, Linux actually connects from 127.0.0.1 to 127.0.0.1.
Similarly, when making an outgoing connection to ::, Linux actually connects from ::1 to ::1. Here is an example, taken from one of my websites (which happens to be an IP address lookup tool):
[error@murloc ~]$ curl -k -H "Host: myip.addr.space" https://[::]:8443/
::1
For completeness, here is the IPv4 version:
[error@murloc ~]$ curl -k -H "Host: myip.addr.space" https://0.0.0.0:8443/
127.0.0.1
Note that this is OS-specific. You would have received an error on Windows.

Why would a web server reply with 301 and the exact location that was requested?

I'm trying to retrieve pages from a web server via HTTPS, using Lua with luasec. For most pages my script works as intended, but if the resource contains special characters (like 'é'), I'm sent into a loop of 301 responses.
Let this code snippet illustrate my dilemma (actual server details redacted to protect the innocent):
local https = require "ssl.https"
local prefix = "https://www.example.com"
local suffix = "/S%C3%A9ance"
local body, code, headers, status = https.request(prefix .. suffix)
print(status .. " - GET was for \"" .. prefix .. suffix .. "\"")
print("headers are " .. myTostring(headers))
print("body is " .. myTostring(body))
if suffix == headers.location then
	print("equal")
else
	print("not equal")
end
local body, code, headers, status = https.request(prefix .. headers.location)
print(status .. " - GET was for \"" .. prefix .. suffix .. "\"")
which results in the paradoxical
HTTP/1.1 301 Moved Permanently - GET was for "https://www.example.com/S%C3%A9ance"
headers are { ["content-type"]="text/html; charset=UTF-8";["set-cookie"]="PHPSESSID=e80oo5dkouh8gh0ruit7mj28t6; path=/";["content-length"]="0";["connection"]="close";["date"]="Wed, 15 Mar 2017 19:31:24 GMT";["location"]="S%C3%A9ance";}
body is ""
equal
HTTP/1.1 301 Moved Permanently - GET was for "https://www.example.com/S%C3%A9ance"
How might one retrieve these elusive pages, using Lua and as few additional dependencies as possible?
Obvious as it may seem, perhaps the requested url does differ from the actual location.
If you have a similar problem, do check deep within your external libraries to make sure they do what you think they do.
In this case, luasocket url-decoded and then re-encoded the URL, so the final request sent over the wire was not what it seemed to be.

tutorialspoint's simple web browser using tcpsocket

This piece of code supposedly gets the content of any web page:
require 'socket'
host = 'www.tutorialspoint.com' # The web server
port = 80 # Default HTTP port
path = "/index.htm" # The file we want
# This is the HTTP request we send to fetch a file
request = "GET #{path} HTTP/1.0\r\n\r\n"
socket = TCPSocket.open(host,port) # Connect to server
socket.print(request) # Send request
response = socket.read # Read complete response
# Split response at first blank line into headers and body
headers,body = response.split("\r\n\r\n", 2)
puts headers
puts body
When I run it from the command line, I get a 404 error, but when I open www.tutorialspoint.com/index.htm in a browser, the page is there. So what gives?
I have no trouble getting the contents of a web page with the open-uri library, but I want to know how to do it this way.
Your request is missing the Host header:
host = 'www.tutorialspoint.com' # The web server
port = 80 # Default HTTP port
path = "/index.htm" # The file we want
# This is the HTTP request we send to fetch a file
request = "GET #{path} HTTP/1.0\r\nHost: #{host}\r\n\r\n"
Note that the Host header is optional in HTTP/1.0 but required in HTTP/1.1, and a server that does name-based virtual hosting (as this one apparently does) needs it to route the request to the right site.

getredirecturl not appending http

I am trying to get the target URL using
	httpSessionRequestCache.getRequest(request, response).getRedirectUrl();
but it only returns localhost:8080, and because of this the redirect does not happen.
I expect it to return
http://localhost:8080
Any suggestions?
