How to open a website in HTTP form in Mac OS? - macos

I am trying to capture packets in Wireshark for a website. But since Safari opens only the HTTPS version, Wireshark is not capturing it as HTTP - the traffic shows up as TLS, but I want to see plain HTTP. How can I do this?

Many websites, including nytimes.com, redirect from HTTP to HTTPS. You can see this with the Terminal command curl, using its -i (show header info) option:
$ curl -i http://nytimes.com
HTTP/1.1 301 Moved Permanently
Server: Varnish
Retry-After: 0
Content-Length: 0
Location: https://www.nytimes.com/
Accept-Ranges: bytes
Date: Wed, 01 Jan 2020 20:27:39 GMT
X-Served-By: cache-pao17428-PAO
X-Cache: HIT
X-Cache-Hits: 0
X-Frame-Options: DENY
Connection: close
X-API-Version: F-0
The 301 status and Location: header tell the client, essentially, "this isn't the URL you want; go load 'https://www.nytimes.com/' instead". Safari (and other browsers) follow this redirect automatically. If you load "http://nytimes.com" in Safari, you'll see that it's both switched to HTTPS and added "www." to the domain name, because that's what the redirect told it to do.
Also, note that the Content-Length: header is 0, and there's nothing but a blank line (that you can't see above) after the header that curl printed. That means there's no actual content at the http:// URL. The server doesn't even bother to serve the page content over HTTP, only over HTTPS.
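If you'd rather see that 301 programmatically than with curl, here is a minimal sketch in Go (my own illustration, not part of the original answer) that makes the same request but refuses to follow the redirect, so you can print the status, Location header, and content length yourself:

package main

import (
	"fmt"
	"log"
	"net/http"
)

func main() {
	// http.ErrUseLastResponse tells the client to stop at the first
	// response instead of following the redirect.
	client := &http.Client{
		CheckRedirect: func(req *http.Request, via []*http.Request) error {
			return http.ErrUseLastResponse
		},
	}

	resp, err := client.Get("http://nytimes.com")
	if err != nil {
		log.Fatal(err)
	}
	defer resp.Body.Close()

	fmt.Println(resp.Status)                 // e.g. 301 Moved Permanently
	fmt.Println(resp.Header.Get("Location")) // e.g. https://www.nytimes.com/
	fmt.Println(resp.ContentLength)          // 0: no content served over HTTP
}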
Some servers/domains go even further to require their clients to use HTTPS. Some are configured to serve an HTTP Strict Transport Security (HSTS) header, which tells the browser to never load anything from that domain over HTTP, but always to auto-switch to HTTPS instead. They can also register for HSTS preload, which tells browser developers to include an HSTS policy for the domain without needing to hit the server to get it. nytimes.com doesn't do this, but you can use hstspreload.com to check other domains. Here's a check on google.com:
$ curl https://hstspreload.com/api/v1/status/google.com
{"domain":"google.com","chrome":{"present":true,"include_subdomains":true,"last_updated":1577844002},"firefox":null,"tor":null}
...which says it's included in Chrome's preload list, but not Firefox's or Tor's. As I understand it, Safari uses Chrome's list, so google.com should always be auto-switched to HTTPS in Safari.
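If you want to check domains from code rather than with curl, here is a small Go sketch (my own illustration; the struct fields simply mirror the JSON shown above) that queries the same hstspreload.com endpoint and reports whether a domain is on Chrome's preload list:

package main

import (
	"encoding/json"
	"fmt"
	"log"
	"net/http"
)

// preloadStatus mirrors the fields of the JSON response shown above.
type preloadStatus struct {
	Domain string `json:"domain"`
	Chrome *struct {
		Present           bool `json:"present"`
		IncludeSubdomains bool `json:"include_subdomains"`
	} `json:"chrome"`
}

func main() {
	resp, err := http.Get("https://hstspreload.com/api/v1/status/google.com")
	if err != nil {
		log.Fatal(err)
	}
	defer resp.Body.Close()

	var s preloadStatus
	if err := json.NewDecoder(resp.Body).Decode(&s); err != nil {
		log.Fatal(err)
	}
	// "chrome" is null in the JSON when the domain isn't preloaded there.
	fmt.Printf("%s preloaded in Chrome: %v\n", s.Domain, s.Chrome != nil && s.Chrome.Present)
}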

Your issue is not related to the browser. Many websites automatically redirect HTTP traffic to HTTPS to ensure a secure connection.
If you apply the following display filter:
http.response.code == 301
you will notice that the web server is redirecting to the secure HTTPS endpoint.
[Screenshot: Wireshark capture for http://nytimes.com]
If you want to experiment with a website that will not auto-redirect you to HTTPS, I suggest using
http://example.com

Follow these steps:
Go to Safari > Preferences.
Click the Advanced tab.
Check the "Show full website address" box.
Close the Preferences window.
Now Safari displays the full URL, so you can see whether a page has been switched to HTTPS.

Related

Prevent Open URL Redirect from gorilla/mux

I am working on a RESTful web application using Go + the gorilla/mux v1.4 framework. Some basic security testing after a release revealed an Open URL Redirection vulnerability in the app that allows a user to submit a specially crafted request with an external URL, which causes the server to respond with a 301 redirect.
I tested this using Burp Suite and found that any such request with an external URL gets a 301 Moved Permanently response from the app. I've been looking at all possible ways to intercept these requests before the 301 is sent, but this behavior seems to be baked into the net/http server implementation.
Here is the raw request sent to the server (myapp.mycompany.com:8000):
GET http://evilwebsite.com HTTP/1.1
Accept: */*
Cache-Control: no-cache
Host: myapp.mycompany.com:8000
Content-Length: 0
And the response every time is:
HTTP/1.1 301 Moved Permanently
Location: http://evilwebsite.com/
Date: Fri, 13 Mar 2020 08:55:24 GMT
Content-Length: 0
Despite adding checks on request.URL in the http.Handler to prevent this type of redirect, I haven't had any luck getting the request to reach the handler. It appears that the base HTTP web server performs the redirect without ever letting the request reach my custom handler code registered via PathPrefix("/").Handler.
My goal is to ensure the application returns a 404 Not Found or 400 Bad Request for such requests. Has anybody else faced this scenario with gorilla/mux? I tried the same with a Jetty web app and found it returned a perfectly valid 404. I've been at this for a couple of days now and could really use some ideas.
This is not the claimed Open URL Redirect security issue. The request is invalid in that the path contains an absolute URL with a different domain than the Host header. No sane client (i.e. a browser) can be lured into issuing such an invalid request in the first place, and thus there is no actual attack vector.
Sure, a custom client could be created to submit such a request. But a custom client could also be made to interpret the server's response in a non-standard way, or to visit a malicious URL directly without even contacting your server. This means that in this case the client itself would be the problem, not the server's response.
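That said, if you still want the server to answer these proxy-style requests with an error instead of the router's path-cleaning 301, one option (a rough sketch, under the assumption that the redirect comes from the router's path cleaning rather than from net/http itself) is to wrap the gorilla/mux router in a plain net/http middleware that rejects any request whose request line carries an absolute URL:

package main

import (
	"log"
	"net/http"

	"github.com/gorilla/mux"
)

// rejectAbsoluteURLs returns 400 Bad Request for proxy-style requests such as
// "GET http://evilwebsite.com HTTP/1.1" before the router ever sees them.
func rejectAbsoluteURLs(next http.Handler) http.Handler {
	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		if r.URL.IsAbs() {
			http.Error(w, "Bad Request", http.StatusBadRequest)
			return
		}
		next.ServeHTTP(w, r)
	})
}

func main() {
	r := mux.NewRouter()
	r.PathPrefix("/").HandlerFunc(func(w http.ResponseWriter, req *http.Request) {
		w.Write([]byte("ok"))
	})
	// The middleware wraps the router, so it runs first.
	log.Fatal(http.ListenAndServe(":8000", rejectAbsoluteURLs(r)))
}

gorilla/mux also has a Router.SkipClean(true) option that disables the automatic path cleaning producing this kind of 301, which may be worth a look if you prefer to handle such paths in your own handlers.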

Mixed content errors with https?

This one has me perplexed...
On my website, I am getting mixed content errors in my console, yet when I inspect the source, the URLs it says are HTTP show up as HTTPS.
In fact, a search for anything with http:// returns nothing.
Inspection shows:
<img src="https://images.immoafrica.net/aHR0cHM6Ly9yZXZvbHV0aW9uY3JtLXJldm9sdXRpb24tcHJvcGltYWdlcy5zMy5hbWF6b25hd3MuY29tLzU2LzE3MTk4OC8xMjcxOTk0X2xhcmdlLmpwZw==/fb5c609f3c1506a8798dfa620ccf8a15?1=1&width=420&height=310&mode=crop&scale=both&404=default" data-lazy="https://images.immoafrica.net/aHR0cHM6Ly9yZXZvbHV0aW9uY3JtLXJldm9sdXRpb24tcHJvcGltYWdlcy5zMy5hbWF6b25hd3MuY29tLzU2LzE3MTk4OC8xMjcxOTk0X2xhcmdlLmpwZw==/fb5c609f3c1506a8798dfa620ccf8a15?1=1&width=420&height=310&mode=crop&scale=both&404=default" alt="2 Bedroom Apartment for Sale in Strand North" title="2 Bedroom Apartment for Sale in Strand North" class="lazy loading-F5F5F5">
Yet I get this error:
Mixed Content: The page at 'https://www.immoafrica.net/residential/for-sale/south-africa/?advanced-search=1&st=' was loaded over HTTPS, but requested an insecure image 'http://images.immoafrica.net/aHR0cHM6Ly9yZXZvbHV0aW9uY3JtLXJldm9sdXRpb24tcHJvcGltYWdlcy5zMy5hbWF6b25hd3MuY29tLzU2LzE3MTk4OC8xMjcxOTk0X2xhcmdlLmpwZw==/fb5c609f3c1506a8798dfa620ccf8a15?1=1&width=420&height=310&mode=crop&scale=both&404=default'. This content should also be served over HTTPS.
The page is requesting the following https URL:
https://images.immoafrica.net/aHR0cHM6Ly9yZXZvbHV0aW9uY3JtLXJldm9sdXRpb24tcHJvcGltYWdlcy5zMy5hbWF6b25hd3MuY29tLzU2LzE3MTk4OC8xMjcxOTk0X2xhcmdlLmpwZw==/fb5c609f3c1506a8798dfa620ccf8a15?1=1&width=420&height=310&mode=crop&scale=both&404=default
…but the server is redirecting that https URL to the following http URL:
http://images.immoafrica.net/aHR0cHM6Ly9yZXZvbHV0aW9uY3JtLXJldm9sdXRpb24tcHJvcGltYWdlcy5zMy5hbWF6b25hd3MuY29tLzU2LzE3MTk4OC8xMjcxOTk0X2xhcmdlLmpwZw==/fb5c609f3c1506a8798dfa620ccf8a15?1=1&width=420&height=310&mode=crop&scale=both&404=default
Paste that https URL into your browser address bar and you’ll see you end up at the http URL.
Or try it from the command line with something like curl:
$ curl -i 'https://images.immoafrica.net/aHR0cHM6Ly9yZXZvbHV0aW9uY3JtLXJldm9sdXRpb24tcHJvcGltYWdlcy5zMy5hbWF6b25hd3MuY29tLzU2LzE3MTk4OC8xMjcxOTk0X2xhcmdlLmpwZw==/fb5c609f3c1506a8798dfa620ccf8a15?1=1&width=420&height=310&mode=crop&scale=both&404=default'
HTTP/2 301
date: Sat, 06 Jan 2018 01:56:57 GMT
cache-control: max-age=3600
expires: Sat, 06 Jan 2018 02:56:57 GMT
location: http://images.immoafrica.net/aHR0cHM6Ly9yZXZvbHV0aW9uY3JtLXJldm9sdXRpb24tcHJvcGltYWdlcy5zMy5hbWF6b25hd3MuY29tLzU2LzE3MTk4OC8xMjcxOTk0X2xhcmdlLmpwZw==/fb5c609f3c1506a8798dfa620ccf8a15?1=1&width=420&height=310&mode=crop&scale=both&404=default
server: cloudflare
cf-ray: 3d8b1051cfbf84fc-HKG
…and notice the server sends back a 301 response and a Location header with the http URL.
So the problem seems to be that the images.immoafrica.net site isn't served over HTTPS/TLS and instead redirects all requests for https URLs to their http equivalents.
There's nothing you can do on your end to fix that, other than creating or using some kind of HTTPS proxy through which you make the requests for images.immoafrica.net URLs.
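As a rough illustration of that proxy idea (my own sketch, with placeholder certificate paths, not anything the site itself provides), you could stand up a small HTTPS reverse proxy in Go that fetches the images from the upstream over plain HTTP and serves them to your page over HTTPS; your pages would then reference the proxy's https URLs instead of images.immoafrica.net directly:

package main

import (
	"log"
	"net/http"
	"net/http/httputil"
	"net/url"
)

func main() {
	// Upstream that only answers over plain HTTP.
	target, err := url.Parse("http://images.immoafrica.net")
	if err != nil {
		log.Fatal(err)
	}

	proxy := httputil.NewSingleHostReverseProxy(target)
	director := proxy.Director
	proxy.Director = func(r *http.Request) {
		director(r)
		r.Host = target.Host // present the upstream's own Host header
	}

	// cert.pem and key.pem are placeholders for your own certificate.
	log.Fatal(http.ListenAndServeTLS(":443", "cert.pem", "key.pem", proxy))
}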
Instead of using https:// use //. This will stop mixed content issues.

HTTP URL redirects as HTTPS on Selenium test run

When I pass a URL to load a website, say, http://yoururl.com, it redirects to https://yoururl.com.
That is, passing an HTTP URL automatically redirects to https://yoururl.com in the browser address bar.
#driver.get("http://yoururl.com")
Browser used: Chrome
Is there a way to stop the HTTP URL from being redirected to HTTPS?
Chrome 63 and above no longer allows plain HTTP for .dev domains, which is relevant if you are using .dev in your local/dev environment.
https://iyware.com/dont-use-dev-for-development/
Chrome 63 (out since December 2017), will force all domains ending on .dev (and .foo) to be redirected to HTTPS via a preloaded HTTP Strict Transport Security (HSTS) header
https://ma.ttias.be/chrome-force-dev-domains-https-via-preloaded-hsts/
There are a couple of reasons this could happen.
Redirection at the load balancer or reverse proxy level. This can be fixed by altering the web server or LB configuration.
Browsers are getting smarter every day: once you have opened an https URL in a browser, the next time you try to open the http URL it will go to https by default, because the browser already knows the site supports https. It prefers secured communication over plain text when it is available.
Here is some help for the second case: https://superuser.com/questions/565409/chrome-how-to-stop-redirect-from-http-to-https

Serve two pages from two servers for one URL

I would like to serve two different pages from two servers for one URL, something like Facebook does with its login page (login page vs. profile page on the same URL).
How will the server know which page to serve? I went with a cookie because I couldn't think of another solution.
The cookie also needs to be removed on logout. I ended up with a branch in the nginx configuration to route the request to the right server and to remove the cookie (by setting an expired time) there.
OK, and now the bug itself. Chrome caches this URL, and when the user clicks a link to the same URL, Chrome skips the request to the server and opens the wrong version from its cache. It works when "Disable cache" is checked in the debug panel, and I also confirmed the behaviour by checking the traffic with Wireshark.
To recap the URLs from the browser's point of view:
ex.com/ - Server A
ex.com/login_check - Server A -> redirects to / with cookie
ex.com/ - Server B
ex.com/?logout - Server A, removes the cookie
ex.com/ - Chrome skips the request and serves cached content from B
How can this be fixed? Moreover, this approach feels like too much magic and many things can go wrong. Could it be done differently?
This can be fixed by adding a Cache-Control header to the responses from both servers:
Cache-Control: private, max-age=0, must-revalidate, no-store
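For example, here is a minimal sketch of setting that header in a Go net/http server (my own illustration; your servers may be anything behind nginx, where the equivalent can be added with add_header), wrapping every response so the browser revalidates instead of reusing the copy cached from the other server:

package main

import (
	"log"
	"net/http"
)

// noCache marks every response as non-cacheable so the browser re-requests
// the page instead of serving a cached copy from the wrong server.
func noCache(next http.Handler) http.Handler {
	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		w.Header().Set("Cache-Control", "private, max-age=0, must-revalidate, no-store")
		next.ServeHTTP(w, r)
	})
}

func main() {
	mux := http.NewServeMux()
	mux.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
		w.Write([]byte("page served by this server"))
	})
	log.Fatal(http.ListenAndServe(":8080", noCache(mux)))
}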

Google Chrome same URL cache

I'm testing my servlet using Google Chrome. When I tried to load the same URL twice, say,
localhost/myserver/servlet
Chrome only sent out one request to the server. However, if I modified the second URL to be:
localhost/myserver/servlet?id=2
it sent two different requests.
I've enabled incognito mode, but it seems that Chrome shares the cache and URLs between all its incognito tabs.
Cache control is part of the HTTP specification; it's worth reading up on. Using HTTP response headers like Cache-Control: no-cache or Expires: ... should help you.
