Google shows some of my pages in HTTPS

I had SSL installed on my site for a day and then uninstalled it. Now Google shows some of the pages in the HTTPS version in the SERPs. I don't think this is a duplicate content issue, because only one version of each page shows up in the SERPs. I'm not sure how to fix this because a mix of HTTP and HTTPS results shows up.
Why does Google randomly show the HTTPS version of my pages?
How can I tell Google to use the HTTP version of those pages?
Thanks in advance!

I think more information is needed to fully answer this question. However, most likely Google is showing the HTTPS version of your pages because your site was serving HTTPS at the time Google crawled and indexed them, so the HTTPS version was cached.
You can partially manage your search results and request that Google re-index your site from Google Webmaster Tools (https://www.google.com/webmasters/tools/). However, the change isn't guaranteed and won't take place immediately.
Furthermore, you can control whether a visitor uses HTTPS or HTTP when visiting your site via standard redirects. In other words, update your web server or web application so that if someone visits http://mysite they are redirected (preferably with a 301 status code, for SEO reasons) to https://mysite.
Finally, as a general practice, you should really use HTTPS, as browsers may show warnings for non-secured traffic in the future. For sites I manage, HTTP requests are automatically redirected to the HTTPS version of the same URL to ensure all traffic is always over a secure connection.
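As a sketch of that redirect, assuming a Node/Express app sitting behind a proxy or load balancer that sets X-Forwarded-Proto (your server software may differ):

const express = require('express');
const app = express();
app.set('trust proxy', true); // so req.secure reflects X-Forwarded-Proto

// Permanently redirect any plain-HTTP request to its HTTPS equivalent.
app.use((req, res, next) => {
  if (!req.secure) {
    return res.redirect(301, 'https://' + req.headers.host + req.originalUrl);
  }
  next();
});

app.listen(3000);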

Related

How do I proxy API requests in a JAMstack solution?

I'm developing a site that's virtually entirely static. I use a generator to create all the HTML.
However, my site is a front-end to a store embedded in its pages. I have a little Node.js server proxying requests on behalf of the browser to the back-end store. All it does is provide the number of items in the shopping cart so I can keep the number updated on all pages of my site. That's necessary because the browser doesn't allow cross-domain requests, so my server has to act as a proxy between the client and the store.
(The embedded store is loaded from the store's web site and so itself does not require proxying.)
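(For illustration, a minimal sketch of that kind of proxy; the store URL and route are hypothetical, and it assumes Node 18+ for the built-in fetch:)

const express = require('express');
const app = express();

// The browser calls this same-origin route; the server forwards it to
// the store, sidestepping the cross-domain restriction.
app.get('/api/cart-count', async (req, res) => {
  try {
    const upstream = await fetch('https://store.example.com/cart/count');
    res.json(await upstream.json());
  } catch (err) {
    res.status(502).json({ error: 'store unreachable' });
  }
});

app.listen(3000);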
I was hoping to eventually deploy to Netlify or some similar JAMstack provider. But I don't see how I'd proxy on Netlify.
What is the standard solution to this problem? Or is proxying unavailable to JAMstack solutions? Are there JAMstack providers that solve this problem?
Netlify does allow proxy rewrites, using redirect rules with a 200 status code.
You can store your proxy redirects in _redirects at the root of your deployed site. In other words, the file needs to exist at the root of the site directory that is deployed after a build.
_redirects
/api/* https://api.example.com/:splat 200
So a call to:
/api/v1/gifs/random?tag=cat&api_key=your_api_key
will be proxied to:
https://api.example.com/v1/gifs/random?tag=cat&api_key=your_api_key
If the API supports standard HTTP caching mechanisms like ETags or Last-Modified headers, the responses will even get cached by CDN nodes.
NOTE: you can also set up your redirects in your netlify.toml
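For reference, the equivalent rule in netlify.toml looks like this (same example API host as above):

[[redirects]]
  from = "/api/*"
  to = "https://api.example.com/:splat"
  status = 200
  force = true  # apply even if a file exists at the requested path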

GitHub Pages website is on HTTPS but REST API is on HTTP

I have a GitHub Pages website which makes requests to a hosted server that is HTTP-only, and my browser blocks them.
It's an assignment for university, so I don't really want to pay for HTTPS on the server, and I can't use something else for the front-end, as I have already sent the URL to my professor expecting that this is where my web app will be hosted.
Is there anything I can do which doesn't involve paying that much money?
I encountered this same issue when making API calls. Here's an easy workaround:
Get your API endpoint (for this example, let's say http://api.catphotos.io/)
Prepend the CORS proxy link https://cors-anywhere.herokuapp.com/ to your link
Make sure to leave the http:// on your endpoint.
Your new link should look like this:
https://cors-anywhere.herokuapp.com/http://api.catphotos.io/
Make your API calls with your new link!
Source: johnalcher.me
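For illustration, a browser-side fetch sketch using the combined URL from the steps above (the endpoint is the example placeholder, not a real API):

// Proxy prefix + the original HTTP endpoint from the example above.
const url = 'https://cors-anywhere.herokuapp.com/http://api.catphotos.io/';

fetch(url)
  .then((res) => res.json())
  .then((data) => console.log(data))
  .catch((err) => console.error(err));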
According to the GitHub Pages help docs:
HTTPS enforcement is required for GitHub Pages sites created after June 15, 2016 and using a github.io domain.
That means you can't disable the https:// redirect. You could use a custom domain which then leaves the https:// stuff up to you.
Or you could move the stuff from the other server to GitHub Pages, assuming it's not too big.

Move from HTTP to HTTPS and Google Analytics Referral

We moved our website from HTTP to HTTPS.
But we are still missing Google Analytics referral data from some HTTPS referring sites.
Could it be because:
Referring sites still point to our HTTP web pages? (Hence HTTPS -> HTTP (301 redirect) -> HTTPS loses the referral data.)
Some referring sites have links with nofollow noreferrer, like <a href="https://ourdomain" rel="nofollow noreferrer">. Oddly enough, from our historical data it looks like noreferrer didn't have any influence even just a few months ago, e.g. in April 2017.
Some other reason?
The default value of the meta referrer tag is no-referrer-when-downgrade. This means you lose the referrer information on your existing HTTP links from most HTTPS sites.
301 Redirect
When you 301 redirect an HTTP request to the HTTPS version of a page on your website, the referrer information has already been lost during the HTTP request. There is no way to recover it later in the redirect chain.
The solution is to update the links to https. Unfortunately, this can be a big challenge when they're on websites all across the web.
Meta Referrer Tag
Websites can also use the meta referrer tag to override the default value. It is possible to configure this so that the referrer information is not passed along, even on an https to https request.
Google does this by using the origin value for the meta referrer tag; consequently, you know a visitor came from Google, but not what query they used to find your site.
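For example, a page can declare this in its <head>; origin is the value described above, which sends only the scheme and host:

<meta name="referrer" content="origin">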
noreferrer
Setting rel="noreferrer" informs browsers that support this attribute not to pass on referrer information for that specific link. However, older browsers don't support this and will still pass on referrer information.
nofollow
This does not affect the referrer information.
It is used to tell search engines that the website does not vouch for the link. Most search engines use this information to ignore the link when calculating the link target's ranking. Some search engines also interpret the tag literally and choose not to follow the link at all; others follow the link sometimes, and still others follow it as they would a normal link.

'Soft' move to HTTPS

I have a mobile site that I generally want to always be all-HTTPS. It uses the HTML5 application cache to store all the data locally, so it's a fast user experience EXCEPT during the first page view.
My homepage will load on an iPhone over AT&T 4G in 1 second without HTTPS and in 4.5 seconds with HTTPS (round-trip latency is the killer here). So I am trying to find a way to softly move the user to HTTPS so they have a great first impression, followed by a secure experience.
Here is what I am doing:
Externally, always referencing HTTP (i.e. press releases, etc.)
canonical = HTTP (so Google sees HTTP)
On the site pages, all links are HTTPS
Once the user comes to the HTTP page, the HTTPS pages (all of them) will load via an application cache manifest in an iframe
Upon hitting the server for the first time (over HTTP), the server will set a cookie, and on the next visit force a redirect to HTTPS (which is already cached)
Is this a good implementation to get the speed of HTTP with the security of HTTPS?
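A rough sketch of the cookie-then-redirect step in that list, assuming Node/Express with cookie-parser and a hypothetical cookie name (and that TLS terminates at this server, so req.secure is meaningful):

const express = require('express');
const cookieParser = require('cookie-parser');
const app = express();
app.use(cookieParser());

app.use((req, res, next) => {
  if (!req.secure) {
    if (req.cookies.visited === '1') {
      // Returning visitor: the HTTPS pages are already in the app cache,
      // so the redirect is cheap now.
      return res.redirect(302, 'https://' + req.headers.host + req.originalUrl);
    }
    // First visit: serve the fast HTTP page and remember the visitor.
    res.cookie('visited', '1');
  }
  next();
});

app.listen(3000);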

Is a change required only in the code of a web application to support HSTS?

If I want a client to always use an HTTPS connection, do I only need to include the headers in the code of the application, or do I also need to make a change on the server? Also, how is this different from simply redirecting a user to an HTTPS page every single time they attempt to use HTTP?
If you just have HTTP -> HTTPS redirects, a client might still try to POST sensitive data to you (or GET a URL that has sensitive data in it) over plain HTTP, and that would leave the data exposed publicly. If the client knew your site used HSTS, it would not even try to hit it via HTTP, so that exposure is eliminated. It's a pretty small win, IMO; the bigger risk is the vast number of root CAs that everyone trusts blindly thanks to policies at Microsoft, Mozilla, Opera, and Google.
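For illustration, a minimal application-level sketch using Node/Express; the one-year max-age is a common choice, not a requirement, and browsers only honor the header when it arrives over HTTPS:

const express = require('express');
const app = express();

// Send the HSTS header on every response; browsers remember it and
// refuse plain-HTTP connections to this host for max-age seconds.
app.use((req, res, next) => {
  res.setHeader('Strict-Transport-Security', 'max-age=31536000; includeSubDomains');
  next();
});

app.get('/', (req, res) => res.send('ok'));
app.listen(3000);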
