AdSense makes lots of requests and slows down site - performance

Working on optimizing this site (lanogkreditt.com) for speed. Half the load time now seems to come from Google AdSense. Is it normal that AdSense makes so many requests and slows the site down this much?
Thanks

Unfortunately yes.
Even Google's PageSpeed Insights detects problems with browser caching and image optimization for AdSense resources.
Turn off AdSense and try to optimize the site for speed as much as possible...
Also, don't show ads from another URL if no ads are available (one of the ad unit options).
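If removing AdSense entirely is not an option, a common mitigation is to delay loading the ad script until the page itself has finished loading. A minimal sketch, assuming the standard adsbygoogle.js loader; the client ID is a placeholder:

```typescript
// Minimal sketch: defer the AdSense loader until the first user
// interaction (or an idle timeout) so ad requests do not compete
// with the initial page load. The client ID below is a placeholder.
function loadAdSense(clientId: string): void {
  const s = document.createElement("script");
  s.async = true;
  s.src = `https://pagead2.googlesyndication.com/pagead/js/adsbygoogle.js?client=${clientId}`;
  s.crossOrigin = "anonymous";
  document.head.appendChild(s);
}

let loaded = false;
function loadOnce(): void {
  if (!loaded) {
    loaded = true;
    loadAdSense("ca-pub-XXXXXXXXXXXXXXXX"); // placeholder client ID
  }
}

// Load on first interaction, or after 5 s if the user never interacts.
for (const evt of ["scroll", "click", "keydown", "touchstart"]) {
  window.addEventListener(evt, loadOnce, { once: true, passive: true });
}
setTimeout(loadOnce, 5000);
```

The trade-off is that ads render a little later; whether that is acceptable depends on your layout and revenue.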

Related

PageSpeed Insights for Global Scores

We are in the US and using PageSpeed Insights, but we really want to know how our website performs for our Chinese (and other global) customers. We don't have a specific URL to pass in - localization is IP/cookie based. Is there a solution that will allow us, via the API, to pull back metrics for other locations?
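For reference: the PageSpeed Insights lab test always runs from Google's own infrastructure, so the API has no location parameter. The closest the API gets to global users is the loadingExperience field, which carries real-user (CrUX) data aggregated from actual visitors. A minimal sketch; the API key is a placeholder:

```typescript
// Minimal sketch: query the PageSpeed Insights v5 API. The Lighthouse
// lab run executes from Google's infrastructure (no location choice),
// while loadingExperience carries field data from real users.
async function fetchPsi(url: string): Promise<void> {
  const endpoint =
    "https://www.googleapis.com/pagespeedonline/v5/runPagespeed" +
    `?url=${encodeURIComponent(url)}&key=YOUR_API_KEY`; // placeholder key
  const data = await (await fetch(endpoint)).json();
  console.log("Lab score:", data.lighthouseResult?.categories?.performance?.score);
  console.log("Field data:", data.loadingExperience?.metrics);
}

fetchPsi("https://example.com");
```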

Advantage of using reCAPTCHA

After reading through the documentation, I understand that reCAPTCHA makes it difficult for bots to do a form submission. This reduces spam for sure.
Apart from this, are there other advantages of using reCAPTCHA?
Some articles were indicating that reCAPTCHA is triggered when visiting from a proxy or a virtual machine (at least the first time). But is this really needed, or rather, what is the advantage of this?
Also, does reCAPTCHA do something to prevent bots from crawling the website? I do not think that can be the case, because it would affect search engine crawlers as well.
From the documentation, "reCAPTCHA protects you against spam and other types of automated abuse." What are the other types of automated abuse in this context?
Well, it doesn't matter if the bot is friendly or malicious. Some webmasters don't want bots on their website, and some bots do not respect the robots.txt that would tell them to keep off the lawn. Besides, web crawlers should not be on pages that require the user to post information about themselves.
To quote the website, "reCAPTCHA offers more than just spam protection. Every time our CAPTCHAs are solved, that human effort helps digitize text, annotate images, and build machine learning datasets. This in turn helps preserve books, improve maps, and solve hard AI problems."
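In practice, whatever the widget blocks on the client, the protection that matters is the server-side token check. A minimal sketch of the documented siteverify call (the secret key is a placeholder):

```typescript
// Minimal sketch: verify a reCAPTCHA token server-side against the
// documented siteverify endpoint. SECRET_KEY is a placeholder for
// the secret issued with your site key.
async function verifyRecaptcha(token: string, remoteIp?: string): Promise<boolean> {
  const params = new URLSearchParams({ secret: "SECRET_KEY", response: token });
  if (remoteIp) params.set("remoteip", remoteIp);
  const res = await fetch("https://www.google.com/recaptcha/api/siteverify", {
    method: "POST",
    body: params,
  });
  const data: { success: boolean; "error-codes"?: string[] } = await res.json();
  return data.success;
}
```

Reject the form submission whenever verifyRecaptcha returns false; a bot that never executes the widget never obtains a valid token.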

Google Sites HTTPS issue

I'm wondering if anyone can help with this.
I'm creating a site for a client using Google Sites (a requirement they set).
One of their requirements is for a contact form to be embedded on the site. I've had a look and there are plenty out there; however, if a user visits from any version of IE, the content is not displayed due to the security settings.
All other browsers function fine.
I know the alternative is to simply put a link to an external source, but that is not ideal.
My question is threefold.
1. Is it possible to write a gadget that will work for IE with non-secure content (if so how)?
2. Are there any HTTPS contact forms out there that I could use?
3. Does anyone have any experience with Google sites and trying to load non-secure content and have any tips?
Thanks
Have you tried JotForm.com? They have the same (free and premium) plans as emailmeform.com. Plus, they have a specific workaround to embed your form in Google Sites (a gadget made for Google Sites). And yes, JotForm has an HTTPS URL for their forms if you wish to embed one as an iframe.
-- One other solution is to resort to using a Google Docs form.
Does anyone have any experience with Google sites and trying to load non-secure content and have any tips? Still awaiting people with experience....
-- Yes, I have experienced this while trying to put some social media scripts in my Google Sites website, and the best thing really was to rid my Sites of that non-secure content.
For anyone interested, I have kind of answered my own questions.
Is it possible to write a gadget that will work for IE with non-secure content (if so how)?
It is possible, but you need to have an SSL-hosted server.
Are there any HTTPS contact forms out there that I could use?
There are paid solutions for this. Alternatively, write your own HTML code to post to one of these solutions (a free one is http://www.emailmeform.com/); a minimal embed sketch follows at the end of this answer.
Does anyone have any experience with Google sites and trying to load non-secure content and have any tips?
Still awaiting people with experience....
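As a footnote to the HTTPS-form answer above: IE's block disappears entirely once the embedded form itself is served over HTTPS, because the page then contains no mixed content. A minimal sketch of such an embed; the form URL and container ID are placeholders:

```typescript
// Minimal sketch: inject an HTTPS-served form as an iframe so IE never
// sees non-secure (mixed) content. URL and element ID are placeholders.
const iframe = document.createElement("iframe");
iframe.src = "https://forms.example.com/your-form-id"; // hypothetical HTTPS form URL
iframe.width = "100%";
iframe.height = "600";
iframe.style.border = "0";
document.getElementById("contact-form-container")?.appendChild(iframe);
```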

What might cause IE8 to show phishing warning page?

We have a situation where we took a client site live, and for some reason IE8 shows the bright red phishing warning page when you go to the site.
We have our normal Google Analytics code at the bottom of the page, but we are also trying to track a poll in the middle of the page using Google Analytics. We're not sure if this might be the cause.
The other thing, which might be closer to the answer, is that we had set the test site up on a subdomain on our own server while the client was doing QA testing before it went live.
Could this have caused IE to flag the site as a possible phishing site?
Thanks in advance!
Well, according to the IE phishing filter FAQ page:
Q. What does it mean when a Web site is blocked and flagged in red as a reported phishing Web site?
A. A reported phishing Web site has been confirmed by reputable sources as fraudulent and has been reported to Microsoft. We recommend you do not give any information to such Web sites.
Your concern about the subdomain should not produce anything more than a yellow-bar warning, provided the site has not been reported before:
Q. What does it mean when a Web site is flagged yellow and "suspicious"?
A. A suspicious Web site has some of the typical characteristics of phishing Web sites, but it is not on the list of reported phishing Web sites. The Web site might be legitimate, but you should be cautious about entering any personal or financial information unless you are certain that the site is trustworthy.

How to prevent Googlebot from overwhelming site?

I'm running a site with a lot of content, but little traffic, on a middle-of-the-road dedicated server.
Occasionally, Googlebot will stampede us, resulting in Apache maxing out its memory and crashing the server.
How can I avoid this?
Register at Google Webmaster Tools, verify your site, and throttle Googlebot down.
Submit a sitemap.
Read the Google guidelines on the If-Modified-Since HTTP header (a sketch of honouring it follows this list).
Use robots.txt to restrict the bot's access to some parts of the website.
Make a script that changes the robots.txt every $[period of time], so the bot is never able to crawl too many pages at the same time while still being able to crawl all the content overall.
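On the If-Modified-Since point: answering 304 Not Modified to conditional requests lets Googlebot re-check pages cheaply instead of re-downloading every byte. A minimal Node.js sketch, assuming you can derive a last-modified date for each page (the date below is a placeholder):

```typescript
// Minimal sketch: honour If-Modified-Since so unchanged pages cost the
// server almost nothing when Googlebot re-crawls them.
import { createServer } from "node:http";

const lastModified = new Date("2024-01-01T00:00:00Z"); // placeholder date

createServer((req, res) => {
  const since = req.headers["if-modified-since"];
  if (since && new Date(since) >= lastModified) {
    res.writeHead(304); // Not Modified: empty body, minimal work
    res.end();
    return;
  }
  res.writeHead(200, {
    "Content-Type": "text/html",
    "Last-Modified": lastModified.toUTCString(),
  });
  res.end("<html><body>page content</body></html>");
}).listen(8080);
```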
You can set how your site is crawled using Google's Webmaster Tools. Specifically, take a look at this page: Changing Google's crawl rate
You can also restrict the pages that Googlebot crawls using a robots.txt file (a sketch follows below). There is a crawl-delay setting available, but it appears that it is not honored by Google.
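A minimal robots.txt along those lines; the disallowed paths are placeholders for whatever heavy, low-value sections your site has, and as noted, Googlebot ignores Crawl-delay (use the Webmaster Tools setting instead):

```
User-agent: *
# Placeholders: substitute your own expensive, low-value paths
Disallow: /search/
Disallow: /archive/

# Honored by some crawlers, but ignored by Googlebot
Crawl-delay: 10
```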
Register your site using Google Webmaster Tools, which lets you set how often, and at how many requests per second, Googlebot should try to index your site. Google Webmaster Tools can also help you create a robots.txt file to reduce the load on your site.
Note that you can set the crawl speed via Google Webmaster Tools (under Site Settings), but they only honour the setting for six months! So you have to log in every six months to set it again.
This setting has since been changed: it is now only saved for 90 days (3 months, not 6).
You can configure the crawling speed in Google's Webmaster Tools.
To limit the crawl rate:
On the Search Console Home page, click the site that you want.
Click the gear icon (Settings), then click Site Settings.
In the Crawl rate section, select the option you want and then limit the crawl rate as desired.
The new crawl rate will be valid for 90 days.
