When my website is opened over a slow internet connection, Google serves it through its Google Web Light feature so that it loads faster. But the transcoded page is not properly aligned and looks ugly. I want to disable Google Web Light for the website.
To disable it I am using "no-transform":
<meta http-equiv="Cache-control" content="no-transform" />
I don't know whether this is the right approach, or how to check whether Google Web Light is disabled, because Google detects the connection automatically: sometimes it loads the normal user interface and sometimes it loads the ugly (Google Web Light) one.
Any ideas would be appreciated, thanks.
I have tested the website with https://www.google.com/webmasters/tools/transcoder?pli=1#url=http%3A%2F%2Fwww.winni.in%2Fbangalore
I have also seen the question "Can I disable Google Web Light for my website?" and put <meta http-equiv="Cache-control" content="no-transform"> in the header, but it is not working.
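For what it's worth, Google's guidance on opting out of Web Light refers to the no-transform directive being sent in the HTTP Cache-Control response header, not only in a meta tag, so it may be worth configuring the server (however your server exposes this) so that the response includes:
Cache-Control: no-transform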
Google Web Light won't appear on a light, fast website. So the key to permanently disabling the Google Web Light service is simply to make your website lighter and faster to load. There are many tricks to make your site lighter and faster than before. First, you can compress your pages using gzip compression, avoid graphics, Flash, or other media that slow the page down, use a CDN, and try many other tricks to speed up your website. You can see this on my own site: Google Web Light has never appeared on it since I applied all of these tricks. Here I can give you a full reference on how to do this.
How to Disable Permanently Google Weblight Service from Your Website
Hope it's useful.
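As a small illustration of the markup side of those tips (file names below are placeholders), deferring non-critical scripts and lazy-loading below-the-fold images helps keep the initial load light:
<!-- defer non-critical JavaScript so it does not block rendering -->
<script src="/js/site.min.js" defer></script>
<!-- lazy-load images below the fold and give them explicit dimensions -->
<img src="/img/banner.jpg" loading="lazy" width="640" height="200" alt="Banner">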
Well, I am trying to set up a web page with WordPress and GoDaddy hosting. I want to make a fast web page, because people say fast web pages appear at the top of Google results (and that mobile page speed is especially important). So I want to make a very fast web page, but my level of knowledge is not very advanced; I progress by learning.
If I test my web page with PageSpeed Insights, my mobile score is about 60-70. The Insights report lists lots of suggested improvements below the score. I want to learn how to fix them. If you help me with one example, I will do the others myself.
If we start with the first problem, /css?family=… (fonts.googleapis.com), which is listed under the "Eliminate render-blocking resources" topic: how do I fix it? What should I do?
Also, in the "Coverage" tab some source code is shown that is not being used. For example, I am not using the easy-sheare plugin (second row in the image) on the homepage.
How can I safely remove that code from the home page? If I learn how one fix is done, I can do the others myself.
The issue you are running into is something I have seen over and over again. GoDaddy and Wordpress sites generally are bloated and perform poorly.
Here are some tips to improve your speed and get a better PageSpeed score.
Hosting: Do you need to be on Godaddy? I have seen this time and time again. Most websites on GD are SLOW. GD is good for domain registration, not for hosting. Most non-tech folks do not know any better. Try using Amazon Lightsail, AWS-S3, Google Firebase, or Netlify. They all offer much faster page loads by reducing initial server response time. And they are surprisingly simple to learn and deploy.
CDN: You must use a content delivery network (CDN). Check out CloudFront. It offers a free tier that works quite well.
Wordpress: This is your real issue. Wordpress is neither easy to build with nor easy to maintain, and you need multiple plugins to make the site perform. It would be best to build your own site. If you have to be on Wordpress, check out image optimizers, minifiers, and cache plugins; Gumlet, WP Rocket, and Shortpixel are quite popular for improving speed.
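On the specific render-blocking /css?family=… request from fonts.googleapis.com that the question mentions, one commonly used pattern (a sketch, not the only fix; the Roboto family below is a placeholder) is to preconnect to the font hosts and load the stylesheet in a non-blocking way:
<!-- open connections to the font hosts early -->
<link rel="preconnect" href="https://fonts.googleapis.com">
<link rel="preconnect" href="https://fonts.gstatic.com" crossorigin>
<!-- load the font CSS without blocking the first render: the print media query is
     non-blocking, and the onload handler switches the stylesheet to all media -->
<link rel="stylesheet" href="https://fonts.googleapis.com/css?family=Roboto&display=swap"
      media="print" onload="this.media='all'">
<!-- fallback for visitors with JavaScript disabled -->
<noscript>
  <link rel="stylesheet" href="https://fonts.googleapis.com/css?family=Roboto&display=swap">
</noscript>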
I'm trying Adblock Plus, Ghostery, Disconnect, and Self-Destructing Cookies in Firefox.
Can you recommend some tips to stop the tracking?
The "Do Not Track" feature in Firefox is useful for telling sites that you do not want to be tracked. A detailed guide on enabling this feature is available on Firefox's support page : http://mzl.la/WL6fUP .
Besides, if you want an extra level of security, I would suggest you to use the "NoScript" browser extension(https://addons.mozilla.org/en-US/firefox/addon/noscript/). NoScript blocks JavaScript and other executable content on website thus effectively protecting you from tracking codes on websites.
And if you want real privacy use a proxy or VPN. Another good idea is to use the tor browser ( torproject.org/projects/torbrowser.html.en).
We've recently launched a new website http://atlascode.com and since the launch I've been unable to get in-page analytics working on the website. Google also claims that my tracking code is not working, but I think this is a false alarm.
Whenever I attempt to load in-page analytics I receive the error:
We've identified problems in your setup. These may cause problems loading In-Page Analytics.
Your site doesn't load ga.js from Google.
If you host the Google tracking code on your own servers, it isn't updated automatically and can miss important changes.
We didn't find a tracking snippet on your site. In-Page Analytics cannot load. Please make sure you have tracking installed correctly. If your snippet is included in a separate JavaScript file, you'll have to manually check it is being loaded correctly.
-ENDS-
I've simply copied and pasted the tracking code onto the website and haven't done anything out of the ordinary. I've also checked that, under Web Property Settings, my Web property name and default URL are atlascode.com.
Any ideas you guys have would really be welcome.
EDIT: Added screenshot of Google Analytics error http://min.us/mdqlrhj
Thanks in advance
Simon
Well, there's a whole bunch of people on the web complaining about the same issue.
I've noticed something funny.
Most developers like to exclude the Analytics tracking code for logged-in administrators and then try to check out In-Page Analytics while they're logged in, so there really is no ga.js on the pages they're viewing.
In my experience, this occurred when I hadn't set my default URL to exactly match the URL set in the profile.
Matt
P.S. Someone beat me to your source!
Got the same problem on Magento Enterprise, but the solution was pretty simple: the GA code just needs to be placed before the closing </head> tag. After this simple fix, In-Page tracking works perfectly.
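For reference, the standard asynchronous ga.js snippet looks like this and goes just before the closing </head> tag (UA-XXXXXX-X is a placeholder for your own web property ID):
<script type="text/javascript">
  var _gaq = _gaq || [];
  _gaq.push(['_setAccount', 'UA-XXXXXX-X']);  // your web property ID
  _gaq.push(['_trackPageview']);
  (function() {
    // load ga.js asynchronously, matching the page's protocol
    var ga = document.createElement('script');
    ga.type = 'text/javascript';
    ga.async = true;
    ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
    var s = document.getElementsByTagName('script')[0];
    s.parentNode.insertBefore(ga, s);
  })();
</script>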
Update
Also, be sure you have no framekiller script installed on your site.
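In case it helps, a typical framekiller is just a snippet like the one below somewhere in the page's scripts; since In-Page Analytics loads your site inside a frame, a script like this will prevent the overlay from working:
<script type="text/javascript">
  // classic frame-busting snippet: forces the page out of any containing frame
  if (top !== self) {
    top.location = self.location;
  }
</script>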
My company has multiple vendors that all have their own websites. I am creating a website that acts as a dashboard where customers can access all of the vendors' sites. I wanted to know: what is the best option for doing this?
Here's what I have so far:
Iframe
Can bring in the entire website
Seems secure enough (not sure if I'm missing any information on security issues for this)
Users can interact with the vendor's website through our site
Our website cannot fully interact with the vendor's website (Also may be missing info here)
Pulling in the content
Can bring in the entire website
Not very secure from what I hear (some websites actually say that pulling another website in is a violation of security and will alert the user of this, or something similar...)
Users can interact with their website through our site
Our website can fully interact with the vendor's website
Anyone have any other options...?
What are some of the downsides to bringing in a site with an iframe and is this really our only option for doing something like this?
Optimally, we would like to pull their sites into ours without using an iframe. What options do we have on this level? Is there anything better than an iframe?
Please add in as much information as you can about iframes, pulling content, security, and website interactions like this. Anything to add in is appreciated.
Thanks,
Matt
As far as "pulling content" is concerned I wouldn't advise it as it can break. All it takes is a simple HTML change on their end and your bot will break. Also, it's more work than you think to do this for one site, let alone the many that you speak of. However, there are 3rd party apps that can do this for you if you have the budget.
You could use an iframe/frames, however, many sites might try to bust out of them and it can ruin the user experience of the site within the frame.
My advice is to use the following HTML for each link in your dashboard, so each vendor site opens in its own tab (the URL is a placeholder):
<a href="https://vendor.example.com/" target="_blank" rel="noopener">Vendor Site Link</a>
If you can have the sites that you are embedding add some client-side script, then you could use easyXSS. It allows for easy transfer of data, and also for calling JavaScript methods across the domain boundary.
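If you go that route, the underlying mechanism such libraries wrap is cross-document messaging; a minimal sketch using the standard window.postMessage API (the origins below are placeholders) looks like this:
<!-- on the dashboard page: listen for messages from the embedded vendor frame -->
<script type="text/javascript">
  window.addEventListener('message', function (event) {
    // only accept messages from the vendor origin we expect
    if (event.origin !== 'https://vendor.example.com') return;
    console.log('Message from vendor frame: ' + event.data);
  }, false);
</script>
<!-- inside the vendor page (this is the script the vendor would need to add) -->
<script type="text/javascript">
  parent.postMessage('ready', 'https://dashboard.example.com');
</script>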
I would recommend iFrames. Whilst not the most glamorous of elements, many payment service providers use iFrames for the Verified by Visa/Mastercard Secure Code integration.
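A minimal embed for the dashboard could look like this (URL and sizing are placeholders); note that a vendor page that uses a frame-busting script, or that sends an X-Frame-Options header denying framing, will refuse to display this way:
<iframe src="https://vendor.example.com/" width="100%" height="800" title="Vendor site">
  Your browser does not support iframes.
</iframe>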
I'm trying to come up with ways to speed up my secure web site. Because there are a lot of CSS images that need to be loaded, the site can be slow, since secure resources are not cached to disk by the browser and must be retrieved more often than they really need to be.
One thing I was considering is perhaps moving style-based images and javascript libraries to a non-secure sub-domain so that the browser could cache these resources that don't pose a security risk (a gradient isn't exactly sensitive material).
I wanted to see what other people thought about doing something like this. Is this a feasible idea or should I go about optimizing my site in other ways like using CSS sprite-maps, etc. to reduce requests and bandwidth?
Browsers (especially IE) get jumpy about this and alert users that there's mixed content on the page. We tried it and had a couple of users call in to question the security of our site. I wouldn't recommend it. Having users lose their sense of security when using your site is not worth the added speed.
Do not mix content; there is nothing more annoying than having to go and click the Yes button on that dialog. I wish IE would let me always select "show mixed content" for sites. As Chris said, don't do it.
If you want to optimize your site, there are plenty of ways; if SSL overhead is the only thing left, buy a hardware accelerator. Hmm, if you load an image over HTTP, will it be cached when you load it over HTTPS? Just a side question that I need to go find out.
Be aware that in IE 7 there are issues with mixing secure and non-secure items on the same page, so this may result in some users not being able to view all the content of your pages properly. Not that I endorse IE 7, but recently I had to look into this issue, and it's a pain to deal with.
This is not advisable at all. The reason browsers give you such trouble about insecure content on secure pages is that it exposes information about the current session and leaves you vulnerable to man-in-the-middle attacks. I'll grant there probably isn't much a third party could do to sniff sensitive info if the only insecure content is images, but CSS can contain references to JavaScript/VBScript via behavior files (IE). If your JavaScript is served insecurely, there isn't much that can be done to prevent a rogue script from scraping your web page at an inopportune time.
At best, you might be able to get away with iframing secure content to keep the look and feel. As a consumer I really don't like it, but as a web developer I've had to do that before because there were no other pragmatic options. Frankly, though, that approach has just as many defects, if not more: you're relying on nothing compromising the integrity of the insecure page that hosts the secure content, so that it actually frames your secure content and not some alternate content.
It's just not a great idea from a security perspective.