I have a site that asks for the Location permission as soon as it loads and, with that information, reloads the page to apply it. When I use PageSpeed Insights (https://pagespeed.web.dev/) to measure performance, it throws the following redirect error:
We encountered an additional client-side redirect when loading the
page. To see more accurate real user experience analysis and
performance metrics, we suggest re-running with the updated URL.
Updated URL
chrome-error://chromewebdata/
and doesn't complete the measurement. However, when I perform the analysis with the Lighthouse extension, it works perfectly.
What makes the analyses and their results different between the page (PageSpeed) and the extension (Lighthouse)?
I am getting about 100 sessions on the /error page on all three of my websites. Since the traffic shows up as "not set", I am unable to check its path.
Is it spam, and how can I add any kind of filter when I have no idea where the sessions are coming from?
Please help.
Do you have bot filtering enabled in your view? We find that it is not perfect, but it's a good start.
Perhaps you can look at these requests and see if there is a set pattern; we have seen spam traffic coming from OS = Linux when our site visitors are typically mostly Windows and Mac. We know those are spam for sure; those requests also report weird, probably automatically generated screen sizes. Check the city / province / OS etc. of these sessions to see if there is a set pattern.
Do you have a view specifically for reporting? If not, it is good practice to create a filtered view so that you can apply this filtering logic as needed. The idea is to keep an unfiltered view for diagnosis and a filtered view for analysis and reporting. In the new view, you can create exclude filters to block the patterns. Make sure your filter is specific enough that it won't block more than it should.
Check whether your site has any soft or hard redirects for those pages; that could be why the error page does not capture any additional information. (This happens a lot with 404 redirects.)
First off, I know this has been discussed over and over again. But let's take this as a "late 2012 edition" since things tend to change rapidly on the internet.
I have a "classical" web page with full page refreshes. Every internal click produces new content. We can show AdSense ads this way without a problem.
Now I started looking into "ajaxifying" (PJAX) the whole page for performance reasons (I've actually made a prototype version and it works superbly). The whole thing works only in browsers that support history.pushState, and whenever a user clicks on an internal link, an AJAX request is triggered that fetches only the content part of the page (everything between the header and footer) and replaces the old content with it.
The end result is that the user is presented with a brand-new page (including the changed URL and whatnot) and only the mechanism for delivering the page has changed (full reload vs. AJAX). As far as Google (and older browsers) is concerned, this is still a regular page with regular links (progressive enhancement and all that).
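For reference, the server-side half of such a setup can be quite small. Here is a minimal PHP sketch, assuming a hypothetical X-PJAX request header set by the client-side script (the header name and the render helpers are made up for illustration):

    <?php
    // If the request came from the PJAX JavaScript (which sets a custom
    // header), return only the content fragment; otherwise return the
    // full page including header and footer.
    if (isset($_SERVER['HTTP_X_PJAX'])) {
        echo render_content($_SERVER['REQUEST_URI']);   // hypothetical helper
    } else {
        echo render_full_page($_SERVER['REQUEST_URI']); // hypothetical helper
    }

Older browsers and crawlers never send the header, so they always receive the full page.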
And yet there isn't a way to display AdSense, what with document.write and AdSense's TOS ruining the party.
My question: is there a Google-approved way (I'm not interested in hacks that will get us banned) to display AdSense ads on a page like this? (I haven't found one.) Or, if there isn't, does Google have any plans to support this in the future? (Again, I haven't found anything related to this.)
Update
After some more digging around I came across Google DFP, which seems to support async loading of ads. But I'm not sure I can load AdSense ads through it dynamically without breaking the TOS. I'm 100% sure I can load other ads this way, but not AdSense. Could somebody clear this up for me?
According to this page, when loading AdSense ads through DFP you are subject to both the DFP and AdSense terms. So I guess if you are following the current AdSense terms, you are not allowed to do what you are talking about... at the same time, Google provides a rather easy method to do exactly what you want with DFP...
It's still a grey area...
My Problem:
My client's site displays many products, which adds a lot of page load/weight. So I decided to load additional products via AJAX, and it works well. But it hurts the SEO: no products or deals have been indexed. (I even suggested the client submit products via Google Base, but the client doesn't like that idea; he wants Google to crawl the site directly, and he also wants a faster page load.)
Question:
Can we identify on the server whether a request comes from Googlebot crawling or from a Mozilla-like browser user agent?
Suggestion I have
I tried to identify the user agent from the requests, but that isn't working (or I might be missing something?). Does anyone have a correct solution to this problem, so that I can reduce the page load time using AJAX and still get Googlebot to crawl the website?
You should just search Stack Overflow for "Google AJAX SEO". There are a number of questions around this.
In short, Google has a specification to make AJAX sites crawlable: https://developers.google.com/webmasters/ajax-crawling/docs/getting-started?hl=sv-SE
You can also look into pushState as an SEO option.
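To give a flavor of how that specification works: a crawler that supports the scheme rewrites a #! URL into a ?_escaped_fragment_= query parameter, and your server returns a static HTML snapshot for it. A minimal PHP sketch (the snapshot helper is hypothetical):

    <?php
    // Google's AJAX crawling scheme: a URL like example.com/#!products/2
    // is requested by the crawler as example.com/?_escaped_fragment_=products/2.
    if (isset($_GET['_escaped_fragment_'])) {
        // Serve a plain HTML snapshot of the AJAX state to the crawler.
        echo render_html_snapshot($_GET['_escaped_fragment_']); // hypothetical helper
        exit;
    }
    // Otherwise serve the normal JavaScript-driven page.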
One tactic used to solve this is to harness the pagination function of whatever framework or CMS you are using. You load one page of content and display pagination links in your view, then use JavaScript to hide the pagination links, fetch the content of the linked page via Ajax, and append it to the current page. Take a look at how infinite-scroll works for inspiration:
http://www.infinite-scroll.com/
Basically you need to be at least loading links to pages that contain the other content, so that search engines can crawl it, but you can hide those links for users who have JavaScript enabled. The server-side part is sketched below.
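The server-side part of this tactic is just ordinary pagination that emits real, crawlable links. A rough PHP sketch, where get_products() is a hypothetical data-access helper:

    <?php
    // Plain, crawlable pagination: real links that search engines can
    // follow, and that your JavaScript can hide and fetch via Ajax instead.
    $page     = isset($_GET['page']) ? max(1, (int) $_GET['page']) : 1;
    $per_page = 20;
    $products = get_products($page, $per_page); // hypothetical helper

    foreach ($products as $product) {
        echo '<div class="product">' . htmlspecialchars($product['name']) . '</div>';
    }
    // A real <a> element: crawlers follow it; your script hides it and
    // loads ?page=N via Ajax for users with JavaScript enabled.
    echo '<a class="pagination" href="?page=' . ($page + 1) . '">Next page</a>';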
But to better answer your question, it is possible to redirect robots using htaccess:
redirect all bots using htaccess apache
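If you would rather detect bots in PHP than in .htaccess, the usual approach is to check the user agent and then confirm it with a reverse/forward DNS lookup, since the user-agent string alone is trivially spoofed. A rough sketch (keep in mind that serving bots different content than users risks being treated as cloaking):

    <?php
    // Returns true if the request plausibly comes from the real Googlebot.
    function is_googlebot($user_agent, $ip)
    {
        if (stripos($user_agent, 'Googlebot') === false) {
            return false;
        }
        // Reverse DNS must resolve to googlebot.com or google.com...
        $host = gethostbyaddr($ip);
        if ($host === false || !preg_match('/\.(googlebot|google)\.com$/i', $host)) {
            return false;
        }
        // ...and forward DNS of that host must point back to the same IP.
        return gethostbyname($host) === $ip;
    }

    $is_bot = is_googlebot($_SERVER['HTTP_USER_AGENT'], $_SERVER['REMOTE_ADDR']);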
But as far as I understand it, it is better SEO to have the content, or links to it, actually available on the page.
Well, I'm trying to optimize my application and am currently using Page Speed for this. One of its strongest recommendations was that I needed to leverage browser caching. The report sent me to this page:
http://code.google.com/intl/pt-BR/speed/page-speed/docs/caching.html#LeverageBrowserCaching
On that page there is this quote:
If the Last-Modified date is sufficiently far enough in the past, chances are the browser won't refetch it.
My point is: it doesn't matter what value I set in the Last-Modified header (I tried 10 years in the past); when I access and reload my application (always clearing the browser's recent history), I get status 200 for the first access and 304 for the remaining ones.
Is there any way I can get the behavior described in the Google documentation? I mean, make the browser not try to fetch the static resources from my site at all?
You might have better success using the Expires header (also listed on that Google doc link).
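For a PHP-served resource, setting those headers might look like this minimal sketch (a one-year lifetime, a common choice for static assets):

    <?php
    // Tell the browser it may reuse this response for a year without
    // contacting the server again.
    $lifetime = 31536000; // one year, in seconds
    header('Cache-Control: public, max-age=' . $lifetime);
    header('Expires: ' . gmdate('D, d M Y H:i:s', time() + $lifetime) . ' GMT');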
Also keep in mind that all of these caching-related headers are hints or suggestions for browsers to follow. Different browsers can behave differently.
Your method of testing is a good example. In your case you mentioned getting status 304 for the remaining requests, but are you getting those by doing a manual browser refresh? Browsers will usually issue a conditional request in that case, even when the cached copy would otherwise still be considered fresh, which is exactly what produces the 304.
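That round trip is the server answering a conditional GET. A minimal PHP sketch of the mechanism, using a hypothetical static file path:

    <?php
    // On refresh the browser sends If-Modified-Since; if the resource
    // hasn't changed since then, answer 304 and skip the body entirely.
    $file          = 'static/style.css'; // hypothetical resource
    $last_modified = filemtime($file);
    header('Last-Modified: ' . gmdate('D, d M Y H:i:s', $last_modified) . ' GMT');

    if (isset($_SERVER['HTTP_IF_MODIFIED_SINCE'])
        && strtotime($_SERVER['HTTP_IF_MODIFIED_SINCE']) >= $last_modified) {
        header('HTTP/1.1 304 Not Modified');
        exit;
    }
    readfile($file);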
I use Kohana 3's Profiler class and its profiler/stats template to time my website. On a very clean page (no AJAX, no jQuery, etc.; it only loads a template and shows a short text message, with no database access), it shows a request time of 0.070682 s (the "Requests" item in the "profiler/stats" template). Then I used two microtime() calls to time the duration from the first line of index.php to the last line of index.php, and the result was almost as fast: 0.12622809410095 s. Very nice result.
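For illustration, that kind of microtime() bracketing looks roughly like this (a minimal sketch, not the exact code):

    <?php
    // First line of index.php.
    $start = microtime(true);

    // ...the entire framework bootstrap and request handling run here...

    // Last line of index.php.
    echo 'Elapsed: ' . (microtime(true) - $start) . ' s';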
But if I time the request from the browser's point of view, it's totally different. I use Firefox plus the Tamper Data add-on, and it shows the duration of the request as 3.345 s! And I noticed that from the moment I click the link to enter the website (Firefox starts the animated loading icon) to when the browser finishes its work (the icon animation stops), it really does take 3-4 seconds!
On another of my websites, which is built with WikkaWiki, the time measured by Tamper Data is only 2190-2432 ms, including several accesses to the MySQL database.
I tried a clean installation of Kohana, and the default plain hello-world page also takes 3025 ms to load.
All the websites I mentioned here were tested on the same "localhost" PC with the same settings; they are just hosted in different directories on the same machine. Only the Database module is enabled in bootstrap.php for the Kohana website.
I'm wondering why the Kohana website's overall response is so slow while the PHP code execution time is just 0.126 seconds. Is there anything I should look into?
== Edit for additional information ==
The test result on a standard phpinfo() page is 1100-1200 ms (Tamper Data).
The Profiler shows you the execution time from Kohana initialization to the Profiler render call, so it's not the full Kohana time. Some kinds of actions (Kohana::shutdown_handler(), Session::_destroy(), etc.) may take a long time.
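One way to see the time that falls outside the Profiler's window is a shutdown function registered at the very top of index.php (a sketch; shutdown functions registered after this one still run later, so it is an approximation):

    <?php
    // Very first thing in index.php.
    $GLOBALS['request_start'] = microtime(true);

    // Runs during PHP shutdown, after the main script has finished,
    // so it covers time that an in-page profiler never sees.
    register_shutdown_function(function () {
        $total = microtime(true) - $GLOBALS['request_start'];
        error_log(sprintf('Full request time: %.4f s', $total));
    });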
Since your post confirms Kohana is finishing in a tenth of a second or less, it's probably something else:
Have you tested anything other than Kohana? It sounds like the server is at fault, but you can't be sure unless you compare the response times with something else. Try a plain HTML page and a pure PHP page.
The Firefox profiler could be taking external media into account. So if you have a slow connection and you load Google Analytics, that could be another problem.
Maybe there is something related to this issue: Firefox and Chrome slow on localhost; known fix doesn't work on Windows 7.
Although that issue happens on Windows 7, maybe it can help...