I don't know why this page of my website takes so long to load:
michelepierri.it/blog
This page contains the blog post excerpts.
Other pages, like the home page, load much faster.
What could be causing this?
Thanks a lot.
Plugins I use:
Advanced Code Editor
All in One SEO Pack
Better Related Content
cbnet Twitter Widget
CloudFlare
Contact Form 7
Default Thumbnail Plus
Developer Formatter
Disqus Comment System
Fancybox
Fast Secure Contact Form
FeedBurner FeedSmith Extend
Google Analytics
Google XML Sitemaps
lorem shortcode
NextScripts: Social Networks Auto-Poster
Official StatCounter Plugin
Pingler
Really Simple CAPTCHA
Shareaholic | email, bookmark, share buttons
Simple Skype Status
Single Category Permalink
Skype Online Status
Social Metrics
SyntaxHighlighter Plus
Transposh Filtro per Traduzioni
Trash Manager
W3 Total Cache
WP-Cumulus
WP-o-Matic
WP Facebook Open Graph protocol
WP Minify
WP to Twitter
Youtube shortcode
Why so many plugins? That alone can impact your performance a fair amount. The first request to your site is taking a long time, which suggests a server-side issue (more likely if you're on shared hosting with that many plugins active).
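To confirm whether the delay is in the server response rather than in front-end rendering, a quick check in the browser console can help. This is a minimal sketch using the standard Navigation Timing API, run after the page has finished loading; the rule of thumb in the comments is only an illustrative assumption:

// Minimal sketch: split the load of /blog into server time vs. front-end time
// using the Navigation Timing API (paste into the browser console after the page loads).
var t = window.performance && window.performance.timing;
if (t) {
  var ttfb = t.responseStart - t.requestStart;   // time to first byte (server + network)
  var frontEnd = t.loadEventEnd - t.responseEnd; // parsing, rendering, sub-resources
  console.log('Time to first byte (ms):', ttfb);
  console.log('Front-end time (ms):', frontEnd);
  // Illustrative rule of thumb: a large TTFB points at the server/plugins,
  // a large front-end time points at scripts, images, or CSS.
  console.log(ttfb > frontEnd
    ? 'Most of the delay happens before the first byte arrives (server side).'
    : 'Most of the delay happens while rendering the page (front end).');
}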
I'm using this extension to create AMP pages for our Magento e-commerce website.
The extension creates separate AMP versions of each page (home page, category page, product page), which feels like extra work.
Is it possible to convert the current pages to AMP without any modification?
I don't think it's possible without any modification unless you use some third-party tool. You may check this Convert HTML to AMP tutorial. Note that it is strongly recommended that you use HTTPS in production environments; HTTPS has several benefits beyond security, including SEO, and you can read more about this topic in this Google Webmaster blog post. Also, from that page: if you use WordPress, all you have to do is download the official AMP WordPress plugin.
I have developed a website using AngularJS and ASP.NET Web API.
The problem is that the AJAX-rendered content is not crawlable by Google, and no one can find the website using Google search.
After reading many articles on this issue, including:
this one, with all the explanatory links it leads to,
the Google AJAX crawling protocol, and also a Stack Overflow question, I couldn't find a proper solution. The answers that mention ASP.NET talk about MVC, while I only need simple REST via Web API; other articles don't cover ASP.NET at all.
Is there any simple explanation?
I'm the one who asked this same question long ago, so I will answer from my experience:
First, if all your content is accessible via unique URIs (including the hashbang, if you use it), modern search engines should index it just fine. In fact, Google can index JavaScript-generated content now. You can try that via the Google Webmaster tools and see how your site is indexed.
Second, there are libraries that help you serve pre-rendered content to search engines if you need to, but in my case I didn't bother much with it, since Google is indexing the JS content nicely.
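As a hedged illustration of that second point, the sketch below shows the general idea behind such libraries, assuming a Node.js/Express front end and the (now deprecated) _escaped_fragment_ convention from Google's AJAX crawling spec; renderSnapshot is a hypothetical placeholder for whatever headless browser or prerender service you would actually use:

// Minimal sketch: serve a pre-rendered HTML snapshot to crawlers that follow
// the AJAX crawling scheme, and the normal JavaScript app to everyone else.
// Assumes an Express server; renderSnapshot() is a hypothetical placeholder.
const express = require('express');
const app = express();

// Hypothetical placeholder: in practice this would render the route to static
// HTML with a headless browser or a prerendering service.
function renderSnapshot(fragment) {
  return Promise.resolve('<html><body><h1>Snapshot for ' + fragment + '</h1></body></html>');
}

app.use(function (req, res, next) {
  // Crawlers request ?_escaped_fragment_=... instead of the #! URL.
  var fragment = req.query._escaped_fragment_;
  if (fragment !== undefined) {
    renderSnapshot(fragment).then(function (html) { res.send(html); });
  } else {
    next(); // normal users get the regular JavaScript application
  }
});

app.use(express.static('public'));
app.listen(3000);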
I've seen others ask this question, and maybe I'm missing something or this is outdated, but I don't see why AngularJS needs to be an issue with SEO.
Say you have a landing page and it has a bunch of links. Assuming you're using html5 mode in AngularJS (and I'm not sure that's 100% necessary) and something like ng-route then the links on the landing page can work both as "angular" (JavaScript) links and "old school" (full page load) links.
If you're a human user you can click a link and it will do angular magic and adjust the content without loading the full page. Ok, all fine.
But if you instead copy the link and paste it in a new tab or new browser, it will still work - assuming you've set up routes correctly.
I'm not an SEO expert by any stretch of the imagination, but as I understand it, having links that load pages with real, useful content is the core of SEO, and done this way AngularJS should work fine. The key thing to check is that a link still works when you copy and paste it (not just click it); a minimal route configuration along those lines is sketched below.
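As a rough sketch of that setup, here is a minimal AngularJS configuration assuming the ngRoute module and HTML5 mode; the route paths and template files are illustrative assumptions:

// Minimal sketch: ngRoute with HTML5 mode, so each view has a real URL that
// works both as an in-app (JavaScript) link and as a full page load.
angular.module('app', ['ngRoute'])
  .config(['$routeProvider', '$locationProvider',
    function ($routeProvider, $locationProvider) {
      $routeProvider
        .when('/', { templateUrl: 'views/home.html' })            // landing page
        .when('/portfolio', { templateUrl: 'views/portfolio.html' })
        .otherwise({ redirectTo: '/' });

      // HTML5 mode drops the # from URLs; it needs a <base href="/"> tag in the
      // page (Angular 1.3+) and the server must serve the app for these paths,
      // so that copying a link into a new tab still works.
      $locationProvider.html5Mode(true);
    }]);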
Typically, if it were a regular page-by-page website, I would install the analytics javascript before the body tag.
But with a site where the content is shown in overlays (i.e. a one-page portfolio site), how can analytics be installed to track views?
Thanks for any insight!
See Tracking Google Analytics Page Views with Angular.js. Even though I'm not sure whether you are using something like Angular or just plain JavaScript, you can use a similar technique with hash URLs that are set when a user clicks on a different part of the page. That way you can track how a user interacts with your single-page site by generating different URLs for their interactions.
For more information see Pushing Functions onto the Queue.
In the JavaScript that opens the overlay, you can add:
_gaq.push(['_trackPageview', '/url/of/page']);
or:
ga('send','pageview','/url/of/page');
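Putting those pieces together, a minimal sketch (assuming the analytics.js ga() function is already loaded on the page; the element IDs and virtual URLs are illustrative):

// Minimal sketch: record a virtual pageview each time an overlay is opened.
// Assumes the analytics.js snippet (the ga() function) is already on the page.
function openOverlay(overlayId, virtualPath) {
  var overlay = document.getElementById(overlayId);
  if (overlay) {
    overlay.style.display = 'block'; // show the overlay (your existing logic)
  }
  if (typeof ga === 'function') {
    ga('send', 'pageview', virtualPath); // count this as a pageview
  }
}

// Example: the "About" section of a one-page portfolio site.
// openOverlay('about-overlay', '/overlays/about');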
My Problem:
My client's site displays a large number of products, which adds significant page weight, so I decided to load the additional products via AJAX, and that works well. But it hurts SEO: none of the products or deals are being indexed. (I suggested the client submit the products via Google Base, but he doesn't like that idea; he wants Google to crawl the site directly, and he also wants a faster page load.)
Question:
Can we distinguish, on the server, a Googlebot crawling request from a request made by a normal browser user agent (e.g. Mozilla)?
What I have tried:
I tried to identify the user agent from the requests, but that didn't work (or I might be missing something?). Does anyone have a proper solution that reduces page load time using AJAX while still letting Googlebot crawl the website?
You should search Stack Overflow for "Google AJAX SEO"; there are a number of questions about this.
In short, Google has a specification to make AJAX sites crawlable: https://developers.google.com/webmasters/ajax-crawling/docs/getting-started?hl=sv-SE
You can also look into pushState as an SEO option, as sketched below.
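As a hedged sketch of the pushState approach (the URL pattern and the loadProducts helper below are illustrative assumptions), the idea is to load content with AJAX while keeping a real, linkable URL in the address bar:

// Minimal sketch: fetch content via AJAX but give each view a real URL with the
// History API, so every product page is addressable and linkable.
// The endpoint, container ID and URL pattern are illustrative.
function loadProducts(pageNumber) {
  fetch('/products/page/' + pageNumber + '?partial=1')
    .then(function (res) { return res.text(); })
    .then(function (html) {
      document.getElementById('product-list').innerHTML = html;
    });
}

function showProductsPage(pageNumber) {
  loadProducts(pageNumber);
  history.pushState({ page: pageNumber }, '', '/products/page/' + pageNumber);
}

// Re-render the right content when the user uses back/forward.
window.addEventListener('popstate', function (event) {
  if (event.state && event.state.page) {
    loadProducts(event.state.page);
  }
});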
One tactic used to solve this is to harness the pagination feature of whatever framework or CMS you are using. Load one page of content and render pagination links in your view, then use JavaScript to hide those links and fetch the content of the linked pages via AJAX, appending it to the current page. Take a look at how Infinite Scroll works for inspiration:
http://www.infinite-scroll.com/
Basically, you need to at least output links to the pages that contain the rest of the content so that search engines can crawl it, but you can hide those links from users who have JavaScript enabled (a rough sketch of this follows at the end of this answer).
But to better answer your question, it is possible to redirect robots using htaccess:
redirect all bots using htaccess apache
But as far as I understand it, it is better for SEO to have the content, or links to it, actually available on the page.
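A rough sketch of that progressive-enhancement tactic (the selectors and markup are illustrative; the server is assumed to already render ordinary paginated pages with real links):

// Minimal sketch: the server outputs real pagination links that crawlers can
// follow; JavaScript hides them and loads the next page in place for users.
document.addEventListener('DOMContentLoaded', function () {
  var pagination = document.querySelector('.pagination');
  var productList = document.getElementById('product-list');
  if (!pagination || !productList) return;

  pagination.style.display = 'none'; // hide the crawlable links from JS users

  var nextLink = pagination.querySelector('a.next');
  var loadMore = document.createElement('button');
  loadMore.textContent = 'Load more';
  productList.after(loadMore);

  loadMore.addEventListener('click', function () {
    if (!nextLink) return;
    fetch(nextLink.href)
      .then(function (res) { return res.text(); })
      .then(function (html) {
        var doc = new DOMParser().parseFromString(html, 'text/html');
        // Append the next page's items and remember its "next" link.
        doc.querySelectorAll('#product-list > *').forEach(function (item) {
          productList.appendChild(item);
        });
        nextLink = doc.querySelector('.pagination a.next');
      });
  });
});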
I've read about Google's ability to crawl AJAX pages using a hash followed by an exclamation mark (#!) in the URL:
( http://code.google.com/web/ajaxcrawling/docs/getting-started.html )
I have changed my website accordingly, and I was wondering about Google Analytics: how do I track my visitors' AJAX requests?
Thank you in advance!
Google Analytics has a _trackPageview function that you can call when you load content via AJAX to record a pageview. There are details in this Google Analytics Help topic and undoubtedly more in the GA docs.
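A minimal sketch of that call, assuming the classic asynchronous ga.js snippet (the _gaq queue) is already installed; the function name and virtual URL are illustrative:

// Minimal sketch: queue a pageview for content loaded via AJAX using the
// classic asynchronous (_gaq) syntax. The virtual URL is illustrative.
window._gaq = window._gaq || [];

function trackAjaxView(virtualUrl) {
  // e.g. trackAjaxView('/ajax/article/42') after the new content is rendered
  _gaq.push(['_trackPageview', virtualUrl]);
}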
Google just deprecated the need for the #! scheme.
http://googlewebmastercentral.blogspot.com/2015/10/deprecating-our-ajax-crawling-scheme.html
They state:
Today, as long as you're not blocking Googlebot from crawling your JavaScript or CSS files, we are generally able to render and understand your web pages like modern browsers.
So now you won't really need to track anything different in Analytics!