Is it possible to convert the current page to AMP - Magento

I'm using this extension for creating AMP pages for our Magento e-commerce website.
In that extension, they create separate pages for AMP (home page, category page, product page). We felt that was extra work.
Is it possible to convert the current page to AMP (without any modification)?

I don't think it's possible without any modification unless you use some third-party apps. You may check this Convert HTML to AMP tutorial. Note that it is strongly recommended that you use HTTPS in production environments; HTTPS has several benefits beyond security, including SEO. You can read more about this topic in this Google Webmaster blog post. Also, from that page: if you use WordPress, all you have to do is download the official AMP WordPress plugin.
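As a rough illustration of what "converting" an existing page involves, here is a toy sketch of the kind of rewriting the tutorial above walks through: author scripts are removed, <img> becomes <amp-img>, and custom CSS is supposed to live in a single <style amp-custom> block. All names here are hypothetical, and a real conversion must follow the full AMP HTML spec and add the required AMP boilerplate; this is nowhere near a complete converter.

```typescript
// Toy HTML-to-AMP rewrite sketch (illustrative only, not a real converter).
function toAmpSketch(html: string): string {
  return html
    // AMP disallows author <script> tags (only the AMP runtime and components are allowed).
    .replace(/<script\b[\s\S]*?<\/script>/gi, '')
    // Images become <amp-img> with an explicit layout; amp-img also needs width/height.
    .replace(/<img\b([^>]*?)\s*\/?>/gi, '<amp-img$1 layout="responsive"></amp-img>')
    // Custom CSS belongs in one <style amp-custom> block; a real converter would
    // collect inline styles there instead of just dropping them as done here.
    .replace(/\sstyle="[^"]*"/gi, '');
}

console.log(toAmpSketch('<img src="hero.jpg" width="600" height="400" style="border:0">'));
// -> <amp-img src="hero.jpg" width="600" height="400" layout="responsive"></amp-img>
```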

Related

How to use Google AMP Cache for my AMP Website

How do I use or set up the Google AMP Cache for my AMP website? I have seen many sites serving their pages from the google.com server. How is that possible for me?
To make Google serve your pages as AMP pages, the pages must be rewritten/modified to fit the AMP requirements.
AMP pages are stripped-down versions of web pages, with some restrictions compared to ordinary websites. The HTML markup is slightly different, and there are other restrictions, such as custom CSS being limited to 50KB. Those pages are hosted separately, and once Google has crawled them, they may be included in the AMP mobile search results.
See the AMP documentation for details on how to implement it.
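For the cache part of the question: once a valid AMP page has been crawled, Google can serve it from the AMP Cache under a URL derived from your origin. Below is a small sketch of that URL scheme as described in the AMP documentation; verify against the current docs, since the full algorithm also covers IDN hostnames and overly long domains.

```typescript
// Sketch of building a Google AMP Cache URL from an HTTPS origin URL.
function ampCacheUrl(originUrl: string): string {
  const url = new URL(originUrl);
  // The cache subdomain is the origin host with "-" doubled and "." turned into "-",
  // e.g. example.com -> example-com.cdn.ampproject.org
  const cacheSubdomain = url.hostname.replace(/-/g, '--').replace(/\./g, '-');
  // "c" = HTML content, "s" = the origin is served over HTTPS.
  return `https://${cacheSubdomain}.cdn.ampproject.org/c/s/${url.hostname}${url.pathname}${url.search}`;
}

console.log(ampCacheUrl('https://example.com/product.amp.html'));
// -> https://example-com.cdn.ampproject.org/c/s/example.com/product.amp.html
```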

SEO with AngularJS and ASP.NET RESTful service

I have developed a website using AngularJS and Web API.
The problem is that the Ajax-rendered content is not crawlable by Google, and no one can find the website using Google Search.
After reading many articles regarding this issue, including this one with all its outgoing explanation links, the Google Ajax crawling protocol, and a Stack Overflow question, I couldn't find a proper solution. The ones that mention ASP.NET solutions talk about MVC, while I only need simple REST via Web API; the other articles don't talk about ASP.NET at all.
Is there any simple explanation?
I'm the one who asked this same question long ago, so I will answer from my experience:
Firstly, if all your content is accessible via unique URIs (including the hashbang, if you use it), modern search engines should index it just fine. In fact, Google can index JavaScript-generated content now. You can try that via the Google Webmaster tools and see how your site is indexed.
Secondly, there are libraries that help you serve pre-rendered content to search engines if you need to, but in my case I didn't bother much with that, since Google is indexing the JS content nicely.
I've seen others ask this question, and maybe I'm missing something or this is outdated, but I don't see why AngularJS needs to be an issue with SEO.
Say you have a landing page and it has a bunch of links. Assuming you're using html5 mode in AngularJS (and I'm not sure that's 100% necessary) and something like ngRoute, then the links on the landing page can work both as "Angular" (JavaScript) links and "old school" (full page load) links.
If you're a human user you can click a link and it will do angular magic and adjust the content without loading the full page. Ok, all fine.
But if you instead copy the link and paste it in a new tab or new browser, it will still work - assuming you've set up routes correctly.
I'm not an SEO expert by any stretch of the imagination, but as I understand it, having links that load pages and having those pages have real and useful content is the core of SEO, and done this way, AngularJS should work fine. The key thing to check is if you copy and paste the link (not just click it) that it works.
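As a concrete illustration of the html5 mode + ngRoute setup described above, here is a minimal AngularJS 1.x configuration sketch; the module, route, controller, and template names are hypothetical.

```typescript
// Declared as `any` so the snippet stands alone without the @types/angular package.
declare const angular: any;

angular.module('app', ['ngRoute']).config([
  '$locationProvider', '$routeProvider',
  function ($locationProvider: any, $routeProvider: any) {
    // html5Mode(true) gives clean URLs like /products/42 instead of #/products/42,
    // so the same URL works as a full page load (for bots and pasted links) and as
    // an in-app route (for clicks). It needs a <base href="/"> tag and server
    // rewrites that return the app shell for unknown paths.
    $locationProvider.html5Mode(true);

    $routeProvider
      .when('/products/:id', {
        templateUrl: 'views/product.html',
        controller: 'ProductCtrl',
      })
      .otherwise({ redirectTo: '/' });
  },
]);
```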

Google crawl ajax / dynamically generated content - SEO

I've got a very unique situation that I don't believe any of the other topics here relate to.
I have an e-commerce module that is dynamically loaded/embedded into third-party sites: no iframe, just straight JSON to the web client, rendered into content. I have no access to these third-party sites at all, other than my JavaScript file being loaded from their page and dynamically generating the content.
I'm aware of the #! method, but that's no good here; my JS does generate "URLs" within the embedded platform, but they're fake, for the address bar only, and I don't believe Google's crawlers can reach that far.
So my question is: is there a meta tag we can set to point outside the URL, i.e. back to my server with static crawlable content? Say, pointing the canonical to my server... but again, I don't think that would work.
If you implement #!, you have to make sure the URL you're embedded in supports the escaped-fragment parameter versions, which you probably can't do; it's server-side stuff.
You probably can't influence the canonical tag of the page either; that again has to be done server-side. Any meta tag you set via JavaScript will not be seen by a bot.
Disqus solved the problem by providing an API so the embedding websites can get their comments server-side and render them in plain HTML. WordPress has a plugin to do this. Disqus is also one of the few systems whose Ajax pages Google has worked out how to crawl.
Some plugins ask people to also include a plain link alongside the JavaScript. Be careful with this, as you may break Google's guidelines if you do it wrong. But you may be able to integrate the plain link with your plugin so that it directs bots and users to a crawlable version of the content.
Look into Google's crawlable Ajax standard (and why it's a bad idea) and canonical URLs.
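As an illustration of the "plain link alongside the JavaScript" idea above, here is a hypothetical widget bootstrap sketch; none of the markup, URLs, or IDs come from the original question.

```typescript
// The embedding page would include a normal anchor plus the widget script, e.g.:
//
//   <a id="shop-widget" href="https://widgets.example.com/store/123">Browse our store</a>
//   <script async src="https://widgets.example.com/embed.js"></script>
//
// Bots that don't execute JavaScript simply follow the link to a server-rendered,
// crawlable page; browsers that do execute it swap the link for the dynamic module.
const placeholder = document.getElementById('shop-widget') as HTMLAnchorElement | null;

if (placeholder) {
  // Reuse the crawlable URL as the data endpoint for the dynamic version.
  fetch(placeholder.href, { headers: { Accept: 'application/json' } })
    .then((res) => res.json())
    .then((data: { html: string }) => {
      const container = document.createElement('div');
      container.innerHTML = data.html; // replace the plain link with rich content
      placeholder.replaceWith(container);
    })
    .catch(() => {
      // On failure the plain link stays in place, so nothing is lost.
    });
}
```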
Now you can actually do this. A complete guide and examples can be found here: https://github.com/kubrickology/Logical-escaped_fragment

WordPress blog excerpt performance

I don't know why this page of my website takes so much time to load:
michelepierri.it/blog
This page contains the blog's post excerpts.
Other pages, like the home page, take less time to load.
What could be causing this?
Thanks a lot.
Plugins I use:
Advanced Code Editor
All in One SEO Pack
Better Related Content
cbnet Twitter Widget
CloudFlare
Contact Form 7
Default Thumbnail Plus
Developer Formatter
Disqus Comment System
Fancybox
Fast Secure Contact Form
FeedBurner FeedSmith Extend
Google Analytics
Google XML Sitemaps
lorem shortcode
NextScripts: Social Networks Auto-Poster
Official StatCounter Plugin
Pingler
Really Simple CAPTCHA
Shareaholic | email, bookmark, share buttons
Simple Skype Status
Single Category Permalink
Skype Online Status
Social Metrics
SyntaxHighlighter Plus
Transposh Filtro per Traduzioni
Trash Manager
W3 Total Cache
WP-Cumulus
WP-o-Matic
WP Facebook Open Graph protocol
WP Minify
WP to Twitter
Youtube shortcode
Why so many plugins? That alone can impact your performance a fair amount. The first request to your site is taking a long time, which suggests a server-side issue (more likely if you're on shared hosting with that number of plugins active).

How to improve SEO while using MVC with AJAX

I am developing a site using ASP.NET MVC. I have used Ajax for paging, sorting, filling dropdowns, showing different content on link clicks, etc. I have learned that Ajax calls work against SEO, and SEO is very important for my site.
Please suggest ways to improve the SEO of my site without removing Ajax.
You should have been using progressive enhancement from the start. Your site should work without JavaScript/Ajax being enabled, so that all users, including search engines, can get to your content. Then you enhance the experience by adding JavaScript and Ajax on top.
At this point you don't have a lot of options. You'll need to go back and make that content available without JavaScript. You could also use Google's crawlable Ajax standard, but that only works for Google; for slightly more work you can make every search engine, and every user, able to reach your content by not requiring JavaScript to use your website.
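For example, a paging link can stay a normal server-rendered link and only be upgraded when JavaScript is available. A minimal sketch of that idea follows; the selectors, IDs, and header name are illustrative, not from the original answer.

```typescript
// The paging links are ordinary server-rendered <a href="?page=2"> links, so
// crawlers and no-JS users get full page loads; this script merely upgrades
// them to Ajax when it runs.
document.querySelectorAll<HTMLAnchorElement>('a.page-link').forEach((link) => {
  link.addEventListener('click', async (event) => {
    event.preventDefault();

    // Assumes the server returns just the list fragment for Ajax requests.
    const response = await fetch(link.href, {
      headers: { 'X-Requested-With': 'XMLHttpRequest' },
    });
    const fragment = await response.text();

    const list = document.getElementById('product-list');
    if (list) {
      list.innerHTML = fragment;
      history.pushState(null, '', link.href); // keep the URL shareable and crawlable
    }
  });
});
```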
