Google not indexing URLs from sitemap

Hello, I have a personal site and about a month ago I rebuilt it completely. I submitted a new sitemap.xml file and it still isn't indexed, but I'm getting 404 crawler errors for the old URLs.
Google says the sitemap is correct, so any ideas? Do I need to do something, or just wait longer?
It's not really important because it's just a personal site, but I'm curious about why this is happening.
Sorry for my bad English, I'm Spanish. Thanks in advance.

This just takes time.
I have seen some speed-up in that process by using other Google services such as Places or Analytics. But to answer your question:
If your sitemap has been detected correctly it will work, it just might take a while.

Related

Robots.txt in Magento Websites

I have recently started working with a company that has a Magento eCommerce website.
We spotted that the traffic dipped considerably in May, and so did the ranking on Google.
When I started investigating, I saw that the pages of the ES website were not appearing in Screaming Frog.
Only the homepage showed, and the status said blocked by robots.txt.
I mentioned this to my developer and they said they would move the robots.txt file to the /pub folder,
but would that not mean the file is in two places? Would this be an issue?
The developer has gone ahead and done this; how long should it take to see if Screaming Frog is crawling the pages?
Are there any Magento developers who could help with advice on this?
Thanks
Neo
There is a documentation page for how to manage robots.txt with Magento 2.x.
And you can use this to allow all traffic to your site:
User-agent: *
Disallow:
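If you want to verify which URLs your live file actually blocks, here is a minimal sketch using Python's standard urllib.robotparser (the domain is a placeholder, substitute your own):

import urllib.robotparser

# Point the parser at the live robots.txt (placeholder domain).
rp = urllib.robotparser.RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")
rp.read()

# Ask whether a given crawler may fetch a given page.
print(rp.can_fetch("Googlebot", "https://www.example.com/category/product"))
print(rp.can_fetch("Screaming Frog SEO Spider", "https://www.example.com/category/product"))

If can_fetch returns False for a page that should be public, the rules in the file are still blocking it.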
Regarding the Googlebot crawl rate, here is some explanation of it.
According to Google, "crawling and indexing are processes which can take some time and which rely on many factors." It's estimated that it takes anywhere between a few days and four weeks before Googlebot indexes a new site. If you have an older website that isn't being crawled, the design of the site may be the problem. Sometimes sites are temporarily unavailable when Google attempts to crawl them, so checking the crawl stats and looking at errors can help you make the changes needed to get your site crawled. Google also has two different crawlers: a desktop crawler to simulate a user on a desktop, and a mobile crawler to simulate a user on a mobile device.

How can I measure website performance manually with code? Any ideas?

I have to develop an application to check the performance of a website.
I need some guidance on how to do this with code. I have searched a lot, but Google only returns ready-made tools for doing it. Does anyone have any idea about this? Suggest me a path to work on; help will be appreciated. Waiting for your responses.
I have seen that many sites give results based on the pages of our websites, so they might be getting the pages from the sitemap; that is my point of view, I don't know the rest. I think they might be fetching each page and logging the load time and response time, but I still don't know how to log those things. I need your guidance to proceed further. Thank you.
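The thread has no answer, but the approach described above, fetching each page listed in the sitemap and logging its response time, can be sketched in a few lines. Here is a minimal illustration in Python using the requests library; the sitemap URL is a placeholder, and note that response.elapsed measures time to the response headers, not a full browser page load:

import xml.etree.ElementTree as ET
import requests

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

# Pull the list of page URLs out of the sitemap.
root = ET.fromstring(requests.get(SITEMAP_URL, timeout=10).content)
urls = [loc.text for loc in root.findall(".//sm:loc", NS)]

# Fetch each page and log status code and server response time.
for url in urls:
    response = requests.get(url, timeout=30)
    print(f"{url}: {response.status_code} in {response.elapsed.total_seconds():.2f}s")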

Can't scrape particular websites using Scrapinghub

I am using the autoscraping feature of the Scrapinghub service.
While building and deploying the autoscraper, I found that the site I wanted to scrape would never return any requests and would time out after around 3.5 minutes.
So, I began reading the documentation to see if I could figure out why this was happening (How to check if site is suitable for autoscraping).
I followed the steps and temporarily disabled JavaScript in my browser (Chrome), and found that I had no problem viewing the site I wanted to scrape.
My question is, at the risk of sounding vague, what might be some other reasons that a site is not scrapeable, aside from JavaScript? Are there other ways to diagnose a problem like this?
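One frequent culprit besides JavaScript is that the site filters on the User-Agent header or blocks non-browser clients outright. A quick way to test for that is to compare responses with and without a browser-like User-Agent; here is a minimal sketch in Python with the requests library (the URL is a placeholder):

import requests

URL = "https://www.example.com/"  # placeholder for the site you want to scrape

# Compare how the site answers a browser-like client versus a bare HTTP client.
for label, headers in [
    ("browser UA", {"User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)"}),
    ("default UA", {}),
]:
    response = requests.get(URL, headers=headers, timeout=15)
    print(f"{label}: HTTP {response.status_code}, {len(response.text)} bytes")

If the two responses differ sharply, user-agent filtering rather than JavaScript is the likely blocker; rate limiting and IP-based blocking are worth ruling out the same way.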

Google webmaster tools: sitemaps submitted every day (?)

Having sent my sitemap.xml normally the first time through Webmaster Tools, I notice submitted-URL plots every day (beside the indexed ones, under the Optimization -> Sitemaps menu) without doing anything on my own. I use Drupal 7 with the XML sitemap module (http://drupal.org/project/xmlsitemap) and there are no automated tasks enabled.
Does this mean that URLs are submitted "internally" by Google every day? Or is there something wrong that I need to resolve?
Many thanks for your help.
Google will remember any sitemaps you submit and their crawler will automatically download those and associated resources more or less whenever it feels like doing so. This is usually reflected in your Webmaster Tools. In all likelihood it'll even do so without you entering your sitemap on their website if your site gets linked to. Same goes for pretty much any other bot and crawler out in the wild.
No need to worry, everything is doing what it's supposed to. It's a Good Thing(tm) when Google crawls your site frequently :).

Google in-page analytics doesn't work in my ASP.NET MVC 3 Razor website

We've recently launched a new website http://atlascode.com and since the launch I've been unable to get In-Page Analytics working on the website. Google also claims that my tracking code is not working, but I think this is incorrect.
Whenever I attempt to load in-page analytics I receive the error:
We've identified problems in your setup. These may cause problems loading In-Page Analytics.
Your site doesn't load ga.js from Google.
If you host the Google tracking code on your own servers, it isn't updated automatically and can miss important changes.
We didn't find a tracking snippet on your site. In-Page Analytics cannot load. Please make sure you have tracking installed correctly. If your snippet is included in a separate JavaScript file, you'll have to manually check it is being loaded correctly.
-ENDS-
I've simply copied and pasted the tracking code onto the website and haven't done anything out of the ordinary. I've also checked to make sure that, under Web Property Settings, my web property name and default URL are atlascode.com.
Any ideas you guys have really would be welcomed.
EDIT: Added screenshot of Google Analytics error http://min.us/mdqlrhj
Thanks in advance
Simon
Well, there are a whole bunch of people on the web complaining about the same issue.
I've noticed something funny.
Many developers exclude the Analytics tracking code for logged-in administrators and then try to check out In-Page Analytics while they're logged in, so there's really no ga.js on the page at all.
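One quick way to confirm that diagnosis is to fetch the page as an anonymous visitor (no admin session) and search the served HTML for the ga.js reference; a minimal sketch in Python using the requests library:

import requests

# Fetch the page without any login cookies, as Google would see it.
html = requests.get("http://atlascode.com", timeout=15).text

# The classic snippet loads ga.js from Google's servers.
print("ga.js referenced:", "google-analytics.com/ga.js" in html)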
In my experience, this occurred when I hadn't set my default URL to exactly match the URL set in the profile.
Matt
P.S. Someone beat me to your source!
Got the same problem on Magento Enterprise, but the solution was pretty simple: the GA code just needs to be placed before the closing </head> tag. After this simple fix, In-Page tracking works perfectly.
Update
Also, be sure you have no framekiller script installed on your site.
