I created a health-related site based on Joomla.
I created a Google Webmasters account and saw how Google indexed my site.
But then I saw that Google had indexed a directory, /includes, that I didn't create, and whose content was far from health topics.
I deleted the directory, but for two months now Google Webmaster Tools has been showing top keywords from the /includes directory.
What should I do to tell Google that this directory no longer exists?
Thank you in advance. In case you need it, here is the address of the site: http://healthfount.com
You need to create a robots.txt file for your site and disallow the deleted directory in it. You can also request removal of those URLs with the URL removal tool in Google Webmaster Tools, which is faster than waiting for the index to refresh on its own.
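As a minimal sketch, assuming the unwanted directory really was /includes, a robots.txt placed in your document root would look like this:

```
User-agent: *
Disallow: /includes/
```

Note that this only stops polite bots from crawling the directory; keywords already in the index can take a while to disappear, which is why the removal tool helps.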
Our two products, www.nbook.in and www.a4auto.com, have been removed from all search engines. These projects have sub-domains, and the links from the subdomains are still available. The domain is not blacklisted. A custom sitemap was created, and it is still being indexed even now. I analyzed the URL with Google Search Console and it seems fine. 'site:nbook.in' in Google didn't produce any results, even though we had more than 75,000 links in Google; it was working fine until last week. This is affecting our products' Alexa rank and reach. There is no issue with the robots.txt file in the server document root, and no rule is set to deny any bot or user agent. It is not just Google: all search engines have removed our links, which is what confuses me. We have other products designed and developed the same way with absolutely no issue. I think there is nothing more I can do with Google Search Console/Webmaster Tools. I'm ready to provide more data on request. Please help.
Create a robots.txt in your document root and make sure it allows crawling; an empty Disallow value permits everything:

User-agent: *
Disallow:

Be careful not to write Disallow: / instead. That single slash blocks the entire site, which would explain pages dropping out of every search engine at once.
I'm currently trying to upload an HTML+CSS page to my hosting service (1&1 Internet).
Unfortunately, I have very little experience with that, so I would like to ask this question:
How is it possible to update the existing homepage on a site that has already been uploaded to the web?
I'm using WebspaceExplorer. When I enter it (Hosting section), there are a lot of folders and files, with a few options like download, delete, and so on.
I can't figure out how to tell the server which file should be the homepage. The page I uploaded is online and accessible, but it is not the first page the user sees. How can I change that?
It may be confusing because of the poor description, so here are the URLs of both:
velten-berlin.org (current Homepage)
velten-berlin.org/Upload (desired homepage)
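In case it helps: most shared hosts (1&1 included, as far as I know) run Apache, which serves a file named index.html from the document root as the homepage. So the usual fix is to move your new page out of /Upload into the root folder and name it index.html. If your host honors .htaccess files, you can instead tell Apache which file to use; this is a sketch, assuming .htaccess overrides are enabled on your plan:

```
# .htaccess in the document root; serve this file as the default page
DirectoryIndex index.html
```

The filename on the DirectoryIndex line is whatever your actual homepage file is called.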
Some weeks ago, we noticed someone on our site requesting the robots.txt file:
http://www.ourdomain.com/robots.txt
I've done some research, and it said that robots.txt sets the permissions for search engine crawlers?
I'm not certain of that...
The reason I'm asking is that he tried to access that file again today...
The thing is, we do not have this file on our website... So why is someone trying to access it? Is it dangerous? Should we be worried?
We have tracked the IP address: it now resolves to Texas, and some weeks ago it was in Venezuela... Is he using a VPN? Is this a bot?
Can someone explain what this file does and why someone is trying to access it?
In a robots.txt (a simple text file) you can specify which URLs of your site should not be crawled by bots (like search engine crawlers).
The location of this file is fixed so that bots always know where to find the rules: the file named robots.txt has to be placed in the document root of your host. For example, when your site is http://example.com/blog, the robots.txt must be accessible from http://example.com/robots.txt.
Polite bots will always check this file before trying to access your pages; impolite bots will ignore it.
If you don’t provide a robots.txt, polite bots assume that they are allowed to crawl everything. To get rid of the 404s, use this robots.txt (which says the same: all bots are allowed to crawl everything):
User-agent: *
Disallow:
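The check a polite bot performs can be sketched with Python's standard library (the bot name "MyBot" and the URL are made up for illustration; the rules mirror the allow-everything file above):

```python
from urllib.robotparser import RobotFileParser

# Rules equivalent to the allow-everything robots.txt above.
allow_all = RobotFileParser()
allow_all.parse([
    "User-agent: *",
    "Disallow:",
])

# Rules that block the whole site instead.
block_all = RobotFileParser()
block_all.parse([
    "User-agent: *",
    "Disallow: /",
])

# A polite bot asks before fetching any URL.
url = "http://example.com/blog/post-1"
print(allow_all.can_fetch("MyBot", url))   # True: crawling is allowed
print(block_all.can_fetch("MyBot", url))   # False: crawling is blocked
```

An impolite bot simply skips this check, which is why robots.txt is a convention, not an access control.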
First, I created a sitemap.xml file for my website and uploaded it to the domain root folder; second, I verified my website in Google's Webmaster Tools.
But the sitemap is not showing on my website. Please tell me why.
A sitemap won't be displayed on your site just because the XML file sits in the root folder; Joomla doesn't render it as a page.
You need to install a component to help you create and publish it. A very good one is Xmap.
I have been developing a web page, directorioelectronico.com, and I am having some issues now; I would be very grateful if someone could help me.
The web page has been submitted to Google, and now all the links on the homepage are listed in the search results, BUT some links, e.g. google.com/maps, are not listed, because they only appear after you select your municipality. How can I tell Google that they exist, maybe without a sitemap.xml, since my links have metadata that is very important for Google to know?
Thank you very much in advance for your help.
My solution was to create a route, /sitemap.xml, and reference it in robots.txt; I also created an HTML page with the full sitemap of cities, companies, and all important links.
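For reference, that setup can be sketched like this (the municipality URL is a placeholder, not a real page of the site):

```
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://directorioelectronico.com/</loc>
  </url>
  <url>
    <loc>http://directorioelectronico.com/some-municipality</loc>
  </url>
</urlset>
```

and one extra line in robots.txt so crawlers can find it:

```
Sitemap: http://directorioelectronico.com/sitemap.xml
```

This lets Google discover the municipality pages that are never linked directly from the homepage.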