How can I see all files and subdirectories at http://www.anywebpage.com/directory? When I visit a page like http://www.anywebpage.com/directory, it shows me only index.html, but I would like to see the tree hierarchy - the files and subdirectories. Is it possible?
If you mean:
Can I request a directory listing from arbitrary third party web servers?
Then the answer is a resounding no.
If you mean:
Can I configure my web-server to serve up a directory listing but also have a default index page that is not that listing?
Then it is possible, but I'm not aware of any web server that has such a feature built in, so you will need to write a script (or find a third-party one) that will give you those listings when you hit a different URL (one that you choose).
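For example, here is a minimal sketch of such a script in Node.js. It serves a plain-text listing at a /listing URL while leaving the rest of the site alone; the public folder name and the /listing URL are assumptions made up for this example:

// listing.js - a rough sketch, not a drop-in solution
const http = require('http');
const fs = require('fs');
const path = require('path');

const root = path.join(__dirname, 'public'); // directory to expose (assumed)

http.createServer((req, res) => {
  if (req.url === '/listing') {
    // one level deep; recurse here if you want a full tree
    const entries = fs.readdirSync(root, { withFileTypes: true })
      .map(e => e.name + (e.isDirectory() ? '/' : ''));
    res.writeHead(200, { 'Content-Type': 'text/plain' });
    res.end(entries.join('\n'));
  } else {
    res.writeHead(404);
    res.end();
  }
}).listen(8080);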
When you enter a directory, you go to the index file automatically.
If there is no index file, you will most likely see all the files in the directory (assuming the server has directory listings enabled).
I am trying to use AJAX to return high-level calculation results. However, it appears as though the PHP file that I am pointing the AJAX call to must be in the '/' root directory. Here is a snapshot of my structure: c:\webserver\test\webroot (this is the root directory), and c:\webserver\test\code is where the HTML and PHP files are stored. I am looking to use an AJAX call to point to a PHP file in that non-root folder (one folder up, then one folder down). I tried a few different things, such as '../code' to move up and then over, but that doesn't work. Any suggestions?
Referencing a file location via relative paths should work - you'll just need to be sure that you are navigating FROM the correct folder TO the correct folder.
This can be a bit tricky if your JavaScript file is located in another folder - note that a relative URL in an AJAX call is resolved against the URL of the page the script runs in, not against the folder containing the JavaScript file.
Alternately, you can use a fully qualified URL, e.g. http://example.com/folder/folder/file.php, to reference the file. Note that if you are on localhost, you can use a hosts file to point such a hostname at your local web server.
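Also bear in mind that the web server can only serve files under its document root, so a file in c:\webserver\test\code is only reachable by URL if that folder is moved or mapped under c:\webserver\test\webroot. As a minimal sketch, assuming the PHP file ends up reachable at /code/calc.php (both the path and the file name are made up for this example):

// a root-relative URL is resolved against the site root,
// regardless of where this JavaScript file lives
var xhr = new XMLHttpRequest();
xhr.open('GET', '/code/calc.php'); // assumed location under the webroot
xhr.onload = function () {
  console.log(xhr.responseText); // calculation results returned by PHP
};
xhr.send();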
I'm working on a simple bot for a project, and I noticed that a lot of sites do not list sitemaps in their robots.txt files. There is of course the option to simply crawl the sites in question and visit all possible pages, but that often takes much more time than simply downloading the sitemap.
What is the best way to detect sitemap if it is not mentioned in robots.txt?
Normally it should be placed in the root directory of the domain, like xydomain.xyz/sitemap.xml.
I would only add the sitemap to the robots.txt file if it is placed elsewhere. If a site uses more than one sitemap located in another place, they should be listed in a sitemap index file.
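If your bot has to discover the sitemap itself, one approach is simply to probe the conventional locations. A rough sketch in Node.js (the list of candidate paths is an assumption based on common practice, not part of any standard):

// probe common sitemap locations on a host
const https = require('https');

const candidates = ['/sitemap.xml', '/sitemap_index.xml', '/sitemap.xml.gz'];

function probe(host, path) {
  return new Promise(resolve => {
    const req = https.request({ host, path, method: 'HEAD' }, res => {
      resolve(res.statusCode === 200);
    });
    req.on('error', () => resolve(false));
    req.end();
  });
}

(async () => {
  for (const p of candidates) {
    if (await probe('example.com', p)) { // host assumed for the example
      console.log('found sitemap at', p);
      break;
    }
  }
})();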
You can use this online tool to scan your site and create a bespoke sitemap.xml file for your site.
To help your sitemap to be discovered through robots.txt, add the URL of your sitemap at the very top of your robots.txt file (see the example below).
So, the robots.txt file looks like this:
Sitemap: http://www.example.com/sitemap.xml
User-agent: *
Disallow:
I want to prevent user access to my "~/Content/..." folder. I wrote the following in "Global.asax.cs" and put this line of code above all the other routes:
routes.IgnoreRoute("Content/{*pathInfo}");
but it does not work. In fact, users can see every file in the Content folder by typing the URL into the browser.
Am I missing something?
How did you figure out that it does not work? Give an example.
You may have put it last in the routing table, so try to move it up so that it gets added to the routing table first. The route collection is an ordered list of routes.
Also try this: routes.IgnoreRoute("Content/");, but your version of the ignore route is also correct and should work.
Lastly, I do not know what you mean when you say the user can see all the contents of the Content folder: isn't that the point? Users must be able to download files from the folder; we usually just need MVC to ignore those requests so that IIS can serve the files directly.
Or did you mean that directory browsing is enabled and you want to disable it? In that case, go to IIS Manager, select your website, look for the Directory Browsing option and disable it, as shown here.
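For reference, the same thing can be done from the site's web.config; a minimal sketch, assuming IIS 7 or later:

<configuration>
  <system.webServer>
    <!-- turn off automatic directory listings -->
    <directoryBrowse enabled="false" />
  </system.webServer>
</configuration>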
Your problem cannot be solved by routing constraints. There are 3 significant steps in processing the request:
IIS receives the request.
IIS looks at the filesystem and searches for a file that directly corresponds to the URL.
If IIS did not find any file, it hands the request to ASP.NET MVC for processing.
So, you need to configure folder security to forbid direct access to the files while still allowing access from the application, as here.
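As a sketch of that idea, you could drop a web.config like this into the folder you want to protect (whether ASP.NET actually gets to authorize requests for static files depends on your IIS pipeline configuration, so treat this as a starting point rather than a guaranteed fix):

<configuration>
  <system.web>
    <authorization>
      <!-- deny all direct requests to files in this folder -->
      <deny users="*" />
    </authorization>
  </system.web>
</configuration>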
But I don't recommend securing a folder that should be shared. I don't believe that your site has no images to display :) If you have some secured content, you need to create another folder for it.
Is there any way I could make my web links point to local files (relative paths)?
Once a month I upload a new file to the server, and then I would like to add a new "web link" in a specified category (or whatever else) pointing to this file. Then I have a WebLinks module listing the last 5 added files on the main page.
The problem is that WebLinks doesn't allow me to set a URL as a relative path, and using an absolute path is not possible.
Any suggestions?
There are several extensions meant for handling files. Depending on what you are trying to do, any one of these may work for what you need -
Phoca download - http://extensions.joomla.org/extensions/directory-a-documentation/downloads/5551
jDownloads - http://extensions.joomla.org/extensions/directory-a-documentation/downloads/2849
Docman - http://extensions.joomla.org/extensions/directory-a-documentation/downloads/10958
Remository - http://extensions.joomla.org/extensions/directory-a-documentation/downloads/83
I want to have a sitemap structure where the sitemap index file is located in the root path (example.com/sitemaps.xml) and references several sitemap[n].xml files located in a folder (example.com/static/sitemap1.xml). Those sitemap[n].xml files link to webpages that are in the root path (like example.com/helloworld.html).
Is that possible? I'm asking because I know that if a sitemap.xml file is placed within a folder, it can only contain webpages that are under that folder.
Thanks!
I believe you can easily have
example.com/sitemap-index.xml
point to e.g.
example.com/sub1/sitemap.xml
and
example.com/sub2/sitemap.xml
however, each sitemap.xml should only contain URLs within its own subfolder. (From your question, it seems you have those sitemap.xml files link to paths in the root. I doubt that works, but you could run a small test and submit it to Google. If there are no errors, then...)
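For illustration, a sitemap index at the root pointing to per-folder sitemaps might look like this (the URLs are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://example.com/sub1/sitemap.xml</loc>
  </sitemap>
  <sitemap>
    <loc>http://example.com/sub2/sitemap.xml</loc>
  </sitemap>
</sitemapindex>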
The location of a Sitemap file determines the set of URLs that can be included in that Sitemap. A Sitemap file located at http://example.com/catalog/sitemap.xml can include any URLs starting with http://example.com/catalog/ but can not include URLs starting with http://example.com/images/.
From Google's perspective, the sitemap should be available at the main root of the website, e.g. http://example.com/sitemap.xml. When you submit it through a subdirectory in Webmaster Tools (e.g. "http://example.com/catalog/sitemap.xml"), Google won't crawl it and will keep showing a pending index status.