My site is written in PHP and has URLs like this:
http://mysite.com/index.php?m=apple&f=show&t=hello-world
I want to change them to SEO-friendly URLs, but I don't know which one is better:
http://mysite.com/apple/hello-world
or
http://mysite.com/apple/hello-world.html
Would you help me?
I would not include the .html. It doesn't help your users at all. In general, if it is good for users, it is good for SEO.
Try to avoid extensions where possible, as they may change over time, and URIs should remain static. Think about old sites that use the .cgi extension, and then migrated to another system, such as PHP. Although HTML is likely to be around for a long time, it too may change.
See "Cool URIs don't change" for a good introduction.
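For the record, a minimal .htaccess sketch of how the extension-less form could be wired up with mod_rewrite, assuming the m/f/t parameters from the question with f fixed to "show" (adjust the pattern to your real routing):

RewriteEngine On
# /apple/hello-world -> /index.php?m=apple&f=show&t=hello-world
RewriteRule ^([^/]+)/([^/]+)/?$ index.php?m=$1&f=show&t=$2 [L,QSA]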
I have a core PHP website in which users can log in, update their profile, add friends, and so on. A user profile lives at a link like "mysite.com/userProfile.php?id=####". My client would like to make it SEO-friendly, like "mysite.com/justin".
I know we can do it by writing rules in the ".htaccess" file. But if I do that, I need to change all the places where the user profile link appears, and that is actually a very big deal, since I would need to modify nearly 250+ PHP files.
Is there any shortcut to get SEO-friendly URLs without modifying the links in the existing codebase?
Hope this makes sense.
I had the same issue! It isn't possible to make this work without some contortions (I mean that in a nice way). Editing 250+ files is not the chore it sounds like: all you need is your editor's find/replace function applied to the whole folder. It is easy and gives you the result you need.
Make sure to make backups before doing this! You will not be able to revert it, unless you keep all of the files open.
Why don't you simply find /userProfile.php?id= and replace it with /userName? It will be fast and efficient.
In order to use userNames rather than userIDs, you will have to write a small PHP function to look up the userName corresponding to a userID, plus a rewrite rule to route the pretty URL back to userProfile.php; see the sketch below.
Other than that you could probably apply trickier solutions, but I personally wouldn't recommend them!
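A minimal sketch of that idea, assuming a users table with id and username columns (the table, column, and parameter names here are hypothetical):

# .htaccess: route mysite.com/justin to userProfile.php?username=justin,
# skipping requests for real files and directories
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^([A-Za-z0-9_-]+)/?$ userProfile.php?username=$1 [L,QSA]

<?php
// Top of userProfile.php: translate the username back to the numeric id
// that the rest of the existing code expects. $pdo is an existing PDO handle.
if (isset($_GET['username'])) {
    $stmt = $pdo->prepare('SELECT id FROM users WHERE username = ?');
    $stmt->execute([$_GET['username']]);
    $_GET['id'] = $stmt->fetchColumn();
}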
So I am attached to this rather annoying project where a client's client is all nitpicky about the little things, and he's giving my guy hell, who is gladly returning the favor by following the good old rule of shoving sh*t down the chain of command.
Now my question. The application consists basically of three different mini-projects: the backend interface for the administrator, the backend interface for the client, and the frontend for everyone.
I was specifically asked to apply mod_rewrite rules to make things SEO-friendly. That was the ultimate aim, so this was basically an exercise in making things more search-friendly rather than making the links aesthetically better looking.
So I worked on the frontend, which is basically the landing page for everyone. It looks beautiful; the links are at worst followed by one slash.
My client's issue: he wants to know why the backend interfaces for the admin and user are still displaying those gigantic ugly links. And these are very, very ugly links. I am talking three to four slashes followed by various GET sequences and whatnot, so you can probably understand the complexity behind mod_rewriting something like this.
In the spur of the moment I said that I had left it the way it was to make sure the backend interface wouldn't be sniffed out by any crawlers.
But I am not sure that's necessarily true. Where do crawlers stop? When do they give up on trying to parse links? I know I can use a robots.txt file to specify rules. But, as indigenous creatures, what are their instincts?
I know this is more of a rant than anything, and I am running a very high risk of having my first question rejected :| But hey, it feels good to get this off my chest.
Cheers!
Where do crawlers stop? When do they give up on trying to parse links?
Robots.txt does not work for all bots.
You can use basic authentication or restrict access by IP to hide the back-end, provided the front-end doesn't need any of those files.
If that's not practicable, try sending 404 or 401 headers for the back-end files. But this is just an idea, no guarantee.
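A minimal sketch of the lock-down idea as an Apache 2.2-style .htaccess placed in the back-end directory (the .htpasswd path and the IP address are placeholders):

# Require both a valid login and a whitelisted IP address
AuthType Basic
AuthName "Restricted back-end"
AuthUserFile /path/to/.htpasswd
Require valid-user

Order deny,allow
Deny from all
Allow from 203.0.113.5
Satisfy All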
But, as indigenous creatures, what are their instincts?
They follow hyperlinks, and they pick up URLs from browser toolbars and browser-side, pre-activated malware-, spam- and fraud-warning features...
I am building an intensive web application, and currently all my URLs are in page.php?action=string format. Don't worry, we have a fallback plan to change all pages quickly to SEO URLs via a config file.
I want to know two things. What script is running this site:
http://lookbook.nu/ (also http://stackoverflow.com)
If you just look at it and hover over areas: crazy Ajax calls, so many subdomain calls, so many clean URLs. What would be the best approach to do this? Is this a RoR thing? All the URLs are so clean and structured. It really impressed me.
I am not looking for an .htaccess solution, as I am using nginx.
Stack Overflow actually runs on ASP.NET MVC, but you have URL rewriting built into Apache too, if that's your thing. No clue about nginx, though.
Edit: A simple Google search revealed http://wiki.nginx.org/HttpRewriteModule so you're in luck!
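For what it's worth, a minimal nginx sketch of clean URLs, assuming a PHP front controller at /index.php (the file names and the /articles scheme are assumptions, not anything lookbook.nu necessarily uses):

# Send anything that isn't a real file to the front controller,
# which can parse the path itself
location / {
    try_files $uri $uri/ /index.php?$args;
}

# Or an explicit rewrite, e.g. /articles/6 -> /articles.php?id=6
location /articles {
    rewrite ^/articles/(\d+)$ /articles.php?id=$1 last;
}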
Hi guys, I just want to ask this after hours of mod_rewrite frustration and after reading tons of questions about it on Stack Overflow, because I tried everything and it didn't work for me. I don't know why, but I'd had enough, so I looked for an alternative and am asking here today for opinions. I came up with the following method.
First, assume I have this URL:
http://www.domain.com/articles/6
and I have an articles.php page that will take the ID from this URL and pull the article content from the database (this is where mod_rewrite fails for me), so here is a little solution:
// For http://www.domain.com/articles/6, REQUEST_URI is "/articles/6", so
// explode() yields ["", "articles", "6"] and the ID sits at index 2
// (assuming the site lives at the web root)
$article_id = explode("/", $_SERVER["REQUEST_URI"]);
show_article($article_id[2]);
The show_article() function will simply take the ID and query the database for the article content. I also read that the server will not understand that articles is a PHP page, so here is a little solution for that too:
<FilesMatch "^articles$">
ForceType application/x-httpd-php
</FilesMatch>
So, two questions:
1. Will this solution affect the indexing of my website's pages by search engine spiders?
2. Is this a good solution, or is mod_rewrite better?
Note: I'm sorry if the question is not well formatted; I'm not good at formatting. If you can make it look better, I would really appreciate it. Sorry!
Don't give up on mod_rewrite; it's a bit non-intuitive but a VERY powerful and useful piece of software! You'll never get as clean a solution inside the application where URL manipulation is concerned. To your questions:
1) No, it will not affect indexing. Both your solution and the one involving mod_rewrite are transparent to web spiders.
2) mod_rewrite is definitely better.
I do recommend asking a question about the specific problems you're having with mod_rewrite not doing what you want. I'm pretty sure you'll sort it out with someone's help.
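For reference, a minimal .htaccess sketch of the mod_rewrite route for this exact URL scheme, assuming articles.php reads $_GET['id']:

RewriteEngine On
# Map /articles/6 to /articles.php?id=6 without changing the visible URL
RewriteRule ^articles/(\d+)/?$ articles.php?id=$1 [L,QSA]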
I'd like to prevent direct linking to the .zip files I offer for download on my website.
I've been reading posts for hours now, but I'm not sure which method is best to achieve that. PHP seems not to be safe, the .htaccess referrer can be empty, etc.
Which method do you guys use or would suggest?
Cheers
See: http://www.alistapart.com/articles/hotlinking/
and: http://www.webmasterworld.com/forum92/2787.htm
Referrer checking is one option, but as you noted, referrers can be empty or spoofed.
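For completeness, the classic referrer check looks something like this .htaccess sketch (example.com stands in for your domain; empty referrers are allowed, since many legitimate clients send none):

RewriteEngine On
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]
# Forbid .zip requests whose referrer is set but foreign
RewriteRule \.zip$ - [F]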
Another possibility is to set a cookie when someone visits normal pages on your site, and check for it when the person tries to download the zip file. This could be gotten around (e.g., by the hot-linker embedding an appropriate cookie-setter page as a 1x1 image alongside the hot link), but it's less likely they'll figure it out. It'll also exclude people who block cookies, of course.
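A sketch of the cookie check in PHP; the cookie name, script name, and file path are assumptions:

<?php
// On normal pages you would call: setcookie('site_visitor', '1');
// download.php: only serve the zip to visitors who browsed the site first
if (!isset($_COOKIE['site_visitor'])) {
    header('HTTP/1.1 403 Forbidden');
    exit('Please use the download page.');
}
header('Content-Type: application/zip');
header('Content-Disposition: attachment; filename="file.zip"');
readfile('/path/outside/webroot/file.zip'); // keep the real file out of the web root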
Another possibility is to generate limited-time-access URLs on the download page, something along the lines of http://example.com/download.php?file=file.zip&code=some-random-string-here. The link would only be usable for a small number of downloads and/or a short period of time, after which it would no longer function.
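A minimal sketch of the limited-time-URL idea, signing the file name and an expiry timestamp with an HMAC so nothing needs to be stored server-side (the secret and the parameter names are assumptions):

<?php
const SECRET = 'change-me'; // in practice, load from config, not source

// On the download page: build a link valid for 10 minutes
function make_link(string $file): string {
    $expires = time() + 600;
    $code = hash_hmac('sha256', $file . '|' . $expires, SECRET);
    return 'download.php?file=' . urlencode($file)
         . '&expires=' . $expires . '&code=' . $code;
}

// In download.php: verify the expiry and the code before serving the file
function link_is_valid(string $file, int $expires, string $code): bool {
    $expected = hash_hmac('sha256', $file . '|' . $expires, SECRET);
    return $expires > time() && hash_equals($expected, $code);
}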