WWW or not WWW, what to choose as primary site name? [closed] - domain-name

From a technical perspective the only issue is traffic and incoming links (one of the two names should redirect to the other).
Now I need to choose which one should be primary. Some sites use www (google, microsoft, ruby-lang) and some do not (stackoverflow, github). It seems to me the newer ones skip the www.
Which should I choose?
Please include some explanation.
UPDATE: This is a programming-related question. The site is actually aimed at programmers, so I want to see what techie people think.
UPDATE: The site without www is the clear winner. Thank you, guys!

It doesn't matter which you choose, but you should pick one and be consistent. It is mostly a matter of style, but it is important to note that search engines consider these two URLs to be different sites:
http://www.example.com
http://example.com
So whichever you choose for aesthetic reasons should be consistently used for SEO reasons.
Edit: My personal opinion is to forgo the www as it feels archaic to me. I also like shorter URLs. If it were up to me I would redirect all traffic from www.example.com to example.com.

Don't use WWW. It's an unnecessary tongue-twister, and a pain in the arse for graphic designers.

There are some issues you should consider. See, for example, Use Cookie-free Domains for Components for a cookie-related issue.
But regardless of how you decide: use just one of those domains as your canonical domain name and use a 301 redirect to correct the other. On an Apache web server, you can use mod_rewrite to do that, as in the sketch below.
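A minimal sketch of that redirect in .htaccess, assuming example.com is the name you keep (swap in your own domain):
# send www.example.com permanently to example.com
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ http://example.com/$1 [R=301,L]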

Configure both, obviously. I would make the www redirect to the normal URL, as it only exists to make the people who habitually type it at the beginning of every address happy anyway. Just don't, whatever you do, require the www to be typed manually. Ever.

It depends on your audience, I think. A non-technical audience will assume that the www is there, whereas a technical audience will not instinctively expect it, and will appreciate the shorter URLs.
(Edit: I recently set up a domain for my family to use, including webmail. My wife asked what the address of the webmail was. I said "mail.ourdomain.com". She typed "www.mail.ourdomain.com".)
Either way, make sure the one you don't use does a clean 301 redirect to the one you do use; then neither users nor search engines will need to care.

One aspect of this question deals with CDNs and some web hosts (e.g. Google Sites). Such hosts require that you add a CNAME record for your site name that points to the host's servers. However, due to the way DNS is designed, CNAME records cannot coexist with other records for the same name, such as NS or SOA records. So you cannot add a CNAME for your bare example.com name, and must instead add the CNAME for a subdomain. Of course, people normally choose "www" for that subdomain.
Despite this technical limitation, I prefer to omit the www on my sites where possible.

I'd redirect to the version without www. In Apache 2.x:
RewriteEngine On
RewriteBase /
RewriteCond %{HTTP_HOST} ^www\.yourdomain\.com$ [NC]
RewriteRule ^(.*)$ http://yourdomain.com/$1 [R=301,L]
I think the www is meaningless; we all know we're on the World Wide Web. It would be much better to use subdomains for load balancing or for device-specific sites (like m.google.com for mobile, for example, even though there is a .mobi top-level domain now).

www is conventionally used as a subdomain for the website within the main domain.
http://no-www.org/ is trying to get it deprecated,
although http://www.w3.org/ includes the www.
Both of those sites are worth checking.
It seems to have become a matter of taste, almost a religious issue, rather than a standard. Whatever you choose, make sure you register or redirect the www variant, since shortcuts like Ctrl+Enter fill in the www automatically.

Will you have other subdomains? If so, using the www may make more sense to my mind, since some sites have various subdomains for other purposes, like a store or internationalization subdomains.

I normally go with www.sitename.com because it makes explicit that this is the main part of the site. testing.sitename.com is for testing; house.sitename.com is my home PC. I like being explicit; however, I do not mind when sites do not use www. I am not a purist. :)

Go without the www. The general rationale is that since you are typing an address into a web browser, it is already implicit that you are accessing a website (what else would you do with a browser?), so the extra www is redundant.
To be specific, when a server receives an HTTP request, it already knows the user wants the website. The web browser adds the http:// scheme implicitly, so the user only needs to worry about the address. The same goes for other services: if you host FTP, pointing an FTP client at the bare domain, without the ftp. prefix, should be enough.
If I understand correctly, the reasons for the separate www., ftp., and similar subdomains are mostly historical and no longer relevant these days, since traffic is simply directed to the correct server/service; the redundant prefixes have just stuck because of their popularity.

I always make the non-www variant redirect to www and refer to the site as www.mysite. Think about the various forums and instant-messaging apps that only convert text into links when it begins with www.

You want your URL to be memorable, and you want Google et al. to credit the same URL for rankings and the like.
Best practice appears to be to accept the www but always HTTP-redirect it to the non-www variant. That way the search engines know to rank links to both variants as the same site.

Whatever you use, stick to one, or else you'll have to manage two sets of cookies, one per hostname, to make your sessions/cookies work properly.

Related

Passing a session variable from without www to with www

The session variable is not passed between the URLs without www and with www.
I have implemented URL rewriting on my website.
Example:
I want to pass Session["hashtable"] from the http://domain.com/product.aspx page to http://www.domain.com/shoppingcart.aspx.
The session is not passed between these URLs. I have tried a lot of settings in web.config, including adding an httpcontext defaultname of .domain.com, but it is still not working.
Any solution for this would be a great help for me.
Thanks in advance,
Kumar
I've never done anything like this myself, but according to this question and answer, How can I share a session across multiple subdomains in ASP.NET?, you should be able to do this by adding this line to your web.config file:
<httpCookies domain=".usa.com"/>
Again, I haven't tried it. But it makes sense, I think.
I'm viewing your domain.com as being a "subdomain" of the same site as www.domain.com. This isn't really true in a technical sense, but in a lot of cases I've seen it work that way anyway, so I'm hoping this will be one of those cases.
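For context, here is roughly where that line sits in web.config; a minimal sketch only, using the asker's domain.com rather than the .usa.com from the linked answer:
<configuration>
  <system.web>
    <!-- scope all cookies, including the session cookie, to the parent domain -->
    <httpCookies domain=".domain.com" />
  </system.web>
</configuration>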
Edit:
I just came across this post, ASP.NET sharing session across multiple wildcard subdomains, which leads me to think this might not actually work, after all. Let me know if it doesn't and I'll delete this answer.
Edit 2:
Okay, I'm searching and finding tons and tons of people with this very problem, and none of them have any answers whatsoever. So it could be that the people who ask this are super unlucky, or that there is some little-known solution, or that it's simply impossible. Thus, I'm adding my comment from above here and calling it part of my "answer."
Just in the interest of good SEO practice for this site, which, from the paths you provided, appears to be retail-related, you should probably just redirect all requests to your www. domain before you even set the session variables (a sketch follows below). If you can't do that, for instance if your www. root is not the same as your bare one (which you should change anyway, for your users' sakes), spin up another subdomain and point all requests to that. Then you'll have two domains at the same level, and the first solution I posted should handle that.
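Since this is an ASP.NET site, that redirect would typically live in web.config via the IIS URL Rewrite module. A minimal sketch, assuming that module is installed and www.domain.com is the canonical host:
<system.webServer>
  <rewrite>
    <rules>
      <rule name="Redirect to www" stopProcessing="true">
        <match url="(.*)" />
        <conditions>
          <!-- only requests that arrive without the www -->
          <add input="{HTTP_HOST}" pattern="^domain\.com$" />
        </conditions>
        <action type="Redirect" url="http://www.domain.com/{R:1}" redirectType="Permanent" />
      </rule>
    </rules>
  </rewrite>
</system.webServer>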

Analytics code generated for one domain and used for another

I had a website hosted on server X (86.115.xx.xxx) with the domain www.AAAAAA.ro. I've made significant changes to the website and also re-branded it. I want to host the new version on the same server, but I want to change its domain to www.BBBBBB.ro. I've deleted the source code of the old website from the server and added the code of the new website. Currently both the www.AAAAAA.ro and www.BBBBBB.ro domains point at server X. The www.AAAAAA.ro domain is permanently redirected (301) to the www.BBBBBB.ro domain. The redirect is made using the following .htaccess statements:
RewriteCond %{HTTP_HOST} !^(www\.)?BBBBBB\.ro$ [NC]
RewriteRule ^(.*)$ http://BBBBBB.ro/$1 [R=301,L]
My question:
If I use the analytics code generated for the www.AAAAAA.ro domain on the new website (the one on www.BBBBBB.ro), will it work? Please keep in mind that the www.AAAAAA.ro domain is permanently redirected (301) to the www.BBBBBB.ro domain.
Thanks,
Mihai Despa
It depends.
If you are explicitly setting the domain in your GA code (_setDomainName), and it's set to your old domain, then the GA code is not going to trigger, because GA will be unable to write its cookie. So you need to make sure this is either not set on your new domain, or else set to the proper value.
All that aside, also note that the code will "work" in that it will continue to track. However, GA uses first party cookies. So that means the cookies for your already established visitors (set on your old domain) will NOT carry over to the new domain. This means any visitors who already came to your site and had cookies will basically start out as a new visitor.
Unfortunately, preserving the current visitors is not an easy thing, unless you are willing to do some client-side redirection for a while, instead of server-side redirection.
Why?
So here's the thing.. you see, GA is pretty devious about ensuring the visitor info isn't tampered with when carrying it over.
Normally the way it is done is that you use _setAllowLinker in your on-page code and then use _link in a given link's onclick. This makes GA append its cookie info to the target URL, to be carried over to the new domain. (Note: if you do want to go for the client-side redirect, you would basically just invoke the _link call on page load instead of onclick. It will handle the redirect.)
So here's where things get devious. In addition to the cookies and values, GA also passes a __utmk parameter, which in a nutshell is a hash generated from an amalgamation of the other parameters and a small piece of the current domain's soul. There is some complicated JS trickery within ga.js to make this happen, and so far I have yet to find a reverse-engineered version of it with server-side code.
Now, assuming you sell your own soul to the devil to learn how to reproduce their magic trick, you are now faced with the next issue: passing that stuff in your redirect.
Grabbing the other cookies and passing them as parameters is "easy enough" within .htaccess, the server config file, or wherever. But there isn't a scripting language available in that environment, so you'd have to use RewriteMap to reference an external script (a sketch follows). That's not too bad, though it does complicate things somewhat. Also note that RewriteMap cannot be declared in .htaccess, so you can't stuff it there; it has to go in the server config file. If you are already doing your rewrite rule in there, easy peasy, but if you're doing your rewrite rule in .htaccess, be aware of this.
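A rough sketch of that server-config wiring, assuming a hypothetical external script utmk-gen.pl that reimplements the hash:
# RewriteMap must live in the server config, not .htaccess
# utmk-gen.pl is hypothetical: it would read the cookie string on stdin and print the hash
RewriteMap utmk prg:/usr/local/bin/utmk-gen.pl
RewriteCond %{HTTP_HOST} ^(www\.)?AAAAAA\.ro$ [NC]
RewriteRule ^(.*)$ http://www.BBBBBB.ro/$1?__utmk=${utmk:%{HTTP_COOKIE}} [R=301,L]
# the __utma/__utmb/__utmz cookie values would need to be appended the same way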
So to sum up:
you have to reverse engineer exactly how __utmk is generated in ga.js, and port that to a server-side script
you have to use RewriteMap to call that script to generate that value
add the __utmX cookies and that __utmk parameter to the redirected URL
also, on your new domain, you need to add _setAllowLinker to your on-page GA code so that it will look for the URL params and use them
Alternative to this mess? As mentioned, if you are willing to use GA's _link call on page load as the redirect method from siteA to siteB for a while, this will let the GA code append the params (including the generated __utmk value) to the target URL and successfully migrate the visitor to the new site.
NOTE: you will also want to append the document.referrer to your target URL in the _link push, and then add some code on your new domain to grab that too and pop _setReferrerOverride with that value, so that GA will record the original referrer instead of the dummy redirect page.
You can even add a RewriteCond to check for the existence of the __utm cookies: if they exist, redirect the visitor to a single page that has the GA redirect on it; if they don't, let the RewriteRule redirect to the new domain as usual. This keeps new visitors from having to jump through the hoop.
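As a sketch, that check might look like this (ga-bridge.html is a hypothetical page that fires the _link redirect):
# returning visitors with GA cookies go to the client-side bridge page
RewriteCond %{HTTP_HOST} ^(www\.)?AAAAAA\.ro$ [NC]
RewriteCond %{HTTP_COOKIE} __utma [NC]
RewriteRule ^ http://www.BBBBBB.ro/ga-bridge.html [R=302,L]
# everyone else gets the normal permanent redirect
RewriteCond %{HTTP_HOST} ^(www\.)?AAAAAA\.ro$ [NC]
RewriteRule ^(.*)$ http://www.BBBBBB.ro/$1 [R=301,L]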
How long should you leave the client-side redirect up? That depends on your longest conversion cycle. What are the conversions/goals for your site? For example, do you have a purchase funnel? How long on average does it take for a visitor to make a purchase? A day? A month? Pick your longest conversion cycle and wait about that long. There may be some edge cases that fall through the cracks, but that should be a lot more acceptable than losing ALL of them. Or leave it up indefinitely if you want.
If you don't want to do even that? Well, pick a good time to deal with the visitor info slate being wiped clean, turn the rewrite rule on, and just deal with it.

Magento - prevent from browsing without rewrite

I have a problem with someone (using many IP addresses) browsing all over my shop using:
example.com/catalog/category/view/id/$i
I have URL rewrite turned on, so the usual human browsing looks "friendly":
example.com/category_name.html
So the question is: how can I prevent the shop from being browsed via the "old" (non-rewritten) URLs, leaving only the "friendly" URLs allowed?
This is pretty important, since the crawler is using hundreds of threads, which is causing the shop to run really slowly.
Since there are many random IP addresses, clearly you can't just block access from a single or small group of addresses. You may need to implement some logging that somehow identifies this crawler uniquely (maybe by browser agent, or possibly with some clever use of the Modernizr javascript library).
Once you've been able to distinguish some unique identifiers of this crawler, you could probably use a rule in .htaccess (if it's a user agent thing) to redirect or otherwise prevent them from consuming your server's oomph.
This SO question provides details on rules for user agents.
Block all bots/crawlers/spiders for a special directory with htaccess
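For instance, if the crawler does announce a distinctive User-Agent (BadBot here is just a placeholder), a rule along these lines would deny it:
# refuse requests whose User-Agent matches the offending crawler
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} BadBot [NC]
RewriteRule ^ - [F]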
If the spider crawls all the URLs of the given pattern:
example.com/catalog/category/view/id/$i
then you can just kill these URLs in .htaccess. The rewrite from category.html to /catalog/category/view/id/$i is made internally, so you would only be blocking the bots.
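A minimal sketch, placed in .htaccess before Magento's standard rewrite to index.php (the friendly .html URLs are resolved inside Magento, so they keep working):
# return 403 Forbidden for the raw, non-rewritten catalog URLs
RewriteRule ^catalog/category/view/ - [F,L]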
Once the rewrites are there... they are there. They are stored in the Mage database for many reasons. One is crawlers like the one crawling your site; another is users who might have the old page bookmarked. There are a number of methods individuals have come up with to go through and clean up the redirects (Google it), but as it stands, once they are in Magento, they are not easily managed through Magento itself.
I might suggest generating a new sitemap and submitting it to the search engine whose crawler is affecting your site. Not only is this crawler crawling tons of pages it doesn't need to, it's also going to see duplicate content (bad ju-ju).

is it possible to run multiple websites from the same URL?

I'm in the process of adding a US site to my current UK site. I'd like to do this as transparently as possible so that we don't lose any traffic from existing links. We're currently running under Magento 1.4.1.1 on a shared hosting setup.
The new website (US) will be essentially the same as the current (UK) site, but with US Dollar pricing instead of Pound Sterling.
We currently have a GeoIP setup whereby visitors are directed to either the UK or the US site while using the same URL. This essentially means that we have switch statements in our index.php to decide which run code to use.
Here's my question:
What's the best way of selecting/overriding the GeoIP selection via the standard store-switcher dropdown? Both websites appear in the dropdown; however, since both use the same URL (www.example.com/boutique), the default one is the only one that ever gets selected.
I've also tried the &_store= as well as the &_website= arguments with no success.
Any ideas? Are URL rewrites in .htaccess the answer? If so, any ideas as to what to use?
P.S. This is the method that's pretty much being followed, though my aim is to let users override their location-specific website (e.g. US) if necessary: http://www.magentocommerce.com/wiki/4_-_themes_and_template_customization/navigation/multiple-website-setup#multiple_website_setup_for_useuuk_storespricing
Have you tried using the getUrl() method to build the store arguments for you? It can help clear up those little misunderstandings; for example, I'm pretty sure the store parameter is supposed to have three underscores, but I cannot really remember, so I use the function instead.
The best way to override is to have a little PHP script, e.g. 'countries.php', that sets a cookie depending on the country code that you choose, or 'auto' to test the regular GeoIP behaviour. Then in your index.php have an "if cookie, then use the cookie's code, else use the GeoIP code". Naturally the cookie can only be set by your test script.
And yes, you only need to set 'website', not 'store'. There is no benefit in your US customers being able to see your UK prices (and vice versa), so don't even bother setting up a frontend drop-down. Or, if you really want, you can let rest-of-the-world customers choose their currency/website and put your own cookie-setting code in the header for them, with a couple of nice flag icons.

How many rewrite rules should I expect to manage?

I'm dealing with a hosting team that is fairly skittish about managing many rewrite rules. What are your experiences with the number of rules your sites currently manage?
I can see dozens (if not more) coming up as the site grows and contracts, and I need to set the expectation that this isn't out of the norm.
Thanks
It is normal to have a lot of rewrite rules on your site. As the site gets bigger, the number of pages you need to rewrite grows, and depending on what the pages do, you could have multiple rewrites per page. This is all based on how secure you are making your site: more security means more precautions.
To quote one description of mod_rewrite:
"The module gives you the ability to transparently redirect one URL to another, without the user's knowledge. This opens up all sorts of possibilities, from simply redirecting old URLs to new addresses, to cleaning up the 'dirty' URLs coming from a poor publishing system, giving you URLs that are friendlier to both readers and search engines."
So pretty much it's at your discretion: how secure do you want it to be?
