I'm trying to block a website using the hosts file, following this tutorial: http://hackspc.com/how-to-block-a-website/
It doesn't work: the website I've blocked (Facebook, in this case) still shows up. Can anyone help me out here?
I could not access the link (it's blocked in my office), but I think this may help you edit your hosts file:
"“WWW” has become the universal standard for the default host. It is just as common to define a site with no host as well. This means that as far as DNS goes www.yahoo.com & yahoo.com are two totally different sites, even though they resolve to the same place. Because of this to effectively block the site, you must also block all hosts. This would usually mean:
127.0.0.1 facebook.com
127.0.0.1 www.facebook.com"
Check the link for more details.
Also, if you are trying to block multiple sites, it's better to use proper software such as a proxy server or a firewall that can block access to particular sites.
I think Kavitesh Singh made the most important point: block the domain both with and without www. That is the most common reason for an entry not working.
Also, not all browsers immediately react to changes in the hosts file. Have you tried restarting your browser and/or system?
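To pull the advice above together, here is a minimal hosts-file sketch (Facebook as in the question; the flush command shown is the Windows one, so adjust for your OS):

# In C:\Windows\System32\drivers\etc\hosts (or /etc/hosts on Linux/macOS),
# block both the bare domain and the www host:
127.0.0.1 facebook.com
127.0.0.1 www.facebook.com

# Then flush the DNS cache (Windows) and restart the browser:
ipconfig /flushdns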
I'm running a website on a college LAN that is accessed by 200+ members, on mobile only.
They access it by the server's IP address, 192.168.0.***. I want to assign a name to this IP address, so I tried modifying the hosts file. It worked on the system itself, but it didn't work on mobile.
I need some suggestions. I'm using XAMPP as the server.
Everything works on localhost; no internet connection is involved anywhere.
My hosts file:
127.0.0.1 yep
192.168.0.110 yep.com
You forgot the dots (string concatenation operators) before and after the variables.
$query="insert into logs(`from`,`to`) VALUES ( '".$caller."', '".$callto."')";
By the way, change your quotes to double quotes, just like in my code.
I would like to automatically test my website from different locations in order to localize the content's presentation. I think I have to write a bash script that accesses the website with wget, using IPs from a list. Is there an established solution to this kind of problem?
There are many solutions. A few that come to mind:
IP spoofing. But it's not easy, in particular if you want to orchestrate these tests in order to automate them.
Another solution is to use a reverse proxy. For example, your application is hosted by Tomcat and you use Apache as a reverse proxy; in that case you can easily configure several endpoints in Apache where you lie about the X-Forwarded-For (XFF) header (see the sketch after this list).
Another solution: rent VMs in the cloud. This is a good approach if you want to perform real performance tests from a remote client, or check the behaviour of Internet caches.
Some companies sell services that check the availability of your web infrastructure from different sites.
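As a rough illustration of the reverse-proxy/XFF idea, here is a minimal bash sketch that replays the same request with different forged X-Forwarded-For values. The URL and the IP list are placeholders, and it only tells you anything if your proxy or application actually trusts that header:

#!/usr/bin/env bash
# Hypothetical sketch: fetch one page while pretending to come from several
# client IPs via the X-Forwarded-For header (placeholder URL and IPs).
URL="http://example.com/"
for ip in 203.0.113.10 198.51.100.20 192.0.2.30; do
    echo "=== Request with X-Forwarded-For: $ip ==="
    wget -qO- --header="X-Forwarded-For: $ip" "$URL" | head -n 20
done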
I'm going up to the mountains with no internet connection to present something. I'd like to be able to use interactive examples since I'll be presenting on a certain website.
So is there a way I can set up a proxy caching server or something to cache every call made in order to have a fully cached website experience with no internet connection?
I've looked at http://squidman.net/ but I'm not sure how it works or how to use it.
You might want to try something like this. It may be more work than the steps below suggest, but it could be a good starting point.
Create a local proxy server along with Memcached or Redis.
Update the browser's proxy settings to use your proxy server's details.
Make the local proxy look up the requested URL in the Redis server.
If it's found, return the data from Redis.
Otherwise, make the web request and store the response in Redis.
You'll have to do this manually for the pages you want while you still have an internet connection. Once you've got all the data you need, you can work without the connection too.
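Since the question mentions Squid, a possibly simpler route than building your own proxy is Squid's offline mode, which serves cached objects without revalidating them. A rough sketch, assuming a typical Linux install (SquidMan on macOS keeps its configuration elsewhere; the paths and sizes here are examples):

# Append the relevant directives to squid.conf:
cat >> /etc/squid/squid.conf <<'EOF'
http_port 3128
cache_dir ufs /var/spool/squid 10000 16 256
maximum_object_size 50 MB
offline_mode on
EOF

# Reload the running Squid, then point the browser's proxy at localhost:3128
# and browse the site once while still online so the cache gets populated.
squid -k reconfigure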
If the pages are essentially static, then you could use something like HTTrack (http://www.httrack.com/) to make an offline copy (see the example below).
If there's anything requiring server-side interaction or dynamic page generation, you're most likely going to need to run your own local instance of the server.
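For the static-copy route mentioned above, an HTTrack invocation might look roughly like this (the URL, output directory, and filter are placeholders):

# Hypothetical example: mirror a site into ./offline-copy for offline browsing.
httrack "http://example.com/" -O ./offline-copy "+*.example.com/*" -v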
Domain names mostly start with the prefix 'www'. Is it used as a standard? If not, why?
www stands for world wide web. Many web addresses begin with www, because of the long-standing practice of naming Internet hosts (servers) according to the services they provide.
WWW prefix
An FQDN starting with "www." is used, by convention, for the machine hosting the primary website for a domain.
These days, when websites are more important than they were in the early days of the Internet, it is also conventional to run a web server for the bare domain (i.e. example.com rather than www.example.com); these are usually the same machine, and one name redirects to the other.
Is there a service that will identify where a site is hosted (presumably by IP)?
Who-hosts is a free online service that can tell you which company hosts the provided URL, and it doesn't require registration.
tracert www.sitename.com
is probably your best bet. The last entry or two should give you your best hint. Otherwise, the whois entry may be a good indicator as well, especially if they are using a hosting provider for DNS.
EDIT: It's traceroute, not tracert, on Linux machines.
Just do a whois search on the IP.
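For instance, a quick shell sketch (the domain is a placeholder; replace it with the site you're checking):

# Resolve the site to an IP, then query the registry record for that IP;
# the owner is usually the hosting company or its data centre.
IP=$(dig +short www.example.com | head -n 1)
whois "$IP"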
http://samspade.org/whois/ is a free utility for telling you who owns an IP address or domain name. If this is a server farm hosting multiple servers, then it will likely be registered to the hosting company.
This isn't exactly what the question asked for, but you might find it useful to know that Netcraft provides some pretty neat information about the uptime, web-server software, and ISP used to host websites as well.
Domaintools can usually give you some pretty good information, under the "Server Data" and using the "Reverse IP" tool (though you have to pay to get full results from that one).
http://whois.domaintools.com/websitename.com
Just put the website name in instead of websitename.com.