Hosts file: how do I allow only one domain name? - macOS

I'm setting up an iMac in a store. They want to have the browser open to their website and to restrict access so people can't use the computer to browse any other site. I see many discussions online but no actual code sample. Can you please write out the exact lines I need to add to the hosts file, and its location, on a new iMac with OS X?

You can achieve this by catching all web traffic and routing it to the IP address of the site you want to limit access to.
For example, if the website's IP was 10.0.0.1:
10.0.0.1 .com
10.0.0.1 .info
10.0.0.1 .org
You can keep adding rows for each TLD (.com, .net, etc.) that you wish to block.
Any HTTP requests sent from the machine would then try to resolve on that host.
This should prevent every website except the one you require from resolving.
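On OS X the hosts file lives at /etc/hosts. A minimal sketch of applying the lines above (10.0.0.1 is still a placeholder for the store site's real IP, and the exact cache-flush commands vary slightly between OS X versions):
sudo nano /etc/hosts
# append the redirect lines from above, e.g.
#   10.0.0.1 .com
#   10.0.0.1 .net
# save, then flush the DNS cache so the changes take effect
sudo dscacheutil -flushcache
sudo killall -HUP mDNSResponder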

Related

Blocking websites at the DNS level

Is it possible to block a website at the DNS level? Is it possible to create a DNS server in Windows Server 2012?
I know you can do it at the pre-DNS level. Prior to the DNS lookup, Windows will check the hosts file for an IP-to-domain mapping. You can set facebook.com to 192.168.1.1 and it will use that IP for Facebook rather than looking it up in DNS, thus blocking it. This would require modifying the hosts file on every machine you want to censor, though. It's a file under the System32 folder (System32\drivers\etc).
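As a sketch of that approach on a single Windows machine (facebook.com is just the example carried over from above), open C:\Windows\System32\drivers\etc\hosts as Administrator and add:
192.168.1.1 facebook.com
192.168.1.1 www.facebook.com
Then flush the local DNS cache from a command prompt so the mapping is picked up:
ipconfig /flushdns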

Why is WAMP Apache not allowing APIs to access my www folder?

I have a piece of code where an external API needs to access my "www" folder for images. When I load the URL "http://localhost:8001/images/1.jpg" from the browser, it does show the image. But when I access it through the code it says "connection refused". I have turned off the firewall as well. I also tried using the IP address instead of "localhost". That doesn't work either. Please help.
Remember the domain name localhost has a special meaning. It always means this PC, or more accurately this network card's loopback address.
I cannot access your PC from here using the domain name localhost, as it will always be looped back to my PC.
If you want an external site to make a call to your PC then there are a number of things you will have to do.
Get yourself a domain name: either buy a real one or use a Dynamic DNS service like dyndns.com or noip.com.
Or use your router's WAN IP address.
Then you must amend the httpd.conf file so that Apache allows access
from all IP addresses (see the sketch after this list).
Then you must port forward your router so that the NAT firewall allows
external access on port 80 to be forwarded to the internal PC
running Apache, and only that PC.
And possibly amend the software firewall on the Apache PC to allow access from external sources on port 80.
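For the httpd.conf change, this is a sketch of the kind of edit involved, assuming a recent WAMP with Apache 2.4 and the default c:/wamp/www document root (adjust paths to your install):
<Directory "c:/wamp/www/">
    Options Indexes FollowSymLinks
    AllowOverride All
    # the WAMP default is usually "Require local"; this opens access to all addresses
    Require all granted
</Directory>
Then restart Apache from the WAMP tray menu so the change takes effect.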

I changed the hosts file but can't see any effect

I have to reach the WordPress platform installed by a certain hosting service, so I can build a new website (that will replace the old one) on a certain IP address. I changed my hosts file following their instructions. I put in it the IP address they had given me and the website domain (separated by a single space). The hosts file has no extension and it's in the right location (System32/drivers/etc - I'm on Windows 8). I cleaned the browser and local DNS cache but nothing changed: if I put the URL they had given me (www.domain.com/?hostingname) in the browser, I see the old website, not the WordPress platform. I tried to ping the domain and it returns a different IP address. What can I do? Thanks everyone in advance.
OK, I solved it. It's important to edit the hosts file with Windows Notepad and not with Notepad++.
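For anyone hitting the same thing, a quick way to check whether the hosts entry is actually being honoured (203.0.113.10 is a placeholder; www.domain.com is the domain from the question). Add to C:\Windows\System32\drivers\etc\hosts, saved with no extension and as Administrator:
203.0.113.10 www.domain.com
Then, from a command prompt:
ipconfig /flushdns
ping www.domain.com
If ping reports anything other than 203.0.113.10, the file is not being read (wrong name, wrong encoding, or saved without admin rights).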

Recaptcha IP addresses

Okay, so we implement Recaptcha in production. We get errors because it can't reach the IP address it needs to use the service. We open a port for the IP address to reach Google. No problem. We do that and configure that IP address explicitly to work. It works great. Then, the next day, we start getting errors again because Recaptcha is using a different IP address. I can allow requests from that IP address, too, but now I'm unsettled. Where are these addresses coming from? How do I configure this to work reliably?
Recaptcha from Google can use any Google IP address, and there are lots of them.
Ran this from Windows:
nslookup -type=TXT _netblocks.google.com
_netblocks.google.com   text =
"v=spf1 ip4:216.239.32.0/19 ip4:64.233.160.0/19 ip4:66.249.80.0/20 ip4:72.14.192.0/18 ip4:209.85.128.0/17 ip4:66.102.0.0/20 ip4:74.125.0.0/16 ip4:64.18.0.0/20 ip4:207.126.144.0/20 ip4:173.194.0.0/16 ?all"
Those are all the networks Google uses currently. These can change, so check them often.
Google suggests allowing port 80 outbound to all IPs, but this is highly insecure. They recommend going through a proxy server, but again that is highly insecure if your web server is in a DMZ. Proxy-aware trojans do exist. All that needs to be done is exploit a vulnerability to execute arbitrary code, and you can create a reverse connection on port 80 through a proxy server to download the payload. Then it is trivial to escalate privileges and own the box. I don't mean just Windows servers but Linux as well. I've done it in a lab environment with security turned on. It's really easy to do.
This is the Google website I got this from:
http://code.google.com/p/recaptcha/wiki/FirewallsAndRecaptcha
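To give a feel for what a rule per netblock looks like, here is a rough sketch assuming a Linux web server with iptables and a default-deny OUTPUT chain (the subnet is the first range from the TXT record above; repeat for each ip4: range, and add port 443 if you use HTTPS):
# allow outbound traffic on port 80 to one of Google's published netblocks
iptables -A OUTPUT -p tcp -d 216.239.32.0/19 --dport 80 -j ACCEPT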
I wanted to append some more recent information to this answer. The documentation that Chris is pointing to does not include all of the TXT records you need to dig (thanks, Google):
_netblocks2.google.com (IPv6 subnets)
_netblocks3.google.com (Additional IPv4 subnets)
In my particular case, the _netblocks3 entry contained two large /19s that made my initial rule ineffective.
(I found additional references here: https://support.google.com/a/answer/60764?hl=en)
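So in practice you need to pull all three records; for example, from a Windows command prompt:
nslookup -type=TXT _netblocks.google.com
nslookup -type=TXT _netblocks2.google.com
nslookup -type=TXT _netblocks3.google.com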
Perhaps you should be using a hostname rather than an IP address.

hosts file in Windows and my developer

One of my sites - mediadeals.co.uk - is showing a blank page.
So I went back to my developer. He asked me to add this to my hosts file
in Windows\System32\drivers\etc\hosts:
74.86.205.232 mediadeals.co.uk
After doing this the site started working. What does this mean?
That's crazy. All he did was make it work on YOUR machine. The hosts file simply maps names to IP addresses. It's like a local DNS. What needs to happen for the outside world to see this is that the DNS servers which are authoritative for mediadeals.co.uk need to have an A record pointing to 74.86.205.232.
How long ago did you register that site name? Don't forget that DNS entries may take a while to propagate across the web. 24 hrs+ sometimes.
And btw, that "fix" will ONLY work on your machine. It maps the friendly URL to an IP address for you, not for the world.
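You can check what the rest of the world sees by querying a public resolver directly; nslookup ignores the hosts file, so this shows the real DNS answer (8.8.8.8 is Google's public DNS):
nslookup mediadeals.co.uk 8.8.8.8
If that returns no address (or a different one), the authoritative A record is missing or hasn't propagated yet.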
The reason it's not working is that there is no DNS record for it.
The hosts file is letting you reach it via a local DNS replacement.
All you need is to get the site hosted somewhere and a DNS entry setup.
If you like the site and he is willing to host it for $150 then go for it; depending on your contract, if he should have done this within the initial budget then you should question it.