Running Squid on localhost - proxy

I have a product from Symantec and their help is... less than helpful, including a nice message that says "Contact your reseller" in the "Contact Us" link. My reseller says to contact them. How? Anyway, it's a repackaged version of Squid for Windows. When I point IE to the proxy running locally I get "Access Denied. Access control configuration prevents your request from being allowed at this time. Please contact your service provider if you feel this is incorrect." However, when I point IE on another machine to the server running Squid, everything works fine.
I have zero experience with Squid or proxies. I tried some different configs based on searches here but nothing worked. I'm sure it's something simple. Here is the config:
digest_generation off
hierarchy_stoplist cgi-bin ?
acl all src 0.0.0.0/0.0.0.0
cache deny all
maximum_object_size 0 KB
emulate_httpd_log on
debug_options ALL,1
cache_store_log none
access_log none
useragent_log none
auth_param ntlm program c:/clientsiteproxy/libexec/mswin_ntlm_auth.exe
auth_param ntlm children 80
auth_param ntlm keep_alive on
auth_param negotiate children 80
auth_param basic children 5
auth_param basic realm Squid proxy-caching web server
auth_param basic credentialsttl 2 hours
auth_param basic casesensitive off
authenticate_ip_shortcircuit_ttl 30 seconds
refresh_pattern ^ftp: 1440 20% 10080
refresh_pattern ^gopher: 1440 0% 1440
refresh_pattern . 0 20% 4320
read_timeout 15 minutes
acl manager proto cache_object
acl localhost src 127.0.0.1/255.255.255.255
acl to_localhost dst 127.0.0.0/8
acl SSL_ports port 443 563
acl Safe_ports port 80 # http
acl Safe_ports port 21 # ftp
acl Safe_ports port 443 563 # https, snews
acl Safe_ports port 70 # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535 # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
acl Smartconnect dstdomain ned.webscanningservice.com
acl CONNECT method CONNECT
acl authproxy proxy_auth REQUIRED
acl our_networks src 192.168.0.0/16 172.16.0.0/12 10.0.0.0/8 169.254.0.0/16
acl HEAD method HEAD
http_access allow manager localhost
http_access deny manager
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
http_access allow HEAD
http_access deny !our_networks
http_access allow Smartconnect
http_access allow authproxy
http_access deny all
icp_access allow all
httpd_suppress_version_string on
visible_hostname ClientSiteProxy
forwarded_for off
header_access Via deny all
never_direct allow all
cache_dir null c:/ClientSiteProxy
coredump_dir c:/clientsiteproxy/var/cache
http_port 3128

This is most likely the culprit: http_access deny !our_networks. This statement denies proxy access to all source IPs outside 192.168.0.0/16, 172.16.0.0/12, 10.0.0.0/8, and 169.254.0.0/16.
When you browse from the same machine, the browser connects from localhost (127.0.0.1), which is not in any of those ranges, so you can try expanding the our_networks definition to include 127.0.0.1.
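A minimal sketch of that change, reusing what is already in the config (pick one approach):
# option 1: add loopback to the allowed source networks
acl our_networks src 192.168.0.0/16 172.16.0.0/12 10.0.0.0/8 169.254.0.0/16 127.0.0.1/32
# option 2: allow the existing localhost ACL, placed above the "http_access deny !our_networks" line
# http_access allow localhost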

Related

How can I make gradle deal with multiple proxies?

I work on a project where there are multiple Nexus registries behind different proxies:
How can I make sure that Gradle (or any repository-related tool, such as npm, Maven, etc.) can handle 3+ different proxies at the same time to reach multiple Nexus instances?
Until now we were using a workaround: one Nexus was accessed through the HTTP proxy and one through the HTTPS proxy. But now we have 3 proxies to handle!
I think it must be possible to add a machine (a Squid instance?) which would redirect proxy requests to the correct proxy, based on the domain name.
I'm not familiar with Squid and I still haven't managed to achieve this. Can anyone confirm whether this is possible (or not) using Squid? Would anyone have another solution to suggest?
Just for the background story, this network setup is due to multiple partner companies being involved in the project. We have access to each company's Nexus through dedicated VPNs and proxies.
OK, so I managed to run Squid in Docker with the following config:
acl host src 10.0.0.0/8
acl host src 172.0.0.0/8
http_access allow host
maximum_object_size 256 MB
maximum_object_size_in_memory 256 MB
dns_nameservers 8.8.8.8 8.8.4.4
acl SSL_ports port 443
acl Safe_ports port 80 # http
acl Safe_ports port 21 # ftp
acl Safe_ports port 443 # https
acl Safe_ports port 70 # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535 # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
acl CONNECT method CONNECT
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
http_access allow localhost manager
http_access deny manager
http_access allow localhost
http_access deny all
http_port 3128
cache_peer 10.1.1.1 parent 8080 0 proxy-only no-query name=proxy1
cache_peer 10.2.1.1 parent 8080 0 proxy-only no-query name=proxy2
cache_peer 10.3.1.1 parent 8080 0 proxy-only no-query name=proxy3
acl sites_proxy1 dstdomain .domain1.com
acl sites_proxy2 dstdomain .domain2.com
acl sites_proxy3 dstdomain .domain3.com
cache_peer_access proxy1 deny !sites_proxy1
cache_peer_access proxy2 deny !sites_proxy2
cache_peer_access proxy3 deny !sites_proxy3
coredump_dir /var/spool/squid3
refresh_pattern ^ftp: 1440 20% 10080
refresh_pattern ^gopher: 1440 0% 1440
refresh_pattern -i (/cgi-bin/|\?) 0 0% 0
refresh_pattern (Release|Packages(.gz)*)$ 0 20% 2880
refresh_pattern . 0 20% 4320
never_direct allow all
Then I run the container using this command:
docker run --rm --volume ~/DevTools/squid/squid.conf:/etc/squid/squid.conf -v ~/DevTools/squid/logs:/var/log/squid -p 3128:3128 datadog/squid
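With that in place, Gradle only needs to point at the single Squid instance. Assuming Squid is reachable on localhost:3128 as mapped above, something like this in gradle.properties should be enough (these are the standard JVM proxy system properties, nothing Squid-specific):
# gradle.properties
systemProp.http.proxyHost=localhost
systemProp.http.proxyPort=3128
systemProp.https.proxyHost=localhost
systemProp.https.proxyPort=3128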

squid remove leading slash from target url

I am configuring Squid as a proxy to forward requests and intend to allow requests only to example.com. This is the request as made from my browser: http://111.222.333.444/http://example.com, where 111.222.333.444 is the IP of my proxy server, on which Squid is installed.
But I am getting an Invalid URL error, because the server is trying to fetch /http://example.com (note the leading slash). Here is the access log record:
1505858815.396 0 12.34.56.78 NONE/400 3687 GET /http://example.com - NONE/- text/html
Here is the configuration that I am using:
acl manager proto cache_object
acl localhost src 127.0.0.1/32 ::1
acl to_localhost dst 127.0.0.0/8 0.0.0.0/32 ::1
acl localnet src all # Allow access from everywhere
acl SSL_ports port 443
acl Safe_ports port 80 # http
acl CONNECT method CONNECT
acl safe_url url_regex example\.com.*
http_access allow safe_url
http_access deny all
http_access allow manager localhost
http_access deny manager
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
http_access allow localnet
http_access allow localhost
http_access deny all
http_port 80
coredump_dir /var/spool/squid
refresh_pattern ^ftp: 1440 20% 10080
refresh_pattern ^gopher: 1440 0% 1440
refresh_pattern -i (/cgi-bin/|\?) 0 0% 0
refresh_pattern . 0 20% 4320
What am I doing wrong?
Got it! Squid is a proxy, so the browser needs to make a proxy-style connection to the server rather than embedding the target URL in the path. In Firefox this can be done by configuring the proxy settings with the proxy server's IP and port number. Once that is done, URLs can be used as normal; in this case, enter http://example.com directly in the address bar.
More on how to set up a proxy in Firefox and Chrome.
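For a quick command-line check that the request goes through the proxy rather than embedding the target in the path, curl's -x option works too (111.222.333.444 is the placeholder proxy IP from the question, listening on port 80 as in the config above):
curl -x http://111.222.333.444:80 http://example.com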

Struggling to grant remote HTTP access on a proxy server

I've been desperately trying to set up Squid on a virtual machine (hosted by DigitalOcean) for the past day or so. I've set up the machine, which runs Ubuntu, installed Squid, and modified the config, since I'm trying to grant access to my home PC (as obviously they aren't on the same LAN network). I thought this was done by making the edits I've shown below:
acl myips src MYPUBLICIP
acl SSL_ports port 443
acl Safe_ports port 80 # http
acl Safe_ports port 21 # ftp
acl Safe_ports port 443 # https
acl Safe_ports port 70 # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535 # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
acl CONNECT method CONNECT
http_access allow myips
http_access allow manager localhost
http_access allow manager myips
http_access allow purge localhost
http_access allow purge myips
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
http_access allow localhost
http_access allow localhost manager
http_access deny manager
From my limited understanding, I've allowed access to the top IP. Should I be using my laptop's IPv4 address here (10.213.111.121)?
If someone could talk me through this I would be SO grateful, as I'm really not getting anywhere...
Thanks!
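A hedged sketch of what the relevant lines usually look like for this scenario: because the home PC sits behind NAT, the droplet sees the home connection's public IP rather than the private 10.213.111.121 address, so the public IP is what belongs in the src ACL (check it from the laptop with something like curl ifconfig.me):
# 203.0.113.7 is a documentation placeholder; substitute your home connection's real public IP
acl myips src 203.0.113.7/32
http_access allow myips
# and keep a final catch-all deny after all the allow rules
http_access deny all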

Squid Connection Refused HTTPS

I have managed to get my HTTP "proxy" connection to work, but on most HTTPS connections I get this error:
Connection to failed.
The system returned: (111) Connection refused
Here is the config file I am currently using in Squid:
http_port 3128
refresh_pattern ^ftp: 1440 20% 10080
refresh_pattern ^ftp: 1440 20% 10080
refresh_pattern ^gopher: 1440 0% 1440
refresh_pattern . 0 20% 4320
acl localnet src 10.0.0.0/8 # RFC 1918 possible internal network
acl localnet src 172.16.0.0/12 # RFC 1918 possible internal network
acl myhost src <myip>
http_access allow myhost
acl SSL_ports port 443
acl Safe_ports port 80 # http
acl Safe_ports port 21 # ftp
acl Safe_ports port 443 # https
acl Safe_ports port 70 # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535 # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
acl CONNECT method CONNECT
via off
forwarded_for off
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
http_access allow all
dns_nameservers 8.8.8.8 8.8.4.4
Running on an Ubuntu 12.04 VPS, connecting remotely via the browser's proxy settings...
All I want is to be able to connect to my server and browse HTTP and HTTPS sites. HTTP works with the config above... HTTPS does not...
Remove the following line:
http_access deny CONNECT !SSL_ports
I would recommend keeping
http_access deny CONNECT !SSL_ports
but allowing the ports listed in SSL_ports by adding:
http_access allow SSL_ports
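Either way, a quick test from another machine helps tell a Squid access denial (an HTTP 403 returned by the proxy on CONNECT) apart from an upstream connection failure; this assumes the proxy listens on 3128 as configured above:
curl -v -x http://<your-server-ip>:3128 https://example.com/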

False configured mod_rewrite/mod_proxy with squid proxy

We are using Squid for one of our applications, and we received an abuse message from a website saying "He used XRumer or other tools, or had a misconfigured mod_rewrite/mod_proxy which is being abused."
We suspect that either someone else is using these Squid proxies in our name or that the configuration is not set up properly. I don't have much experience with Squid or Apache mod rules.
What might be the issue?
squid.conf is:
#
# Recommended minimum Access Permission configuration:
#
# Only allow cachemgr access from localhost
http_access allow manager localhost
http_access deny manager
# Deny requests to certain unsafe ports
http_access deny !Safe_ports
# Deny CONNECT to other than secure SSL ports
http_access deny CONNECT !SSL_ports
# We strongly recommend the following be uncommented to protect innocent
# web applications running on the proxy server who think the only
# one who can access services on "localhost" is a local user
#http_access deny to_localhost
#
# Recommended minimum configuration:
#
acl all src all
acl manager proto cache_object
acl localhost src 127.0.0.1/32 ::1
acl to_localhost dst 127.0.0.0/8 0.0.0.0/32 ::1
# Example rule allowing access from your local networks.
# Adapt to list your (internal) IP networks from where browsing
# should be allowed
acl localnet src 10.0.0.0/8 # RFC1918 possible internal network
acl localnet src 172.16.0.0/12 # RFC1918 possible internal network
acl localnet src 192.168.0.0/16 # RFC1918 possible internal network
acl localnet src fc00::/7 # RFC 4193 local private network range
acl localnet src fe80::/10 # RFC 4291 link-local (directly plugged) machines
acl SSL_ports port 443
acl Safe_ports port 80 # http
acl Safe_ports port 21 # ftp
acl Safe_ports port 443 # https
acl Safe_ports port 70 # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535 # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
acl CONNECT method CONNECT
#
# INSERT YOUR OWN RULE(S) HERE TO ALLOW ACCESS FROM YOUR CLIENTS
#
# Example rule allowing access from your local networks.
# Adapt localnet in the ACL section to list your (internal) IP networks
# from where browsing should be allowed
http_access allow localnet
http_access allow localhost
# Changed by XYZ (XYZ#ABC) to allow http_access from ALL
http_access allow all
# And finally deny all other access to this proxy. Changed by Ankit Narang (ankitn#amazon.com).
#http_access deny all
# Squid normally listens to port 3128
http_port 80
# We recommend you to use at least the following line.
hierarchy_stoplist cgi-bin ?
# Uncomment and adjust the following to add a disk cache directory.
#cache_dir ufs /var/spool/squid 100 16 256
# Leave coredumps in the first cache dir
coredump_dir /var/spool/squid
# Add any of your own refresh_pattern entries above these.
refresh_pattern ^ftp: 1440 20% 10080
refresh_pattern ^gopher: 1440 0% 1440
refresh_pattern -i (/cgi-bin/|\?) 0 0% 0
refresh_pattern . 0 20% 4320
# Changed by Ankit Narang (ankitn#amazon.com)
forwarded_for off
access_log none
# Making Squid an anonymous proxy
request_header_access Allow allow all
request_header_access Authorization allow all
request_header_access WWW-Authenticate allow all
request_header_access Proxy-Authorization allow all
request_header_access Proxy-Authenticate allow all
request_header_access Cache-Control allow all
request_header_access Content-Encoding allow all
request_header_access Content-Length allow all
request_header_access Content-Type allow all
request_header_access Date allow all
request_header_access Expires allow all
request_header_access Host allow all
request_header_access If-Modified-Since allow all
request_header_access Last-Modified allow all
request_header_access Location allow all
request_header_access Pragma allow all
request_header_access Accept allow all
request_header_access Accept-Charset allow all
request_header_access Accept-Encoding allow all
request_header_access Accept-Language allow all
request_header_access Content-Language allow all
request_header_access Mime-Version allow all
request_header_access Retry-After allow all
request_header_access Title allow all
request_header_access Connection allow all
request_header_access Proxy-Connection allow all
request_header_access User-Agent allow all
request_header_access Cookie allow all
request_header_access All deny all
Thanks in advance.
http_access allow all
That's definitely wrong. You have configured an open proxy, so anybody in the world can use your proxy to surf the web. I don't know your exact scenario, but you MUST restrict access in some way.
For example, if you are using Squid as a reverse proxy, you should allow access only to your backend web server:
acl webserver dst x.x.x.x
http_access allow webserver
http_access deny all
If you are using your proxy as a forward proxy to browse the web:
acl mynetwork src x.x.x.x/y
http_access allow mynetwork
http_access deny all
You can also use authentication.
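A minimal sketch of basic authentication, assuming the common basic_ncsa_auth helper and an htpasswd-style file at /etc/squid/passwords (the helper path varies by distribution and Squid version):
auth_param basic program /usr/lib/squid/basic_ncsa_auth /etc/squid/passwords
auth_param basic realm proxy
acl authenticated proxy_auth REQUIRED
http_access allow authenticated
http_access deny all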
