I have managed to get my HTTP "proxy" connection working, but on most HTTPS connections I get this error:
Connection to failed.
The system returned: (111) Connection refused
Here is my config file I am currently using in squid:
http_port 3128
refresh_pattern ^ftp: 1440 20% 10080
refresh_pattern ^gopher: 1440 0% 1440
refresh_pattern . 0 20% 4320
acl localnet src 10.0.0.0/8 # RFC 1918 possible internal network
acl localnet src 172.16.0.0/12 # RFC 1918 possible internal network
acl myhost src <myip>
http_access allow myhost
acl SSL_ports port 443
acl Safe_ports port 80 # http
acl Safe_ports port 21 # ftp
acl Safe_ports port 443 # https
acl Safe_ports port 70 # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535 # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
acl CONNECT method CONNECT
via off
forwarded_for off
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
http_access allow all
dns_nameservers 8.8.8.8 8.8.4.4
This is running on an Ubuntu 12.04 VPS; I connect remotely via the browser's proxy settings page...
All I want is to be able to connect to my server and browse HTTP and HTTPS sites. HTTP works with the config above... HTTPS does not...
Remove the following line:
http_access deny CONNECT !SSL_ports
I would recommend keeping
http_access deny CONNECT !SSL_ports
but allowing the ports it lists by adding:
http_access allow SSL_ports
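One point worth keeping in mind with this fix: Squid evaluates http_access lines top-down and the first matching line wins (and with no match, the default is the opposite of the last line), so where the new allow line sits relative to the CONNECT deny matters. A minimal Python sketch of that first-match behavior, using rules shaped like the config above (the request dictionaries are illustrative, not real Squid internals):

```python
# First-match evaluation, the way Squid applies http_access lines top-down.
# Each rule is (verdict, predicate); the first rule whose predicate matches
# the request decides the outcome.

def evaluate(rules, request):
    for verdict, predicate in rules:
        if predicate(request):
            return verdict
    # With no match Squid inverts the last line's verdict; we simplify by
    # denying, which is the recommended explicit default anyway.
    return "deny"

SSL_PORTS = {443}
SAFE_PORTS = {80, 21, 443, 70, 210, 280, 488, 591, 777} | set(range(1025, 65536))

rules = [
    ("deny",  lambda r: r["port"] not in SAFE_PORTS),                          # deny !Safe_ports
    ("allow", lambda r: r["port"] in SSL_PORTS),                               # allow SSL_ports
    ("deny",  lambda r: r["method"] == "CONNECT"
                        and r["port"] not in SSL_PORTS),                       # deny CONNECT !SSL_ports
    ("allow", lambda r: True),                                                 # allow all
]

print(evaluate(rules, {"method": "CONNECT", "port": 443}))   # HTTPS tunnel -> allow
print(evaluate(rules, {"method": "CONNECT", "port": 8080}))  # tunnel to odd port -> deny
print(evaluate(rules, {"method": "GET", "port": 80}))        # plain HTTP -> allow
```

With `allow SSL_ports` placed before `deny CONNECT !SSL_ports`, CONNECT to 443 is admitted while tunnels to other ports are still refused.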
I work on a project where there are multiple Nexus registries behind different proxies:
How can I make sure that Gradle (or any repository-related tool, such as npm, Maven, etc.) can handle three or more different proxies at the same time to reach multiple Nexus instances?
Until now we were using a workaround: one Nexus was accessed through the HTTP proxy and one through the HTTPS proxy. But now we have three proxies to handle!
I think it must be possible to add a machine (a Squid instance?) that would redirect proxy requests to the correct proxy based on the destination domain name:
I'm not familiar with Squid and haven't managed to achieve this yet. Can anyone confirm whether this is possible with Squid? Would anyone suggest another solution?
Just for background: this network setup exists because multiple partner companies are involved in the project, and we reach each company's Nexus through a dedicated VPN and proxy.
OK, so I managed to run Squid in a Docker container with the following config:
acl host src 10.0.0.0/8
acl host src 172.0.0.0/8
http_access allow host
maximum_object_size 256 MB
maximum_object_size_in_memory 256 MB
dns_nameservers 8.8.8.8 8.8.4.4
acl SSL_ports port 443
acl Safe_ports port 80 # http
acl Safe_ports port 21 # ftp
acl Safe_ports port 443 # https
acl Safe_ports port 70 # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535 # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
acl CONNECT method CONNECT
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
http_access allow localhost manager
http_access deny manager
http_access allow localhost
http_access deny all
http_port 3128
cache_peer 10.1.1.1 parent 8080 0 proxy-only no-query name=proxy1
cache_peer 10.2.1.1 parent 8080 0 proxy-only no-query name=proxy2
cache_peer 10.3.1.1 parent 8080 0 proxy-only no-query name=proxy3
acl sites_proxy1 dstdomain .domain1.com
acl sites_proxy2 dstdomain .domain2.com
acl sites_proxy3 dstdomain .domain3.com
cache_peer_access proxy1 deny !sites_proxy1
cache_peer_access proxy2 deny !sites_proxy2
cache_peer_access proxy3 deny !sites_proxy3
coredump_dir /var/spool/squid3
refresh_pattern ^ftp: 1440 20% 10080
refresh_pattern ^gopher: 1440 0% 1440
refresh_pattern -i (/cgi-bin/|\?) 0 0% 0
refresh_pattern (Release|Packages(.gz)*)$ 0 20% 2880
refresh_pattern . 0 20% 4320
never_direct allow all
Then I run the container with this command:
docker run --rm --volume ~/DevTools/squid/squid.conf:/etc/squid/squid.conf -v ~/DevTools/squid/logs:/var/log/squid -p 3128:3128 datadog/squid
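The routing itself is done by the cache_peer_access lines: each parent is denied for every destination except its own dstdomain list, so a given host can only ever be forwarded to the matching peer (and never_direct forces everything through some peer). A rough Python sketch of that dstdomain suffix matching, reusing the peer names and placeholder domains from the config above:

```python
# Sketch of the dstdomain-based peer selection that the
# "cache_peer_access proxyN deny !sites_proxyN" lines express: a peer may
# only serve requests whose destination host falls under its domain suffix.

PEERS = {
    "proxy1": ".domain1.com",
    "proxy2": ".domain2.com",
    "proxy3": ".domain3.com",
}

def matches_dstdomain(host, pattern):
    # A leading dot in Squid's dstdomain matches the domain itself
    # and any subdomain of it.
    domain = pattern.lstrip(".")
    return host == domain or host.endswith("." + domain)

def eligible_peers(host):
    return [name for name, pat in PEERS.items() if matches_dstdomain(host, pat)]

print(eligible_peers("nexus.domain2.com"))  # only proxy2 may forward this
print(eligible_peers("example.org"))        # no peer; with never_direct, this fails
```

Gradle, npm and Maven then only need to be pointed at the single Squid instance, which dispatches to the right upstream proxy per domain.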
I am configuring Squid as a forwarding proxy and intend to allow requests only to example.com. This is the request as made from my browser: http://111.222.333.444/http://example.com, where 111.222.333.444 is the IP of my proxy server, where Squid is installed.
But I am getting an Invalid URL error: the server is trying to fetch /http://example.com (note the leading slash). Here is the access log record:
1505858815.396 0 12.34.56.78 NONE/400 3687 GET /http://example.com - NONE/- text/html
Here is the configuration that I am using:
acl manager proto cache_object
acl localhost src 127.0.0.1/32 ::1
acl to_localhost dst 127.0.0.0/8 0.0.0.0/32 ::1
acl localnet src all # Allow access from everywhere
acl SSL_ports port 443
acl Safe_ports port 80 # http
acl CONNECT method CONNECT
acl safe_url url_regex example\.com.*
http_access allow safe_url
http_access deny all
http_access allow manager localhost
http_access deny manager
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
http_access allow localnet
http_access allow localhost
http_access deny all
http_port 80
coredump_dir /var/spool/squid
refresh_pattern ^ftp: 1440 20% 10080
refresh_pattern ^gopher: 1440 0% 1440
refresh_pattern -i (/cgi-bin/|\?) 0 0% 0
refresh_pattern . 0 20% 4320
What am I doing wrong?
Got it! Squid is a proxy, so the client has to make a proxy-style connection to the server rather than embedding the target URL in the proxy's address. In Firefox this is done by entering the proxy server's IP and port number in the proxy settings. Once that is done, URLs can be used directly in the navigation bar; in this case: http://example.com
More on how to set up a proxy in Firefox and Chrome
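The underlying difference is in the HTTP request line. Pasting the target URL after the proxy's address makes the browser ask the proxy for the literal path /http://example.com, which Squid rejects as an invalid URL; a proxy-configured client instead puts the absolute URI in the request line. A small illustrative sketch (the request lines are simplified, headers omitted):

```python
# Contrast the request line Squid saw (origin-form with the URL pasted
# onto the proxy's path) with the proxy-form request line a browser sends
# once the proxy is configured in its settings.

def request_line(target, proxied):
    if proxied:
        # proxy-form: the absolute URI is the request target
        return f"GET {target} HTTP/1.1"
    # what happened here: the URL became a path on the proxy host
    return f"GET /{target} HTTP/1.1"

print(request_line("http://example.com", proxied=False))  # what Squid saw (invalid)
print(request_line("http://example.com", proxied=True))   # what Squid expects
```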
I've been desperately trying to set up Squid on a virtual machine (hosted by DigitalOcean) for the past day or so. The machine runs Ubuntu; I installed Squid and modified the config to grant access to my home PC (since obviously they aren't on the same LAN). I thought this was done by making the edits shown below:
acl myips src MYPUBLICIP
acl SSL_ports port 443
acl Safe_ports port 80 # http
acl Safe_ports port 21 # ftp
acl Safe_ports port 443 # https
acl Safe_ports port 70 # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535 # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
acl CONNECT method CONNECT
http_access allow myips
http_access allow manager localhost
http_access allow manager myips
http_access allow purge localhost
http_access allow purge myips
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
http_access allow localhost
http_access allow localhost manager
http_access deny manager
From my limited understanding, I've allowed access to the top IP. Should I be using my laptop's IPv4 address here (10.213.111.121)?
If someone could talk me through this I would be SO grateful, as I'm really not getting anywhere...
Thanks!
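One observation on the address question: 10.213.111.121 is an RFC 1918 private address, so it is NATed away before your traffic reaches a public VPS; the proxy will see your home connection's public IP, which is what MYPUBLICIP in the src ACL needs to be. This is easy to check with Python's standard ipaddress module:

```python
import ipaddress

# 10.0.0.0/8 is RFC 1918 private space: usable on the home LAN, but never
# the source address of traffic arriving at a public DigitalOcean VPS.
laptop = ipaddress.ip_address("10.213.111.121")
print(laptop.is_private)  # True: the VPS will never see this as a source IP

# The "acl myips src ..." line should therefore carry the public address
# the VPS actually sees (e.g., hypothetically, whatever "curl ifconfig.me"
# reports from the home machine).
```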
I have a product from Symantec, and their help is... less than helpful, including a nice message that says "Contact your reseller" in the "Contact Us" link. My reseller says to contact them. How? Anyway, it's a repackaged version of Squid for Windows. When I point IE at the proxy running locally I get "Access Denied. Access control configuration prevents your request from being allowed at this time. Please contact your service provider if you feel this is incorrect." However, when I point IE on another machine at the server running Squid, everything works fine.
I have zero experience with Squid or proxies. I tried some different configs based on searches here, but nothing worked. I'm sure it's something simple. Here is the config:
digest_generation off
hierarchy_stoplist cgi-bin ?
acl all src 0.0.0.0/0.0.0.0
cache deny all
maximum_object_size 0 KB
emulate_httpd_log on
debug_options ALL,1
cache_store_log none
access_log none
useragent_log none
auth_param ntlm program c:/clientsiteproxy/libexec/mswin_ntlm_auth.exe
auth_param ntlm children 80
auth_param ntlm keep_alive on
auth_param negotiate children 80
auth_param basic children 5
auth_param basic realm Squid proxy-caching web server
auth_param basic credentialsttl 2 hours
auth_param basic casesensitive off
authenticate_ip_shortcircuit_ttl 30 seconds
refresh_pattern ^ftp: 1440 20% 10080
refresh_pattern ^gopher: 1440 0% 1440
refresh_pattern . 0 20% 4320
read_timeout 15 minutes
acl manager proto cache_object
acl localhost src 127.0.0.1/255.255.255.255
acl to_localhost dst 127.0.0.0/8
acl SSL_ports port 443 563
acl Safe_ports port 80 # http
acl Safe_ports port 21 # ftp
acl Safe_ports port 443 563 # https, snews
acl Safe_ports port 70 # gopher
acl Safe_ports port 210 # wais
acl Safe_ports port 1025-65535 # unregistered ports
acl Safe_ports port 280 # http-mgmt
acl Safe_ports port 488 # gss-http
acl Safe_ports port 591 # filemaker
acl Safe_ports port 777 # multiling http
acl Smartconnect dstdomain ned.webscanningservice.com
acl CONNECT method CONNECT
acl authproxy proxy_auth REQUIRED
acl our_networks src 192.168.0.0/16 172.16.0.0/12 10.0.0.0/8 169.254.0.0/16
acl HEAD method HEAD
http_access allow manager localhost
http_access deny manager
http_access deny !Safe_ports
http_access deny CONNECT !SSL_ports
http_access allow HEAD
http_access deny !our_networks
http_access allow Smartconnect
http_access allow authproxy
http_access deny all
icp_access allow all
httpd_suppress_version_string on
visible_hostname ClientSiteProxy
forwarded_for off
header_access Via deny all
never_direct allow all
cache_dir null c:/ClientSiteProxy
coredump_dir c:/clientsiteproxy/var/cache
http_port 3128
This is most likely the culprit: http_access deny !our_networks. This line denies access for all source IPs apart from 192.168.0.0/16, 172.16.0.0/12, 10.0.0.0/8 and 169.254.0.0/16.
When you browse from the same machine, the browser connects from localhost (127.0.0.1), which is in none of those ranges, so try expanding the our_networks definition to include 127.0.0.1.
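The claim is easy to verify: 127.0.0.1 falls inside none of the four networks listed in our_networks, so the local browser is caught by `http_access deny !our_networks`. A quick check with Python's ipaddress module, with the network list copied from the config:

```python
import ipaddress

# The four ranges from "acl our_networks src ..." in the config above.
our_networks = [
    ipaddress.ip_network(n)
    for n in ("192.168.0.0/16", "172.16.0.0/12", "10.0.0.0/8", "169.254.0.0/16")
]

source = ipaddress.ip_address("127.0.0.1")  # the local browser's source address
print(any(source in net for net in our_networks))  # False -> denied by !our_networks
```

Adding 127.0.0.1 (or 127.0.0.0/8) to the ACL makes the membership test pass and lets the local browser through.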
I've been trying to configure Squid proxy in my local network.
Here is a snippet of my squid.conf file:
acl localnet src 10.0.0.0/8 # RFC1918 possible internal network
acl localnet src 172.16.0.0/12 # RFC1918 possible internal network
acl localnet src 192.168.0.0/16 # RFC1918 possible internal network
## Custom rules for allowing just the websites
acl AllowedSites dstdomain "c:/squid/etc/allowed.site"
#
acl SSL_ports port 443
acl Safe_ports port 80 # http
acl Safe_ports port 2367 # Skype
#acl Safe_ports port 21 # ftp
acl Safe_ports port 443 # https
#acl Safe_ports port 70 # gopher
#acl Safe_ports port 210 # wais
#acl Safe_ports port 1025-65535 # unregistered ports
#acl Safe_ports port 280 # http-mgmt
#acl Safe_ports port 488 # gss-http
#acl Safe_ports port 591 # filemaker
#acl Safe_ports port 777 # multiling http
acl CONNECT method CONNECT
# TAG: http_access
# Allowing or Denying access based on defined access lists
#
# Access to the HTTP port:
# http_access allow|deny [!]aclname ...
#
# NOTE on default values:
#
# If there are no "access" lines present, the default is to deny
# the request.
#
# If none of the "access" lines cause a match, the default is the
# opposite of the last line in the list. If the last line was
# deny, the default is allow. Conversely, if the last line
# is allow, the default will be deny. For these reasons, it is a
# good idea to have an "deny all" or "allow all" entry at the end
# of your access lists to avoid potential confusion.
#
#Default:
# http_access deny all
#
#Recommended minimum configuration:
#
# Only allow cachemgr access from localhost
acl numeric_IPs dstdom_regex ^(([0-9]+\.[0-9]+\.[0-9]+\.[0-9]+)|(\[([0-9af]+)?:([0-9af:]+)?:([0-9af]+)?\])):443
acl Skype_UA browser ^skype^
http_access allow manager localhost
http_access deny manager
# Deny requests to unknown ports
http_access deny !Safe_ports
# Deny CONNECT to other than SSL ports
http_access deny CONNECT !SSL_ports
#
# We strongly recommend the following be uncommented to protect innocent
# web applications running on the proxy server who think the only
# one who can access services on "localhost" is a local user
#http_access deny to_localhost
#
# INSERT YOUR OWN RULE(S) HERE TO ALLOW ACCESS FROM YOUR CLIENTS
# Example rule allowing access from your local networks.
# Adapt localnet in the ACL section to list your (internal) IP networks
# from where browsing should be allowed
http_access allow localnet
http_access allow AllowedSites
http_access allow CONNECT localnet numeric_IPs Skype_UA
http_access deny !AllowedSites
# And finally deny all other access to this proxy
http_access deny all
Now, the problem is that when I allow Skype, the proxy starts allowing ALL websites.
I need a way to restrict browsing to just the domains in the allowed.site file, which contains the list of allowed sites.
Also, I need to block port 443 in general but still allow it for Skype.
Please guide me on how this can be done.
Thanks,
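For reference on the Skype-specific pieces above: the numeric_IPs dstdom_regex is meant to match CONNECT destinations that are a literal IP with port 443 (how Skype connects), while ordinary browsers use hostnames, which the regex should not match. That regex can be sanity-checked in Python; the sample destinations below are made up:

```python
import re

# The dstdom_regex from the config: a dotted-quad IPv4 (or a bracketed
# IPv6 literal) followed by :443, as produced by Skype's CONNECT requests.
NUMERIC_IPS = re.compile(
    r"(([0-9]+\.[0-9]+\.[0-9]+\.[0-9]+)|(\[([0-9af]+)?:([0-9af:]+)?:([0-9af]+)?\])):443"
)

print(bool(NUMERIC_IPS.search("110.52.27.108:443")))    # literal IP: matches
print(bool(NUMERIC_IPS.search("www.example.com:443")))  # hostname: no match
```

So CONNECT-by-IP traffic can be distinguished from hostname HTTPS; the broader "all websites allowed" symptom comes from the http_access rule order, since `http_access allow localnet` matches every LAN request before the AllowedSites rules are consulted.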