I need to know when a GET request to a specific page is sent through my Squid server. I've set up Squid with SSL Bump for this and it works. Squid itself is reached over plain HTTP, and I can decode HTTPS requests anyway.
The browser is configured with the Squid proxy's IP.
HTTPS request as printed by Squid:
1653523742.808 595 181.192.60.243 TCP_MISS/200 10264 GET https://example.com/page.aspx? - HIER_DIRECT/63.177.189.181 text/html
This works fine, and I trigger an event when I read this entry.
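The trigger itself can be sketched as a small log watcher; this is a minimal sketch assuming the native Squid access-log format shown above (the log path, target URL, and callback are placeholders, not part of my actual setup):

```python
import re

# Native Squid access-log fields: timestamp elapsed client action/code size
# method URL user hierarchy/peer content-type
LOG_RE = re.compile(
    r'^(?P<ts>\d+\.\d+)\s+\d+\s+(?P<client>\S+)\s+(?P<status>\S+)\s+\d+\s+'
    r'(?P<method>\S+)\s+(?P<url>\S+)'
)

def parse_access_line(line):
    """Extract (method, URL) from a native-format access.log line, or None."""
    m = LOG_RE.match(line)
    return (m.group('method'), m.group('url')) if m else None

def watch(log_path, target_url, on_hit):
    """Tail access.log and fire on_hit() for each GET to the target page."""
    with open(log_path) as f:
        f.seek(0, 2)                      # start tailing at end of file
        while True:
            line = f.readline()
            if not line:
                continue                  # in practice, sleep briefly here
            hit = parse_access_line(line)
            if hit and hit[0] == 'GET' and hit[1].startswith(target_url):
                on_hit(line)
```

`watch("/var/log/squid/access.log", "https://example.com/page.aspx", handler)` would then invoke the handler on each matching entry.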
The problem is traffic arriving from an AWS Application Load Balancer.
There the request shows up still encoded, like this:
1653523437.029 3 172.31.32.194 TAG_NONE/400 4057 POST / - HIER_NONE/- text/html
I need a way to decode it.
Should I dig further into the Squid or LB setup, or should I go lower-level and read the TCP traffic directly?
Related
I need to put a web proxy in place to log user activity at work after a recent incident. My first thought was Squid proxy but after some research it seems that https requests are a total nightmare. These days more sites are https than http so I need to log both. Can anyone recommend a proxy server or otherwise to pass all http and https requests through to log?
Thanks
Squid can handle HTTPS traffic just as well as HTTP. How you should configure Squid depends on how you want to configure the clients (i.e., the browsers).
In general, Squid can be configured to listen for both HTTP and HTTPS traffic on a single port (3128 by default), with clients configured either manually or via DHCP Option 252 + WPAD (Web Proxy Auto-Discovery Protocol).
Alternatively, Squid can be configured in transparent mode, intercepting the traffic on your network; in that case Squid listens on separate ports for HTTP and HTTPS traffic.
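A minimal squid.conf sketch of the two modes (assuming a reasonably recent Squid; the CA certificate path is a placeholder, and intercept mode also needs firewall redirect rules that are not shown):

```
# Explicit proxy: browsers point at port 3128 for both http:// and https:// URLs
http_port 3128

# Transparent/intercept mode instead: separate ports, traffic redirected here
# by firewall rules (e.g. iptables REDIRECT); ssl-bump requires a local CA cert
# http_port 3129 intercept
# https_port 3130 intercept ssl-bump cert=/etc/squid/bump-ca.pem
```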
Shahnawaz
I already checked "Fiddler - tunnelled http requests to port 443" and "Fiddler2: Decrypt HTTPS traffic and Tunnel to host:443", but my question is different.
I do not want to use Fiddler as a proxy for another program. Instead, I simply want to use Fiddler's Composer tab to send an HTTPS request over an upstream proxy. My proxy configuration and authorization are correct; sending HTTP requests works just fine.
When I use Fiddler's Composer to send an HTTPS GET to https://google.com, it results in a time-out (HTTP 502 / [Fiddler] The connection to 'google.com' failed. Error: TimedOut (0x274c).).
When I send an HTTPS CONNECT to https://google.com, I get HTTP 502 / [Fiddler] DNS Lookup for failed.
Does anybody know how I can establish an HTTPS tunnel over my proxy and then send a GET request?
To establish the tunnel, you must send CONNECT to the proxy. You must also include the Host header, which duplicates the destination given in the CONNECT request line, e.g.
CONNECT www.google.com:443 HTTP/1.1
Host: www.google.com
etc
Once the tunnel is up (i.e. you get a 200 OK from the proxy), you need to perform the TLS handshake before you can send the HTTP request (which, since it now travels over TLS, is HTTPS), e.g.
GET / HTTP/1.1
Host: www.google.com
etc.
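The same sequence, done by hand outside Fiddler, looks roughly like this in Python (a sketch only: the proxy address is a placeholder, and it assumes a plain HTTP proxy without authentication):

```python
import socket
import ssl

def build_connect_request(host, port):
    """CONNECT request line plus a Host header duplicating the destination."""
    return (f"CONNECT {host}:{port} HTTP/1.1\r\n"
            f"Host: {host}\r\n\r\n").encode()

def get_via_proxy(proxy_host, proxy_port, host, path="/"):
    """Open a CONNECT tunnel through the proxy, then do TLS + GET inside it."""
    sock = socket.create_connection((proxy_host, proxy_port))
    sock.sendall(build_connect_request(host, 443))
    reply = sock.recv(4096)
    if b" 200 " not in reply.split(b"\r\n", 1)[0]:
        raise RuntimeError("proxy refused tunnel: %r" % reply[:80])
    # Only now start the TLS handshake, inside the established tunnel
    ctx = ssl.create_default_context()
    tls = ctx.wrap_socket(sock, server_hostname=host)
    tls.sendall(f"GET {path} HTTP/1.1\r\nHost: {host}\r\n"
                f"Connection: close\r\n\r\n".encode())
    chunks = []
    while True:
        data = tls.recv(4096)
        if not data:
            break
        chunks.append(data)
    return b"".join(chunks)
```

`get_via_proxy("10.0.0.1", 3128, "www.google.com")` would return the raw HTTPS response bytes, assuming the proxy allows CONNECT to port 443.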
I am looking to achieve the following with a Squid proxy setup: when a client sends a plain HTTP request (say, http://...), I would like my Squid proxy to turn it into HTTPS, send the request to the server on behalf of the client, and return the response to the client unencrypted.
[client]---- http -----[squid proxy] --------https-----[server]
I would like to do this only for a certain set of URLS (dynamic list).
Can this be achieved?
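One place such a rewrite is commonly attempted is Squid's url_rewrite_program, an external helper that receives each requested URL on stdin and answers with a (possibly rewritten) URL. A minimal sketch of such a helper, assuming the classic one-URL-per-line helper protocol; the allowlist contents and the way it is kept dynamic are placeholders:

```python
import sys

# Hypothetical dynamic list; in practice, reload this from a file or service.
UPGRADE_HOSTS = {"example.com", "www.example.com"}

def rewrite(url):
    """Return an https:// version of the URL if its host is on the list."""
    if url.startswith("http://"):
        rest = url[len("http://"):]
        host = rest.split("/", 1)[0].split(":", 1)[0]
        if host in UPGRADE_HOSTS:
            return "https://" + rest
    return url

def main():
    # Classic url_rewrite_program protocol: one request per line on stdin,
    # first field is the URL; answer with the (possibly rewritten) URL.
    # Wired up in squid.conf via: url_rewrite_program /path/to/this/helper
    for line in sys.stdin:
        if not line.strip():
            continue
        url = line.split()[0]
        sys.stdout.write(rewrite(url) + "\n")
        sys.stdout.flush()
```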
How can we enable Squid to cache web content (say, from Firefox) for SSL connections, i.e. for https URLs?
Actually Squid can be used to access HTTPS traffic - it is in essence a man-in-the-middle attack - and there are caveats:
See: http://wiki.squid-cache.org/Features/SslBump
I have not tried caching this data yet, so I can't say with absolute certainty that it will work. If/when I do, I'll update this post.
SSL encrypts the traffic between server and client so that it cannot be read by a man in the middle. When Squid is used as a proxy, it simply cannot see the actual content of the traffic and therefore has no means of caching it. The SSL traffic is just random bits that look different each time, even when the same content is transferred multiple times; that is how encryption should work. It simply cannot be cached.
I have no problems getting Firefox (version 23.0.1 on Windows) to route SSL traffic via Squid. In Firefox Network Connection settings I just point SSL Proxy and HTTP Proxy to the same Squid installation.
After that I can successfully access https URLs in Firefox and in Squid's access_log I see entries like these:
1379660084.878 115367 10.0.0.205 TCP_MISS/200 6581 CONNECT www.gravatar.com:443 - DIRECT/68.232.35.121 -
Do you have any details about how it doesn't work for you? Squid has quite complicated possibilities to deny and allow certain types of traffic, so it is possible there is a configuration issue in Squid. Do you get any error messages in Squid's logfiles?
What additional changes are required to make this simple HTTP request speak to an HTTPS-enabled server?
GET /index.php HTTP/1.1
Host: localhost
[CR]
[CR]
EDIT
To add some context: all I'm trying to do is open a TCP connection to port 443 and read the index page, but the server returns a 400 Bad Request along with the message "You're speaking plain HTTP to an SSL-enabled server port." I thought this probably meant altering the header in some fashion.
HTTP runs on top of the secured channel, so no adjustments are needed at the HTTP level at all. You need to encrypt the whole traffic going to the socket (after it leaves the HTTP client code) and decrypt the traffic coming from the socket before it reaches the HTTP client.
You encrypt the payload with key material negotiated with the server. This is done via a handshake on a server-by-server basis, so you can't just fake it once and have it work everywhere.
The payload includes the query string, cookies, form, etc.
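Concretely, the fix for the 400 above is to perform the TLS handshake on the port-443 socket before writing the request; the request bytes themselves stay unchanged. A minimal Python sketch (the host is a placeholder; certificate verification is left at the library defaults):

```python
import socket
import ssl

def build_request(host, path="/index.php"):
    """The same simple HTTP request -- unchanged at the HTTP level."""
    return (f"GET {path} HTTP/1.1\r\n"
            f"Host: {host}\r\n"
            f"Connection: close\r\n\r\n").encode()

def https_get(host, path="/index.php", port=443):
    """Open TCP to port 443, do the TLS handshake, then speak plain HTTP."""
    ctx = ssl.create_default_context()
    with socket.create_connection((host, port)) as raw:
        # wrap_socket performs the per-server handshake described above
        with ctx.wrap_socket(raw, server_hostname=host) as tls:
            tls.sendall(build_request(host, path))
            chunks = []
            while True:
                data = tls.recv(4096)
                if not data:
                    break
                chunks.append(data)
    return b"".join(chunks)
```

Sending `build_request(...)` over the bare socket reproduces the 400; sending it inside the wrapped socket is what makes it HTTPS.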