Gradle download timeout/retry

I am on a flaky network (or there is some kind of proxy or virus checker in the way), so my Gradle dependency downloads (external module dependencies from mavenCentral()) sometimes hang.
A local repository would help, but are there any settings for timeouts and retries?
The download starts, then hangs, and only times out after the default socket timeout.
I can emulate this with wget:
wget -d http://repo1.maven.org/maven2/org/apache/santuario/xmlsec/1.5.2/xmlsec-1.5.2-sources.jar

DEBUG output created by Wget 1.11.4 on Windows-MSVC.
--2013-01-23 13:52:01--  http://repo1.maven.org/maven2/org/apache/santuario/xmlsec/1.5.2/xmlsec-1.5.2-sources.jar
Resolving repo1.maven.org... seconds 0.00, 68.232.34.223
Caching repo1.maven.org => 68.232.34.223
Connecting to repo1.maven.org|68.232.34.223|:80... seconds 0.00, connected.
Created socket 352.
Releasing 0x003311d0 (new refcount 1).
---request begin---
GET /maven2/org/apache/santuario/xmlsec/1.5.2/xmlsec-1.5.2-sources.jar HTTP/1.0
User-Agent: Wget/1.11.4
Accept: */*
Host: repo1.maven.org
Connection: Keep-Alive
---request end---
HTTP request sent, awaiting response...
---response begin---
HTTP/1.0 200 OK
Accept-Ranges: bytes
Content-Type: application/java-archive
Date: Wed, 23 Jan 2013 12:52:01 GMT
Last-Modified: Mon, 14 May 2012 08:47:03 GMT
Server: ECAcc (lhr/4ABA)
X-Cache: HIT
Content-Length: 577534
Connection: keep-alive
---response end---
200 OK
Registered socket 352 for persistent reuse.
Length: 577534 (564K) [application/java-archive]
Saving to: `xmlsec-1.5.2-sources.jar.1'
 5% [=>                                    ] 33,328      --.-K/s  eta 17m 52s
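At the wget level, the fail-fast-and-retry behavior being asked for looks like this, using wget's standard --timeout, --tries, and --waitretry options:

wget --timeout=10 --tries=5 --waitretry=2 http://repo1.maven.org/maven2/org/apache/santuario/xmlsec/1.5.2/xmlsec-1.5.2-sources.jar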
I would like it to time out faster and retry the download.
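For reference, later Gradle releases (4.3 and newer, well after this question was asked) added internal system properties for exactly these HTTP timeouts. A sketch of passing them on the command line; these properties are internal and version-dependent, and the retry property in particular should be treated as an assumption:

# org.gradle.internal.http.* timeouts are in milliseconds (Gradle 4.3+, internal);
# the max.retries property is reported for newer versions and is an assumption here
gradle build \
  -Dorg.gradle.internal.http.connectionTimeout=15000 \
  -Dorg.gradle.internal.http.socketTimeout=15000 \
  -Dorg.gradle.internal.repository.max.retries=5

The same settings can be made permanent in gradle.properties with a systemProp. prefix, e.g. systemProp.org.gradle.internal.http.socketTimeout=15000.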

Related

wget gives 403 on accessible files

First time poster with a bizarre issue. I usually install software through conda, but from one moment to the next I stopped being able to use conda install because of a 403 error conda gets when trying to access some configuration files. When I try to download those files with wget --spider --debug https://conda.anaconda.org/anaconda/noarch/current_repodata.json, I get the same 403 error.
DEBUG output created by Wget 1.19.4 on linux-gnu.
Reading HSTS entries from /home/jsequeira/.wget-hsts
URI encoding = ‘UTF-8’
Converted file name 'current_repodata.json' (UTF-8) -> 'current_repodata.json' (UTF-8)
Spider mode enabled. Check if remote file exists.
--2020-07-30 11:25:59-- https://conda.anaconda.org/anaconda/noarch/current_repodata.json
Resolving conda.anaconda.org (conda.anaconda.org)... 104.17.92.24, 104.17.93.24, 2606:4700::6811:5d18, ...
Caching conda.anaconda.org => 104.17.92.24 104.17.93.24 2606:4700::6811:5d18 2606:4700::6811:5c18
Connecting to conda.anaconda.org (conda.anaconda.org)|104.17.92.24|:443... connected.
Created socket 5.
Releasing 0x000056545deb1850 (new refcount 1).
Initiating SSL handshake.
Handshake successful; connected socket 5 to SSL handle 0x000056545deb2700
certificate:
subject: CN=anaconda.org,O=Cloudflare\, Inc.,L=San Francisco,ST=CA,C=US
issuer: CN=Cloudflare Inc ECC CA-3,O=Cloudflare\, Inc.,C=US
X509 certificate successfully verified and matches host conda.anaconda.org
---request begin---
HEAD /anaconda/noarch/current_repodata.json HTTP/1.1
User-Agent: Wget/1.19.4 (linux-gnu)
Accept: */*
Accept-Encoding: identity
Host: conda.anaconda.org
Connection: Keep-Alive
---request end---
HTTP request sent, awaiting response...
---response begin---
HTTP/1.1 403 Forbidden
Date: Thu, 30 Jul 2020 11:25:59 GMT
Content-Type: text/html; charset=UTF-8
Connection: close
CF-Chl-Bypass: 1
Set-Cookie: __cfduid=d3cd3a67d3926551371d8ffe5a840b04f1596108359; expires=Sat, 29-Aug-20 11:25:59 GMT; path=/; domain=.anaconda.org; HttpOnly; SameSite=Lax
Cache-Control: private, max-age=0, no-store, no-cache, must-revalidate, post-check=0, pre-check=0
Expires: Thu, 01 Jan 1970 00:00:01 GMT
X-Frame-Options: SAMEORIGIN
cf-request-id: 044111dd9600005d4732b73200000001
Expect-CT: max-age=604800, report-uri="https://report-uri.cloudflare.com/cdn-cgi/beacon/expect-ct"
Vary: Accept-Encoding
Server: cloudflare
CF-RAY: 5baeb8dc2ba65d47-LIS
---response end---
403 Forbidden
cdm: 1
Stored cookie anaconda.org -1 (ANY) / <permanent> <insecure> [expiry 2020-08-29 11:25:59] __cfduid d3cd3a67d3926551371d8ffe5a840b04f1596108359
URI content encoding = ‘UTF-8’
Closed 5/SSL 0x000056545deb2700
Remote file does not exist -- broken link!!!
These files are accessible through the browser, and were always accessible with wget and conda until yesterday, when I was installing some tools unrelated to these network accesses. How can wget fail to download them?
So this was fixed by reinstalling apt-get. Some configuration file there must have been messed up.
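For anyone hitting the same thing: the CF-Chl-Bypass: 1 and Server: cloudflare response headers above indicate a Cloudflare browser challenge rather than a plain permission error. A quick (and not guaranteed) way to check whether the block keys on client identification is to retry with a browser-like User-Agent, using wget's standard --user-agent option:

wget --spider --user-agent='Mozilla/5.0 (X11; Linux x86_64)' https://conda.anaconda.org/anaconda/noarch/current_repodata.json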

New cabal install repeating error

I've just installed the full Haskell Platform from https://www.haskell.org/platform/windows.html onto a 64-bit Windows 7 machine, following step 3 from that page.
I had to uninstall 8.0.2 before installing this version, 8.2.1.
Whatever command I run with cabal now gives the same error. Please see the sequence of commands and errors below; this was done in an empty directory:
D:\test>cabal init
dieVerbatim: user error (cabal: Failed to download
http://objects-us-west-1.dream.io/hackage-mirror/root.json : No Status Code
could be parsed from response: --17:29:30--
http://objects-us-west-1.dream.io/hackage-mirror/root.json
=> `C:\Users\BEN~1.CRA\AppData\Local\Temp\transportAdapterGet570528145'
Connecting to objects-us-west-1.dream.io:80... connected!
HTTP request sent, awaiting response... 200 OK
Content-Length: 3850
Accept-Ranges: bytes
Last-Modified: Mon, 12 Sep 2016 12:14:29 GMT
ETag: "c5688ef68afb3f6186d35162423bd8c6"
x-amz-request-id: tx0000000000000003f6055-0059e0e9ea-19c1b67c-default
Content-Type: application/json
Date: Fri, 13 Oct 2017 16:29:30 GMT
Connection: keep-alive
0K ... 100% # 3.67 MB/s
17:29:30 (3.67 MB/s) -
`C:\Users\BEN~1.CRA\AppData\Local\Temp\transportAdapterGet570528145' saved
[3850/3850]
FINISHED --17:29:30--
Downloaded: 3,850 bytes in 1 files
)
D:\test>cabal configure
dieVerbatim: user error (cabal: Failed to download
http://objects-us-west-1.dream.io/hackage-mirror/root.json : No Status Code
could be parsed from response: ...)

D:\test>cabal install cabal-install
dieVerbatim: user error (cabal: Failed to download
http://objects-us-west-1.dream.io/hackage-mirror/root.json : No Status Code
could be parsed from response: ...)

(Both print the same wget transcript as cabal init above; only the timestamps and the transportAdapterGet temp-file names differ.)
The temporary files it refers to do not exist. I am able to access dream.io from my machine using Chrome.
I am new to Haskell/cabal development, so apologies if I'm missing something obvious.
Reposting the correct answer as community wiki for posterity:
https://github.com/haskell/cabal/issues/4747#issuecomment-327888643 has some suggested workarounds, including cabal --http-transport=plain-http update and adding wget or curl to the Path.
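A sketch of the first workaround; the command itself is taken from the issue above, and the config-file variant assumes a default Windows install where the cabal user config lives at %APPDATA%\cabal\config:

cabal --http-transport=plain-http update

To make it permanent, set http-transport: plain-http in that config file instead of passing the flag each time.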

How to cache API calls via Varnish?

My goal is to cache RESTful API calls via Varnish. As I was reading on Stack Overflow and other resources, Varnish cannot cache POST requests, which is exactly what I am experiencing. So I moved to GET with ?id=30, but then I realized those do not get cached either, because of the question mark.
So the question is: how to cache API calls via Varnish?
Here are two example calls to my API, secured by OAuth2, with two parameters passed by POST:
curl --insecure -s -k https://test-api/v1/test -d 'access_token=72f50e68a0aed7921c6cb058de8e7e6ed4ebd692&clid=585970' -D- -o/dev/null
HTTP/1.1 200 OK
Date: Sat, 29 Aug 2015 13:02:36 GMT
Server: Apache/2.2.31 (Unix) PHP/5.6.11
X-Powered-By: PHP/5.6.11
Vary: Accept-Encoding
Content-Length: 2121
Content-Type: application/json
X-Varnish: 6558010
Age: 0
Via: 1.1 varnish-v4
Accept-Ranges: bytes
Set-Cookie: SERVERID=S3; path=/
curl --insecure -s -k https://test-api/v1/test -d 'access_token=72f50e68a0aed7921c6cb058de8e7e6ed4ebd692&clid=585970' -D- -o/dev/null
HTTP/1.1 200 OK
Date: Sat, 29 Aug 2015 13:02:56 GMT
Server: Apache/2.2.31 (Unix) PHP/5.6.11
X-Powered-By: PHP/5.6.11
Vary: Accept-Encoding
Content-Length: 2121
Content-Type: application/json
X-Varnish: 12814168
Age: 0
Via: 1.1 varnish-v4
Accept-Ranges: bytes
Set-Cookie: SERVERID=S2; path=/
Is it possible to configure Varnish to cache these API calls? POST or GET, either way I don't mind.
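One clue in the traces above: the single X-Varnish ID and Age: 0 on both calls mean both requests went to the backend, and both responses carry Set-Cookie: SERVERID=... Varnish's built-in VCL will not cache a response that sets a cookie, and it passes POST bodies to the backend untouched; contrary to the question-mark theory above, stock Varnish does cache GETs with query strings. A minimal check, assuming (hypothetically; the real API may only read POST bodies) the backend also accepts the parameters as a query string:

curl --insecure -s -D- -o /dev/null \
  'https://test-api/v1/test?access_token=72f50e68a0aed7921c6cb058de8e7e6ed4ebd692&clid=585970'

If a second identical call shows Age > 0 (or two request IDs in X-Varnish), the response is being cached; if not, the Set-Cookie header likely has to be stripped in vcl_backend_response before Varnish will store it.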

Parse Security: able to see request/response in plain-text via proxy server

I'm attempting to hack the Parse SDK, and it seems that we are able to see requests and responses in plain text via a proxy server between the app and Parse. I assumed the data was encrypted, but a malicious user is able to see our requests and modify them, essentially pulling out all of our user information.
Does anyone have any ideas on this?
Here is an example of a custom request and response via Proxy:
POST /1/classes/_User HTTP/1.1
Host: api.parse.com
Content-Type: application/json; charset=utf8
Cookie: _parse_session=---
Accept: */*
Proxy-Connection: keep-alive
X-Parse-Application-Id: ---
X-Parse-Client-Key: ---
X-Parse-Installation-Id: ---
Accept-Encoding: gzip, deflate
X-Parse-OS-Version: 8.2 (12D508)
Accept-Language: en-us
X-Parse-Client-Version: i1.6.5
Content-Length: 51
Connection: keep-alive
X-Parse-App-Build-Version: 11
X-Parse-App-Display-Version: 1.0.0
{"where":{"email":"joe#joe.com"},"_method":"GET"}
HTTP/1.1 200 OK
Access-Control-Allow-Methods: *
Access-Control-Allow-Origin: *
Content-Type: application/json; charset=utf-8
Date: Fri, 10 Apr 2015 01:02:55 GMT
Server: nginx/1.6.0
X-Parse-Platform: G1
X-Runtime: 0.013113
Content-Length: 246
Connection: keep-alive
{"results":[{"company":"","createdAt":"2015-04-10T01:02:35.670Z","discoverable":true,"email":"joe#joe.com","firstName":"Joe","lastName":"Smith","objectId":"yPTx1kyHei","title":"","updatedAt":"2015-04-10T01:02:35.670Z","username":"joe#joe.com"}]}

JMeter HTTP Cache Manager: unable to cache everything that the browser caches

I used the HTTP Cache Manager to cache the files that are cached in the browser, and for some pages I am successful: the number of files cached by JMeter equals the number cached by the browser.
But in some cases I found the number of files cached is smaller than in the browser: JMeter caches only 5 files where the real browser caches 12.
Below are the headers for one file that is cached in Chrome but not in JMeter.
Header in Chrome Browser:
Remote Address:
Request URL:
Request Method:GET
Status Code:304 Not Modified
Request Headers:
Accept:image/webp,*/*;q=0.8
Accept-Encoding:gzip,deflate,sdch
Accept-Language:en-GB,en;q=0.8,it-CH;q=0.6,it;q=0.4,ar;q=0.2
Cache-Control:max-age=0
Connection:keep-alive
Cookie:
Host:
If-Modified-Since:Thu, 16 Jan 2014 16:38:32 GMT
If-None-Match:W/"1242-1389890312000"
Referer:
User-Agent:Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/35.0.1916.114 Safari/537.36
Response Headers:
Cache-Control:private
Connection:keep-alive
Date:Wed, 11 Jun 2014 09:57:49 GMT
ETag:W/"1242-1389890312000"
Expires:Thu, 01 Jan 1970 00:00:00 GMT
Server:
Header in JMeter:
Thread Name: Thread Group 1-2
Sample Start: 2014-06-11 15:18:56 IST
Load time: 326
Latency: 326
Size in bytes: 1541
Headers size in bytes: 299
Body size in bytes: 1242
Sample Count: 1
Error Count: 0
Response code: 200
Response message: OK
Response headers:
HTTP/1.1 200 OK
Accept-Ranges: bytes
Cache-Control: private
Content-Type: image/png
Date: Wed, 11 Jun 2014 09:48:53 GMT
ETag: W/"1242-1389890312000"
Expires: Thu, 01 Jan 1970 00:00:00 GMT
Last-Modified: Thu, 16 Jan 2014 16:38:32 GMT
Server:
Content-Length: 1242
Connection: keep-alive
Thanks in advance
Have you tried ticking the "Use Cache-Control/Expires header when processing GET requests" box? It simulates real browser behavior: matching content is returned immediately, without an actual request being made.
Another possible reason is exceeding the "Max Number of elements in cache" threshold, which defaults to 5000.
See the "Using the HTTP Cache Manager" guide for further explanations and recommendations.
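One detail visible in the two captures above: Chrome sends a conditional GET (If-Modified-Since / If-None-Match) and receives 304 Not Modified, while JMeter sends an unconditional GET and receives 200 with the full body, which is consistent with the resource not being in JMeter's cache. The conditional request can be reproduced with curl (the URL is elided in the capture, so the one below is a placeholder):

# placeholder URL; substitute the actual image request from the capture
curl -s -D- -o /dev/null \
  -H 'If-Modified-Since: Thu, 16 Jan 2014 16:38:32 GMT' \
  -H 'If-None-Match: W/"1242-1389890312000"' \
  http://example.com/image.png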
