New cabal install repeating error - Windows

I've just installed the full Haskell Platform from https://www.haskell.org/platform/windows.html onto a 64-bit Windows 7 machine, following step 3 from that page.
I had to uninstall 8.0.2 before installing this version, 8.2.1.
Whatever command I run with cabal now gives the same error. Please see the sequence of commands and errors below; this was done in an empty directory:
D:\test>cabal init
dieVerbatim: user error (cabal: Failed to download
http://objects-us-west-1.dream.io/hackage-mirror/root.json : No Status Code
could be parsed from response: --17:29:30--
http://objects-us-west-1.dream.io/hackage-mirror/root.json
=> `C:\Users\BEN~1.CRA\AppData\Local\Temp\transportAdapterGet570528145'
Connecting to objects-us-west-1.dream.io:80... connected!
HTTP request sent, awaiting response... 200 OK
2 Content-Length: 3850
3 Accept-Ranges: bytes
4 Last-Modified: Mon, 12 Sep 2016 12:14:29 GMT
5 ETag: "c5688ef68afb3f6186d35162423bd8c6"
6 x-amz-request-id: tx0000000000000003f6055-0059e0e9ea-19c1b67c-default
7 Content-Type: application/json
8 Date: Fri, 13 Oct 2017 16:29:30 GMT
9 Connection: keep-alive
10
0K ... 100% # 3.67 MB/s
17:29:30 (3.67 MB/s) -
`C:\Users\BEN~1.CRA\AppData\Local\Temp\transportAdapterGet570528145' saved
[3850/3850]
FINISHED --17:29:30--
Downloaded: 3,850 bytes in 1 files
)
D:\test>cabal configure
dieVerbatim: user error (cabal: Failed to download
http://objects-us-west-1.dream.io/hackage-mirror/root.json : No Status Code
could be parsed from response: --17:29:35--
http://objects-us-west-1.dream.io/hackage-mirror/root.json
=> `C:\Users\BEN~1.CRA\AppData\Local\Temp\transportAdapterGet570528145'
Connecting to objects-us-west-1.dream.io:80... connected!
HTTP request sent, awaiting response... 200 OK
2 Content-Length: 3850
3 Accept-Ranges: bytes
4 Last-Modified: Mon, 12 Sep 2016 12:14:29 GMT
5 ETag: "c5688ef68afb3f6186d35162423bd8c6"
6 x-amz-request-id: tx0000000000000001249f3-0059e0e9f0-19c8c27c-default
7 Content-Type: application/json
8 Date: Fri, 13 Oct 2017 16:29:36 GMT
9 Connection: keep-alive
10
0K ... 100% # 3.67 MB/s
17:29:36 (3.67 MB/s) -
`C:\Users\BEN~1.CRA\AppData\Local\Temp\transportAdapterGet570528145' saved
[3850/3850]
FINISHED --17:29:36--
Downloaded: 3,850 bytes in 1 files
)
D:\test>cabal install cabal-install
dieVerbatim: user error (cabal: Failed to download
http://objects-us-west-1.dream.io/hackage-mirror/root.json : No Status Code
could be parsed from response: --17:29:45--
http://objects-us-west-1.dream.io/hackage-mirror/root.json
=> `C:\Users\BEN~1.CRA\AppData\Local\Temp\transportAdapterGet299511942'
Connecting to objects-us-west-1.dream.io:80... connected!
HTTP request sent, awaiting response... 200 OK
2 Content-Length: 3850
3 Accept-Ranges: bytes
4 Last-Modified: Mon, 12 Sep 2016 12:14:29 GMT
5 ETag: "c5688ef68afb3f6186d35162423bd8c6"
6 x-amz-request-id: tx0000000000000003f626b-0059e0e9f9-19c1b67c-default
7 Content-Type: application/json
8 Date: Fri, 13 Oct 2017 16:29:45 GMT
9 Connection: keep-alive
10
0K ... 100% # 3.67 MB/s
17:29:45 (3.67 MB/s) -
`C:\Users\BEN~1.CRA\AppData\Local\Temp\transportAdapterGet299511942' saved
[3850/3850]
FINISHED --17:29:45--
Downloaded: 3,850 bytes in 1 files
)
The temporary files it refers to do not exist, and I am able to access dream.io from my machine using Chrome.
I am new to Haskell/cabal development, so apologies if I'm missing something obvious.

Reposting the correct answer as community wiki for posterity:
https://github.com/haskell/cabal/issues/4747#issuecomment-327888643 has some suggested workarounds, including cabal --http-transport=plain-http update and adding wget or curl to the PATH, as sketched below.
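A minimal sketch of those workarounds on a Windows cmd prompt; the config file location and the http-transport field name are assumptions about a default cabal install, so verify them against your setup:
REM Workaround 1: force the plain-http transport for a single run
cabal --http-transport=plain-http update
REM Workaround 2 (assumption): make it permanent by adding the line
REM     http-transport: plain-http
REM to the cabal config file, typically %APPDATA%\cabal\config
REM Workaround 3: put an existing wget or curl install on the PATH so
REM cabal can use it as its transport (C:\Tools\wget is hypothetical)
set PATH=C:\Tools\wget;%PATH%
cabal update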

Related

webhdfs is redirecting to localhost:50075

I am trying to create a file on a remote HDFS from a non-Hadoop environment.
For this purpose, I am using the pywebhdfs API and running commands with curl.
https://pythonhosted.org/pywebhdfs/
I used this documentation as a reference; I am able to execute all other methods except create_file().
While using create_file(), I get an error: 'Couldn't connect to host'
Command: curl -i -X PUT -L "http://xxx.xxx.xxx.xxx:50070/webhdfs/v1/test1/?op=CREATE" -T sample.txt
Response:
HTTP/1.1 100 Continue
HTTP/1.1 307 TEMPORARY_REDIRECT
Cache-Control: no-cache
Expires: Tue, 30 Oct 2018 12:04:04 GMT
Date: Tue, 30 Oct 2018 12:04:04 GMT
Pragma: no-cache
Expires: Tue, 30 Oct 2018 12:04:04 GMT
Date: Tue, 30 Oct 2018 12:04:04 GMT
Pragma: no-cache
Content-Type: application/octet-stream
Location: http://localhost:50075/webhdfs/v1/test1/?op=CREATE&namenoderpcaddress=xxx.xxx.xxx.xxx:9000&overwrite=false
Content-Length: 0
Server: Jetty(6.1.26)
curl: (7) couldn't connect to host
The Location header points to localhost here. I took the past post
webhdfs always redirect to localhost:50075
as a reference, but didn't have any success.
I tried changing the IP in hdfs-site.xml and in the /etc/hosts file, but with no success at all.
Can anyone tell me how to fix this?
Thanks in advance.
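For reference, WebHDFS file creation is a two-step protocol, and performing the steps by hand with curl makes the bad redirect visible. A minimal sketch, assuming the datanode is reachable from your client under its real address (dn1.example.com below is a hypothetical stand-in for whatever should replace localhost):
# Step 1: ask the namenode where to write; do NOT follow the redirect
# (-L), so the Location header can be inspected and corrected by hand
curl -i -X PUT "http://xxx.xxx.xxx.xxx:50070/webhdfs/v1/test1/?op=CREATE"
# Step 2: send the data to the URL from Location, with the unusable
# localhost host replaced by the datanode's real address
curl -i -X PUT -T sample.txt "http://dn1.example.com:50075/webhdfs/v1/test1/?op=CREATE&namenoderpcaddress=xxx.xxx.xxx.xxx:9000&overwrite=false"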

WebHDFS OPEN command returns empty results

I created a simple file in HDFS at the path /user/admin/foo.txt
I can see the contents of this file in Hue.
Now I issue the command:
curl -i http://namenode:50070/webhdfs/v1/user/admin/foo.txt?op=OPEN
I get the response
HTTP/1.1 307 TEMPORARY_REDIRECT
Cache-Control: no-cache
Expires: Tue, 24 Nov 2015 16:20:15 GMT
Date: Tue, 24 Nov 2015 16:20:15 GMT
Pragma: no-cache
Expires: Tue, 24 Nov 2015 16:20:15 GMT
Date: Tue, 24 Nov 2015 16:20:15 GMT
Pragma: no-cache
Location: http://datanode:50075/webhdfs/v1/user/admin/foo.txt?op=OPEN&namenoderpcaddress=nameservice1&offset=0
Content-Type: application/octet-stream
Content-Length: 0
Server: Jetty(6.1.26.cloudera.4)
Why is the Content-Length 0? I was hoping this would list the contents of the file.
Execute (quoting the URL, since an unquoted & is interpreted by the shell):
curl -i "http://datanode:50075/webhdfs/v1/user/admin/foo.txt?op=OPEN&namenoderpcaddress=nameservice1&offset=0"
As for the explanation - when using WebHDFS to open a file you have to do the following:
You don't know which node the file resides on, so you ask the namenode.
The namenode returns you a datanode containing the file.
You can then open the file itself by talking directly to the datanode.
So this activity is expected. See https://hadoop.apache.org/docs/r1.0.4/webhdfs.html for more information.
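If you don't need to see the intermediate redirect, curl can chain the two steps itself; a minimal sketch, assuming the datanode hostname in the Location header resolves from your machine:
# -L follows the 307 to the datanode and prints the file contents
curl -L "http://namenode:50070/webhdfs/v1/user/admin/foo.txt?op=OPEN"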

JMeter HTTP Cache Manager: unable to cache everything that is cached by the browser

I used the HTTP Cache Manager to cache the files that the browser caches, and for some pages I am successful: the number of files cached in JMeter equals the number cached by the browser.
But in some cases the number of files cached is smaller: using JMeter I found only 5 files being cached, while the real browser caches 12.
Here are the headers for one file that is cached in Chrome but not in JMeter.
Headers in Chrome:
Remote Address:
Request URL:
Request Method:GET
Status Code:304 Not Modified
Request Headers:
Accept:image/webp,*/*;q=0.8
Accept-Encoding:gzip,deflate,sdch
Accept-Language:en-GB,en;q=0.8,it-CH;q=0.6,it;q=0.4,ar;q=0.2
Cache-Control:max-age=0
Connection:keep-alive
Cookie:
Host:
If-Modified-Since:Thu, 16 Jan 2014 16:38:32 GMT
If-None-Match:W/"1242-1389890312000"
Referer:
User-Agent:Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/35.0.1916.114 Safari/537.36
Response Headers:
Cache-Control:private
Connection:keep-alive
Date:Wed, 11 Jun 2014 09:57:49 GMT
ETag:W/"1242-1389890312000"
Expires:Thu, 01 Jan 1970 00:00:00 GMT
Server:
Headers in JMeter:
Thread Name: Thread Group 1-2
Sample Start: 2014-06-11 15:18:56 IST
Load time: 326
Latency: 326
Size in bytes: 1541
Headers size in bytes: 299
Body size in bytes: 1242
Sample Count: 1
Error Count: 0
Response code: 200
Response message: OK
Response headers:
HTTP/1.1 200 OK
Accept-Ranges: bytes
Cache-Control: private
Content-Type: image/png
Date: Wed, 11 Jun 2014 09:48:53 GMT
ETag: W/"1242-1389890312000"
Expires: Thu, 01 Jan 1970 00:00:00 GMT
Last-Modified: Thu, 16 Jan 2014 16:38:32 GMT
Server:
Content-Length: 1242
Connection: keep-alive
Thanks in advance
Have you tried ticking the Use Cache-Control/Expires header when processing GET requests box? It simulates real browser behavior: matching content is returned immediately, without an actual request being made.
Another possible reason is exceeding the Max Number of elements in cache threshold, which defaults to 5000.
See the Using the HTTP Cache Manager guide for further explanations and recommendations.
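The header dumps above also show why Chrome and JMeter differ: Chrome sends a conditional GET (If-Modified-Since / If-None-Match) and receives a cheap 304, while a JMeter sampler with no cached entry sends an unconditional GET and receives the full 200. You can reproduce the 304 by hand; a sketch with curl, where http://example.com/image.png is a hypothetical stand-in for the blanked-out request URL:
# Replay Chrome's conditional request; a correctly configured server
# answers 304 Not Modified with an empty body
curl -i 'http://example.com/image.png' \
  -H 'If-Modified-Since: Thu, 16 Jan 2014 16:38:32 GMT' \
  -H 'If-None-Match: W/"1242-1389890312000"'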

Gradle download timeout/retry

I am on a flaky network (or there is some kind of proxy or virus checker in the way), so my Gradle dependency downloads (external module dependencies from mavenCentral()) sometimes hang.
A local repo would help, but are there any settings for timeouts and retries?
The download starts, then hangs, and times out after the default socket timeout.
I can emulate this with wget:
wget -d http://repo1.maven.org/maven2/org/apache/santuario/xmlsec/1.5.2/xmlsec-1.5.2-sources.jar
DEBUG output created by Wget 1.11.4 on Windows-MSVC.
--2013-01-23 13:52:01--  http://repo1.maven.org/maven2/org/apache/santuario/xmlsec/1.5.2/xmlsec-1.5.2-sources.jar
Resolving repo1.maven.org... seconds 0.00, 68.232.34.223
Caching repo1.maven.org => 68.232.34.223
Connecting to repo1.maven.org|68.232.34.223|:80... seconds 0.00, connected.
Created socket 352.
Releasing 0x003311d0 (new refcount 1).
---request begin---
GET /maven2/org/apache/santuario/xmlsec/1.5.2/xmlsec-1.5.2-sources.jar HTTP/1.0
User-Agent: Wget/1.11.4
Accept: */*
Host: repo1.maven.org
Connection: Keep-Alive
---request end---
HTTP request sent, awaiting response...
---response begin---
HTTP/1.0 200 OK
Accept-Ranges: bytes
Content-Type: application/java-archive
Date: Wed, 23 Jan 2013 12:52:01 GMT
Last-Modified: Mon, 14 May 2012 08:47:03 GMT
Server: ECAcc (lhr/4ABA)
X-Cache: HIT
Content-Length: 577534
Connection: keep-alive
---response end---
200 OK
Registered socket 352 for persistent reuse.
Length: 577534 (564K) [application/java-archive]
Saving to: `xmlsec-1.5.2-sources.jar.1'
 5% [=>                                 ] 33,328      --.-K/s  eta 17m 52s
I would like it to time out faster and retry the download.
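Gradle had no public timeout/retry settings for dependency downloads at the time. Newer Gradle versions (roughly 4.3 and later) recognize internal JVM system properties for this; these are undocumented internals and an assumption to verify against your Gradle version:
# Shorter HTTP connect/read timeouts (milliseconds) for downloads
gradle build -Dorg.gradle.internal.http.connectionTimeout=15000 -Dorg.gradle.internal.http.socketTimeout=15000
# Retrying transient failures (later versions still; also internal)
gradle build -Dorg.gradle.internal.repository.max.retries=5 -Dorg.gradle.internal.repository.initial.backoff=500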

GZIP for CSS and JS doesn't work in Google Chrome, but does in Firefox

I have recently figured out how to enable GZIP (or Deflate) on the WAMP server that I will be using to serve my intranet application.
However, when testing in Google Chrome I see that the PHP file is compressed but the JavaScript and CSS files are not. The response headers show that they are not compressed, and Google PageSpeed confirms this.
Firefox, on the other hand, receives all files compressed without a problem.
Here are the headers for the main CSS file as an example:
Google Chrome
Date: Wed, 18 Jul 2012 14:48:43 GMT
Content-Length: 6533
Last-Modified: Wed, 18 Jul 2012 00:42:33 GMT
Server: Apache/2.2.21 (Win64) PHP/5.3.10
ETag: "a00000001509b-1985-4c50ff04b26ef"
Vary: Accept-Encoding
Content-Type: text/css
Accept-Ranges: bytes
200 OK
Firefox
Date: Wed, 18 Jul 2012 14:33:14 GMT
Server: Apache/2.2.21 (Win64) PHP/5.3.10
Last-Modified: Wed, 18 Jul 2012 00:42:33 GMT
Etag: "a00000001509b-1985-4c50ff04b26ef"
Accept-Ranges: bytes
Vary: Accept-Encoding
Content-Encoding: gzip
Content-Length: 1273
Content-Type: text/css
200 OK
Is this a problem with my WAMP setup, code, or is it just Google Chrome?
Thank you.
Google Chrome does support gzip for JS/CSS:
Request URL:http://d1o7y22ifnbryp.cloudfront.net/static/7793/css/all.min.css
Request Method:GET
Status Code:200 OK
Request Headers:
Accept:text/css,*/*;q=0.1
Accept-Charset:GBK,utf-8;q=0.7,*;q=0.3
Accept-Encoding:gzip,deflate,sdch
Accept-Language:zh-CN,zh;q=0.8
Connection:keep-alive
Host:d1o7y22ifnbryp.cloudfront.net
User-Agent:Mozilla/5.0 (Windows NT 6.1) AppleWebKit/536.5 (KHTML, like Gecko) Chrome/19.0.1084.52 Safari/536.5
Response Headers
Accept-Ranges:bytes
Cache-Control:max-age=604800
Connection:keep-alive
Content-Encoding:gzip
Content-Length:17933
Content-Type:text/css
Date:Wed, 18 Jul 2012 15:43:46 GMT
ETag:"805dc-19c87-4c4a305735340"
Expires:Wed, 25 Jul 2012 15:43:46 GMT
Last-Modified:Thu, 12 Jul 2012 14:45:57 GMT
Server:Apache/2.2.22 (Amazon)
Vary:Accept-Encoding
I think the problem is in your Apache configuration.
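One way to take the browser out of the equation is to replay the request with curl and inspect the response headers directly; a sketch, where http://yourserver/css/main.css is a hypothetical stand-in for your stylesheet URL:
# Ask for the stylesheet exactly as a gzip-capable browser would,
# dumping the response headers and discarding the body
curl -s -D - -o /dev/null -H "Accept-Encoding: gzip,deflate,sdch" "http://yourserver/css/main.css"
If Content-Encoding: gzip is missing here too, the problem is on the server side, e.g. text/css and application/javascript missing from mod_deflate's AddOutputFilterByType directive, rather than anything Chrome does.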
