When I attempt to download from dl.google.com I receive this error:
ERROR: The certificate of `dl.google.com' is not trusted.
ERROR: The certificate of `dl.google.com' hasn't got a known issuer.
Here is the entire command output:
$ curl https://dl.google.com/dl/cloudsdk/release/install_google_cloud_sdk.bash | bash
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100  3607  100  3607    0     0   2820      0  0:00:01  0:00:01 --:--:--  3125
bash: line 77: [: Files: binary operator expected
wget -O - https://dl.google.com/dl/cloudsdk/release/google-cloud-sdk.tar.gz > tmp.4wwaU246zk/google-cloud-sdk.tar.gz
--2013-12-12 11:05:41--  https://dl.google.com/dl/cloudsdk/release/google-cloud-sdk.tar.gz
Resolving my.proxy.com (my.proxy.com)... x.x.x.x
Connecting to my.proxy.com (my.proxy.com)|x.x.x.x|:1234... connected.
ERROR: The certificate of `dl.google.com' is not trusted.
ERROR: The certificate of `dl.google.com' hasn't got a known issuer.
Reading this question: How do I fix certificate errors when running wget on an HTTPS URL in Cygwin?, one option is to "add the --no-check-certificate option on the wget command-line". But since I'm using curl instead of wget, is there a similar option for the above command?
Update: I've tried
curl -k https://dl.google.com/dl/cloudsdk/release/install_google_cloud_sdk.bash | bash
But I get the same error. Could the proxy/firewall be blocking the connection?
Original Answer:
You are looking for -k or (long form) --insecure. The man page is your friend ;):
-k, --insecure
(SSL) This option explicitly allows curl to perform "insecure" SSL connections and transfers.
All SSL connections are attempted to be made secure by using the CA certificate bundle installed
by default. This makes all connections considered "insecure" fail unless -k, --insecure is used.
See this online resource for further details: http://curl.haxx.se/docs/sslcerts.html
Edit after the question was updated:
You showed that you are already using the -k option here. I had a deeper look into your code and the task to be done:
You are trying to download a shell script from Google's servers. They will have a trusted certificate, which means you should remove the -k, as it is insecure (like the name says).
After the download you pipe the script directly to bash. So the first question is: did the download of the script succeed? (Can you post the script to some pastebin, so I can verify this?) I will go on explaining after these questions have been answered.
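In the meantime, a safer pattern than piping straight into bash is to download the script to a file first, so you can inspect what actually arrived (for example, whether the proxy returned an HTML error page instead of the script). A minimal sketch, assuming an intercepting proxy whose CA certificate lives at a hypothetical /path/to/proxy-ca.pem:
# Download first instead of piping to bash; -f makes curl fail on HTTP errors.
# --cacert points curl at the proxy's CA certificate (hypothetical path) if the
# proxy re-signs TLS traffic; omit it if that is not your setup.
curl -fsS --cacert /path/to/proxy-ca.pem \
     -o install_google_cloud_sdk.bash \
     https://dl.google.com/dl/cloudsdk/release/install_google_cloud_sdk.bash
head -n 20 install_google_cloud_sdk.bash   # inspect what actually arrived
bash install_google_cloud_sdk.bash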
I have been following the steps of the course's pre-work, including checking for, generating, copy/pasting, and saving the SSH keys to GitHub.
But when I am instructed to check the matching fingerprints using "ssh -T git@github.com", the prints don't match.
I've even started over from the beginning, but they still don't match.
Thought I'd reach out here before using my 1 tutoring.
Hopefully the screenshot showing what I see helps (link).
EDIT: I understand there's some stuff in there that shouldn't be; I was just trying things for different results. I would just like to know where I went wrong and how to avoid it.
What you see with ssh is the remote site's SSH host key fingerprint, not your registered SSH key fingerprint.
You see (or should see, if you are contacting the correct github.com) the fingerprints exposed at api.github.com/meta, as explained here.
Using jq, you can add them to your ~/.ssh/known_hosts with:
curl --silent https://api.github.com/meta \
| jq --raw-output '"github.com "+.ssh_keys[]' >> ~/.ssh/known_hosts
From there, you can test your connection with ssh -Tv git@github.com, and check if you see a welcome message:
Hi username!
You've successfully authenticated, but GitHub does not provide shell access
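If you want to compare the host key fingerprints by hand before trusting them, one possible check (a sketch using standard OpenSSH tools; ssh-keygen -lf - reading from stdin needs a reasonably recent OpenSSH) is:
# Fetch github.com's host keys and print their fingerprints,
# then compare them with the ones published at api.github.com/meta
ssh-keyscan github.com 2>/dev/null | ssh-keygen -lf -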
The following command works on the command line:
wget --secure-protocol=PFS -O dcm4chee-arc-5.15.1-mysql.zip https://sourceforge.net/projects/dcm4che/files/dcm4chee-arc-light5/5.15.1/dcm4chee-arc-5.15.1-mysql.zip/download
However, when I put the exact same line into a bash script (it's inside a function), it results in this error:
Resolving sourceforge.net (sourceforge.net)... 216.105.38.13
Connecting to sourceforge.net (sourceforge.net)|216.105.38.13|:443... connected.
Unable to establish SSL connection.
I've even pulled it out of the function to see if that makes any difference, but it doesn't.
Any thoughts?
Kicking myself... my IDE had #!/usr/bin/env bash at the top of the file, whereas changing it to simply #!/bin/bash made everything work as expected. I thank you all for the responses, crediting @Mihai with direct assistance due to the comment about "environment".
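For anyone hitting something similar: the two shebangs can resolve to different interpreters, since #!/usr/bin/env bash picks whichever bash is first on $PATH. A quick diagnostic sketch:
# Compare the bash that env finds on $PATH with /bin/bash
command -v bash               # path that /usr/bin/env bash would use
/usr/bin/env bash --version   # version picked up via $PATH
/bin/bash --version           # version the working shebang uses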
Trying to create a function to check if there are issues with the SSL on a webpage. In the specific scenario we've set up, the expected output for curl https://domain includes:
curl: (60) SSL certificate problem: self signed certificate
...we are using grep, as per the line below, to set the SSL_STATUS variable to that line, which we will then pump through an if statement. The problem is that it sets the variable and then drops out of the script for no apparent reason:
+ https_status
++ curl https://steelrain.eu
++ grep 'SSL certificate problem'
+ SSL_STATUS='curl: (60) SSL certificate problem: self signed certificate'
Having tested it, this is not the result of the grep but of curl, and I do not know why (it still occurred when using SSL_STATUS=$( curl https://${DOMAIN} ), which is the basis).
I might just not be understanding how something works here because I'm thick but any assistance would be appreciated.
SSL_STATUS=$( curl https://${DOMAIN} 2>&1 | grep "SSL certificate problem" )
I probably should have mentioned this before, but setting the function to just run the curl command drops it out of the script too, so it's not setting the output to a variable that's causing the trouble.
set -e was in the main script for debugging:
set -euox pipefail
Commented out and now it's sorted. Cheers!
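For reference, under set -e a non-zero exit from the command substitution (curl failing, or grep finding no match) aborts the script before the if statement ever runs. If you want to keep strict mode on, one hedged workaround is to tolerate the failure explicitly:
# || true swallows the non-zero exit so set -e / pipefail do not abort here
SSL_STATUS=$( curl "https://${DOMAIN}" 2>&1 | grep "SSL certificate problem" || true )
if [ -n "$SSL_STATUS" ]; then
    echo "SSL problem detected: $SSL_STATUS"
fi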
I have a bash script that downloads some files from an FTP server. The problem is that sometimes curl returns error 6 (can't resolve host), seemingly at random! I can open the FTP site via a web browser without any problem. I also noticed that most of the errors occur on the first downloads. Any ideas?
Also, I wanted to know how I can make curl retry the download when these errors occur.
Code I used:
curl -m 60 --retry 10 --retry-delay 10 --ftp-method multicwd -C - ftp://some_address/some_file --output ./some_file
Note: I also tried the command without --ftp-method multicwd.
OS: CentOS 6.5 64bit
while [ "$ret" != "0" ]; do curl [your options]; ret=$?; sleep 5; done
Assuming those are transient problems with the server and/or DNS, looping might be of some help. This is a particularly good case for the rarely used (?) until loop:
until curl [your options]; do sleep 5; done
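If the failures can persist, an unbounded loop may never terminate; a bounded variant might look like this (the attempt count and delay are arbitrary, and the curl options are taken from the question):
# Give up after 10 failed attempts instead of looping forever
attempts=0
until curl -m 60 --ftp-method multicwd -C - ftp://some_address/some_file --output ./some_file; do
    attempts=$((attempts + 1))
    if [ "$attempts" -ge 10 ]; then
        echo "giving up after $attempts attempts" >&2
        break
    fi
    sleep 5
done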
In addition, if using curl is not mandatory, wget might be better suited for "unreliable" network connections. From the man page:
GNU Wget is a free utility for non-interactive download of files from
the Web. It supports HTTP, HTTPS, and FTP protocols, as well as
retrieval through HTTP proxies.
[...]
Wget has been designed for robustness over slow or unstable network connections; if a download fails due to
a network problem, it will keep retrying until the whole file has been retrieved. If the server supports
regetting, it will instruct the server to continue the download from where it left off.
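As a sketch, a wget equivalent of the curl command from the question (flag values are illustrative) could be:
# --tries caps retry attempts, --waitretry backs off between them,
# and --continue resumes a partially downloaded file
wget --tries=10 --waitretry=10 --continue \
     ftp://some_address/some_file -O ./some_file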
My provider currently only provides FTPS as a means of uploading files to the server.
Now I want to publish files from Jenkins to that server. I can access the server using an FTP client that supports FTPS, but neither of the FTP-Publisher plugins seems to be able to publish using FTPS.
The only reference for FTPS and Jenkins that I found was this open bug.
I know that SSH would be a good option, but since my hosting provider does not support it, I wonder how I can efficiently upload files to my server through Jenkins.
My Jenkins server runs on OS X.
Update: According to my own answer below, I tried cURL but got a generic error:
curl -v -T index.html ftps://myusername:mypassword@myserver.com:21/www/
* Adding handle: conn: 0x7fa9d500cc00
* Adding handle: send: 0
* Adding handle: recv: 0
* Curl_addHandleToPipeline: length: 1
* Conn 0 (0x7fa9d500cc00) send_pipe: 1, recv_pipe: 0
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
* About to connect() to myserver.com port 21 (#0)
*   Trying xx.xx.xx.xx...
* Connected to myserver.com (xx.xx.xx.xx) port 21 (#0)
* Unknown SSL protocol error in connection to myserver.com:-9800
* Closing connection 0
curl: (35) Unknown SSL protocol error in connection to myserver.com:-9800
There are currently no Jenkins plugins that will handle FTPS (FTP over SSL). Instead, the cURL program is capable of uploading with FTPS.
First, check that cURL is installed on the Jenkins host.
On a Linux environment, try the command:
which curl
Now ensure that cURL is in the path for the Jenkins user account. Alternatively, fully qualify the path to cURL.
Now, using a post-build step, a task, or the Promoted Builds plugin, add a shell script that contains the following:
FILEPATH=$WORKSPACE/path/to/some/file
REMOTEPATH=/new/path/for/file
curl -T $FILEPATH -u username:password ftps://myserver.com$REMOTEPATH
Correct the $FILEPATH and $REMOTEPATH to reflect the environment.
Example:
FILEPATH=$WORKSPACE/index.html
REMOTEPATH=/www/index.html
If a self-signed certificate is in use on the remote host, then cURL needs to skip verification. This is done with the -k parameter.
curl -T $FILEPATH -u username:password -k ftps://myserver.com$REMOTEPATH
One way of uploading might be to do this via cURL. This is not the best of options, since I would rather use a Jenkins plugin, but at least it would allow me to do it for the time being.
From the cURL docs:
UPLOADING
FTP / FTPS / SFTP / SCP
Upload all data on stdin to a specified server:
curl -T - ftp://ftp.upload.com/myfile
Upload data from a specified file, login with user and password:
curl -T uploadfile -u user:passwd ftp://ftp.upload.com/myfile
Upload a local file to the remote site, and use the local file name at the remote site too:
curl -T uploadfile -u user:passwd ftp://ftp.upload.com/
Upload a local file to get appended to the remote file:
curl -T localfile -a ftp://ftp.upload.com/remotefile
Note that using FTPS:// as prefix is the "implicit" way as described in the
standards while the recommended "explicit" way is done by using FTP:// and
the --ftp-ssl option.
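So, assuming the server expects explicit FTPS on port 21 (which would match the "Unknown SSL protocol error" seen when forcing implicit ftps://), the upload from the answer above would become something like:
# Explicit FTPS: start as plain FTP, then upgrade the connection via AUTH TLS
curl -T $FILEPATH -u username:password --ftp-ssl ftp://myserver.com$REMOTEPATH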