Publish to FTPS using Jenkins - macOS

My provider currently only provides FTPS as a means of uploading files to the server.
Now I want to publish files from Jenkins to that server. I can access the server using an FTP client that supports FTPS, but neither of the FTP-Publisher plugins seems to be able to publish using FTPS.
The only reference for FTPS and Jenkins that I found was this open bug.
I know that SSH would be a good option, but since my hosting provider does not support it, I wonder how I can efficiently upload files to my server through Jenkins.
My Jenkins server runs on OS X.
Update: As per my own answer below, I tried cURL but got a generic error:
curl -v -T index.html ftps://myusername:mypassword@myserver.com:21/www/
* Adding handle: conn: 0x7fa9d500cc00
* Adding handle: send: 0
* Adding handle: recv: 0
* Curl_addHandleToPipeline: length: 1
* - Conn 0 (0x7fa9d500cc00) send_pipe: 1, recv_pipe: 0
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
* About to connect() to myserver.com port 21 (#0)
*   Trying xx.xx.xx.xx...
* Connected to myserver.com (xx.xx.xx.xx) port 21 (#0)
* Unknown SSL protocol error in connection to myserver.com:-9800
* Closing connection 0
curl: (35) Unknown SSL protocol error in connection to myserver.com:-9800

There are currently no Jenkins plugins that handle FTPS (FTP over SSL). However, the cURL program is capable of uploading over FTPS.
First check that cURL is installed on the Jenkins host.
On a Linux environment, try the command:
which curl
Now ensure that cURL is in the path for the Jenkins user account. Alternatively fully qualify the path to cURL.
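For example, a minimal sanity check at the top of the shell step might look like the sketch below (the /usr/bin/curl path is only an assumption for a typical OS X install; adjust as needed):
# Fail the build early if the Jenkins user cannot find curl
command -v curl >/dev/null 2>&1 || { echo "curl not found in PATH"; exit 1; }
# Or fully qualify the binary instead of relying on PATH
CURL=/usr/bin/curl
"$CURL" --version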
Now, using a post-build step, task, or the Promoted Builds plugin, add a shell script that contains the following:
FILEPATH="$WORKSPACE/path/to/some/file"
REMOTEPATH=/new/path/for/file
# Quote the variables so paths containing spaces do not break the command
curl -T "$FILEPATH" -u username:password "ftps://myserver.com$REMOTEPATH"
Correct the $FILEPATH and $REMOTEPATH to reflect the environment.
Example:
FILEPATH=$WORKSPACE/index.html
REMOTEPATH=/www/index.html
If a self-signed certificate is in use on the remote host, cURL needs to skip verification. This is done with the -k parameter.
curl -T "$FILEPATH" -u username:password -k "ftps://myserver.com$REMOTEPATH"
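Putting it all together, a post-build shell step could look roughly like this sketch (FTP_USER and FTP_PASS are placeholder variables for credentials injected by Jenkins, and --ftp-create-dirs is only needed if the remote directory may not exist yet):
#!/bin/bash
FILEPATH="$WORKSPACE/index.html"
REMOTEPATH=/www/index.html
# Upload the file; keep -k only if the server uses a self-signed certificate
curl -T "$FILEPATH" -u "$FTP_USER:$FTP_PASS" -k --ftp-create-dirs \
     "ftps://myserver.com$REMOTEPATH" || exit 1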

One way of uploading might be to do this via cURL. It is not the best option, since I would rather use a Jenkins plugin, but at least it lets me upload for the time being.
From the cURL docs:
UPLOADING
FTP / FTPS / SFTP / SCP
Upload all data on stdin to a specified server:
curl -T - ftp://ftp.upload.com/myfile
Upload data from a specified file, login with user and password:
curl -T uploadfile -u user:passwd ftp://ftp.upload.com/myfile
Upload a local file to the remote site, and use the local file name at the remote site too:
curl -T uploadfile -u user:passwd ftp://ftp.upload.com/
Upload a local file to get appended to the remote file:
curl -T localfile -a ftp://ftp.upload.com/remotefile
Note that using FTPS:// as prefix is the "implicit" way as described in the
standards while the recommended "explicit" way is done by using FTP:// and
the --ftp-ssl option.
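As a sketch of the explicit variant for the upload above (note that newer curl releases rename --ftp-ssl to --ssl, with --ssl-reqd making TLS mandatory rather than merely attempted):
# Explicit FTPS: connect over plain FTP on port 21, then upgrade to TLS
curl -T index.html -u myusername:mypassword --ssl-reqd ftp://myserver.com/www/
# On older curl versions use --ftp-ssl (opportunistic) or --ftp-ssl-reqd (required)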

Related

using lftp to upload to ftp site got 501 Insufficient disk space

I'm new to using FTP and recently I came across this really weird situation.
I was trying to upload a file to someone else's FTP site, and I tried to use this command:
lftp -e "set ftp:passive-mode true; put /dir/to/myfile -o dest_folder/`basename /dir/to/myfile`; bye" ftp://userName:passWord@ftp.site.com
but I got the error:
put: Access failed: 501 Insufficient disk space : only 0 bytes available. (To dest_folder/myfile)
and when I log on to their site and check, a 0-byte file named myfile has been uploaded.
At first I thought the FTP site was out of disk space, but then I tried logging on to the site using
lftp userName:passWord@ftp.site.com
and then set passive mode
set ftp:passive-mode true
and then uploaded the file (using another name):
put /dir/to/myfile_1 -o dest_folder/`basename /dir/to/myfile_1`
This time the file was successfully uploaded without the 501 insufficient disk space error.
Does anyone know why this happens? Thanks!
You might try using lftp -d, to enable the debug/verbose mode. Some FTP clients use the ALLO FTP command, to tell the FTP server to "allocate" some amount of bytes in advance; the FTP server can then accept/reject that. I suspect that lftp is sending ALLO to your FTP server, and it is the FTP server responding to that ALLO command with a 501 response code, causing your issue.
Per updates/comments, the OP confirmed that lftp's use of ALLO was indeed resulting in the initially reported behaviors. Subsequent errors happened because lftp was attempting to update the timestamp of the uploaded file; these attempts were also being rejected by the FTP server. lftp had tried using the MFMT and SITE UTIME FTP commands.
To disable those, and to get lftp to succeed for the OP, the following lftp settings were needed:
ftp:trust-feat no
ftp:use-allo no
ftp:use-feat no
ftp:use-site-utime no
ftp:use-site-utime2 no
With these settings, you should be able to have lftp upload a file without using the ALLO command beforehand, and without trying to modify the server-side timestamp of the uploaded file using MFMT or SITE UTIME.
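For reference, those settings could be folded into a one-shot invocation like the one in the question, roughly as follows (credentials and paths are the placeholders from the question):
lftp -e "set ftp:passive-mode true; set ftp:use-allo no; set ftp:use-feat no; set ftp:trust-feat no; set ftp:use-site-utime no; set ftp:use-site-utime2 no; put /dir/to/myfile -o dest_folder/myfile; bye" ftp://userName:passWord@ftp.site.com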
Hope this helps!

curl returns error (6) occasionally

I have a bash script that downloads some files from an FTP server. The problem is that sometimes curl returns error 6 (couldn't resolve host), seemingly at random. I can open the FTP site in a web browser without any problem. I also noticed that most of the errors occur on the first downloads. Any ideas?
I also wanted to know how I can make curl retry the download when these errors occur.
Code I used:
curl -m 60 --retry 10 --retry-delay 10 --ftp-method multicwd -C - ftp://some_address/some_file --output ./some_file
Note: I also tried the command without --ftp-method multicwd.
OS: CentOS 6.5 64bit
while [ "$ret" != "0" ]; do curl [your options]; ret=$?; sleep 5; done
Assuming those are transient problems with the server and/or DNS, looping might be of some help. This is a particularly good case for the rarely used (?) until loop:
until curl [your options]; do sleep 5; done
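If an endless loop is a concern, a bounded variant of the same idea (a sketch, reusing the command from the question) might look like:
attempts=0
until curl -m 60 --ftp-method multicwd -C - ftp://some_address/some_file --output ./some_file; do
  attempts=$((attempts + 1))
  # Give up after 10 failed attempts instead of looping forever
  [ "$attempts" -ge 10 ] && { echo "giving up after $attempts attempts"; exit 1; }
  sleep 5
done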
In addition, if using curl is not mandatory, wget might be better suited for "unreliable" network connections. From the man page:
GNU Wget is a free utility for non-interactive download of files from the Web. It supports HTTP, HTTPS, and FTP protocols, as well as retrieval through HTTP proxies.
[...]
Wget has been designed for robustness over slow or unstable network connections; if a download fails due to a network problem, it will keep retrying until the whole file has been retrieved. If the server supports regetting, it will instruct the server to continue the download from where it left off.
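For instance, a rough wget equivalent of the curl command above might be (untested; the flags come straight from the wget man page):
# --tries/--waitretry control retries, --continue resumes a partial download
wget --tries=10 --waitretry=10 --continue ftp://some_address/some_file -O ./some_file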

How to fix certificate errors when using curl?

When I attempt to download from dl.google.com, I receive this error:
ERROR: The certificate of `dl.google.com' is not trusted.
ERROR: The certificate of `dl.google.com' hasn't got a known issuer.
Here is the entire command output:
$ curl https://dl.google.com/dl/cloudsdk/release/install_google_cloud_sdk.bash | bash
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100  3607  100  3607    0     0   2820      0  0:00:01  0:00:01 --:--:--  3125
bash: line 77: [: Files: binary operator expected
wget -O - https://dl.google.com/dl/cloudsdk/release/google-cloud-sdk.tar.gz > tmp.4wwaU246zk/google-cloud-sdk.tar.gz
--2013-12-12 11:05:41--  https://dl.google.com/dl/cloudsdk/release/google-cloud-sdk.tar.gz
Resolving my.proxy.com (my.proxy.com)... x.x.x.x
Connecting to my.proxy.com (my.proxy.com)|x.x.x.x|:1234... connected.
ERROR: The certificate of `dl.google.com' is not trusted.
ERROR: The certificate of `dl.google.com' hasn't got a known issuer.
Reading this question: How do I fix certificate errors when running wget on an HTTPS URL in Cygwin?, one option is to "add the --no-check-certificate option on the wget command-line", but since I'm using curl instead of wget, is there a similar option for the above command?
Update: I've tried
curl -k https://dl.google.com/dl/cloudsdk/release/install_google_cloud_sdk.bash | bash
But I get the same error. Could the proxy/firewall be blocking the connection?
Original Answer:
You are searching for -k or (long) --insecure. The man page is your friend ;):
-k, --insecure
(SSL) This option explicitly allows curl to perform "insecure" SSL connections and transfers.
All SSL connections are attempted to be made secure by using the CA certificate bundle installed
by default. This makes all connections considered "insecure" fail unless -k, --insecure is used.
See this online resource for further details: http://curl.haxx.se/docs/sslcerts.html
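If the certificate errors are caused by a corporate proxy that re-signs TLS traffic, a safer alternative to -k is to point curl at the proxy's CA certificate with --cacert (the .pem path below is only a placeholder):
curl --cacert /path/to/proxy-ca.pem https://dl.google.com/dl/cloudsdk/release/install_google_cloud_sdk.bash | bash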
Edit after the question was updated:
You showed that you are already using the -k option here. I had a deeper look at your code and the task at hand:
You are trying to download a shell script from Google's servers. They will have a trusted certificate, which means you should remove the -k, as it is insecure (as the name suggests).
After the download, you pipe the script directly to bash. So the first question is: did the download of the script succeed? (Can you post the script to a pastebin so I can verify this?) I will go on explaining after these questions have been answered.

How to debug an SSH tunnel

I want to setup a simple ssh tunnel from a local machine to a machine on the internet.
I'm using
ssh -D 8080 -f -C -q -N -p 12122 <username>@<hostname>
The setup works fine (I think) because ssh returns after asking for the credentials, which I provide.
Then I do
export http_proxy=http://localhost:8080
and
wget http://www.google.com
Wget reports that the request has been sent to the proxy, but no data comes back.
What I need is a way to see how ssh is processing the request...
To get more information out of your SSH connection for debugging, leave out the -q and -f options, and include -vvv:
ssh -D 8080 -vvv -N -p 12122 <username>@<hostname>
To address your actual problem: by using ssh -D you're essentially setting up a SOCKS proxy, which I believe is not supported by default in wget.
You might have better luck with curl, which provides SOCKS support via the --socks4/--socks5 options.
If you really, really need to use wget, you'll have to recompile your own version to include SOCKS support. There should be a ./configure option along the lines of --with-socks.
Alternatively, look into tsocks, which can intercept outgoing network connections and redirect them through a SOCKS server.
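For example, once the ssh -D 8080 tunnel is up, a quick way to test it with curl might be (a sketch; --socks5-hostname also pushes DNS resolution through the tunnel):
curl --socks5-hostname localhost:8080 http://www.google.com/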

How would I construct a terminal command to download a folder with wget from a Media Temple (gs) server?

I'm trying to download a folder using wget in the Terminal (I'm using a Mac, if that matters) because my FTP client sucks and keeps timing out. It doesn't stay connected for long. So I was wondering if I could use wget to connect via the FTP protocol to the server to download the directory in question. I have searched around on the internet for this and have attempted to write the command, but it keeps failing. So, assuming the following:
ftp username is: serveradmin@mydomain.ca
ftp host is: ftp.s12345.gridserver.com
ftp password is: somepassword
I have tried to write the command in the following ways:
wget -r ftp://serveradmin@mydomain.ca:somepassword@s12345.gridserver.com/path/to/desired/folder/
wget -r ftp://serveradmin:somepassword@s12345.gridserver.com/path/to/desired/folder/
When I try the first way I get this error:
Bad port number.
When I try the second way I get a little further but I get this error:
Resolving s12345.gridserver.com... 71.46.226.79
Connecting to s12345.gridserver.com|71.46.226.79|:21... connected.
Logging in as serveradmin ...
Login incorrect.
What could I be doing wrong?
Use scp on the Mac instead; it will probably work much better.
scp -r user#mediatemplehost.net:/folder/path /local/path
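If FTP is still needed, one untested workaround is to pass the credentials through wget's --ftp-user and --ftp-password options, so the @ in the username no longer confuses the URL parser:
wget -r --ftp-user='serveradmin@mydomain.ca' --ftp-password='somepassword' ftp://s12345.gridserver.com/path/to/desired/folder/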
