Using lftp to upload to an FTP site gives "501 Insufficient disk space" - ftp

I'm new to using FTP and recently I came across this really weird situation.
I was trying to upload a file to someone else's FTP site, and I tried to use this command:
lftp -e "set ftp:passive-mode true; put /dir/to/myfile -o dest_folder/`basename /dir/to/myfile`; bye" ftp://userName:passWord@ftp.site.com
but i got the error
put: Access failed: 501 Insufficient disk space : only 0 bytes available. (To dest_folder/myfile)
and when I log on to their site and check, a 0-byte file with myfile's name has been uploaded.
At first I thought the FTP site was out of disk space, but I then tried logging on to the site using
lftp userName:passWord@ftp.site.com
and then set passive mode
set ftp:passive-mode true
and then uploaded the file (under another name)
put /dir/to/myfile_1 -o dest_folder/`basename /dir/to/myfile_1`
this time the file was successfully uploaded without the 501 insufficient disk space error.
Does anyone know why this happens? Thanks!

You might try using lftp -d to enable debug/verbose mode. Some FTP clients use the ALLO FTP command to tell the FTP server to "allocate" some amount of bytes in advance; the FTP server can then accept or reject that. I suspect that lftp is sending ALLO to your FTP server, and that it is the FTP server responding to that ALLO command with a 501 response code, causing your issue.
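For example, a debug run might look like this (a sketch using the placeholder URL and paths from the question):

```
# -d prints the full client/server dialogue, so you can see whether the
# server answers an ALLO command with a 501 before the transfer starts.
lftp -d -e "set ftp:passive-mode true; put /dir/to/myfile -o dest_folder/myfile; bye" \
    "ftp://userName:passWord@ftp.site.com"
```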
Per updates/comments, the OP confirmed that lftp's use of ALLO was indeed resulting in the initially reported behaviors. Subsequent errors happened because lftp was attempting to update the timestamp of the uploaded file; these attempts were also being rejected by the FTP server. lftp had tried using the MFMT and SITE UTIME FTP commands.
To disable those, and to get lftp to succeed for the OP, the following lftp settings were needed:
ftp:trust-feat no
ftp:use-allo no
ftp:use-feat no
ftp:use-site-utime no
ftp:use-site-utime2 no
With these settings, you should be able to have lftp upload a file without using the ALLO command beforehand, and without trying to modify the server-side timestamp of the uploaded file using MFMT or SITE UTIME.
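Putting those settings together, a one-shot non-interactive upload might look like this (a sketch using the placeholders from the question):

```
# Disable FEAT/ALLO and the timestamp-update commands that this
# particular server rejects, then upload one file and exit.
lftp -e "
set ftp:passive-mode true;
set ftp:use-feat no;
set ftp:trust-feat no;
set ftp:use-allo no;
set ftp:use-site-utime no;
set ftp:use-site-utime2 no;
put /dir/to/myfile -o dest_folder/myfile;
bye" "ftp://userName:passWord@ftp.site.com"
```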
Hope this helps!

Related

Downloading files with lftp seems unaware of a full disk on my side. How can I fix that?

Using lftp I download files from a remote server (mget -E) to my local server. When my local disk is full, I would expect lftp to get an error from the OS (CentOS 7) and retry or complete the download later. Instead, lftp just keeps writing 0-byte files on my side. Is there anything I can do to make lftp stop with an error when my local disk has 0 bytes free?
Maybe:
Settings: On startup, lftp executes ~/.lftprc and ~/.lftp/rc (or ~/.config/lftp/rc if ~/.lftp does not exist). You can place aliases and 'set' commands there. Some people prefer to see full protocol debug: use 'debug' to turn on debugging.
In Settings -> xfer:disk-full-fatal (boolean) - when true, lftp aborts a transfer if it cannot write the target file to disk because of full disk or quota; when false, lftp waits for disk space to be freed.
This is from the (lengthy) lftp manual; one version here: https://linux.die.net/man/1/lftp
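If your lftp version supports that setting, a one-line rc entry should be enough (a sketch; check your version's manual for the default value):

```
# ~/.lftprc: abort the transfer on a full local disk instead of
# silently writing 0-byte files while waiting for space to be freed.
set xfer:disk-full-fatal true
```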

Transferring large files using SFTP using linux bash scripts

I intend to send a huge file, around 1+ GB, over to the remote side using SFTP. It seems to work fine in interactive mode (when I run sftp user@xx.xx.xx.xx, enter the password manually, then key in the put command), but when I run it from a shell script it always times out.
I have set the client and server ClientAliveTimeout settings in /etc/ssh/sshd_config, but it still occurs.
Below is the Linux script code:
sshpass -p "password" sftp user@xx.xx.xx.xx << END
put <local file path> <remote file path>
exit
END
The transfer of the files takes 10 minutes in interactive mode;
when run from the script, the file is incomplete, based on its file size.
Update: the current transfer in interactive mode shows that the small files go through, but the big file stalls halfway through the transfer.
I prefer lftp for such things
lftp -u user,passwd domain.tld -e "put /path/file; quit"
lftp can handle sftp too
open sftp://username:password@server.address.com
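A non-interactive SFTP upload with lftp could then be sketched as follows (host, credentials, and paths are placeholders):

```
# lftp handles retries and resuming on its own, which helps with
# large transfers that stall halfway through.
lftp -u user,password sftp://xx.xx.xx.xx -e "put /local/path/bigfile -o /remote/path/bigfile; bye"
```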

How to convert Windows FTP script to WinSCP?

I need to use WinSCP in my legacy vb6 code. I always used my script like this:
open ftp.myserver.com
myusername
mypassword
passive on
cd myfolder\
ascii
put C:\temp\test.xml test.xml
close
quit
Similar script (with little change) always worked for sslftp, psftp, etc.
Now I need to create a script to make this work with WinSCP, but it keeps throwing a "host not found" error. I'm not even running the script yet; I'm trying it in the command window.
winscp> open ftp.myserver.com
Searching for host...
Network error: Connection timed out.
The same FTP server works with the regular command-line ftp:
ftp> open ftp.myserver.com
Connected to myserver.
220 Gene6 FTP Server v3.10.0
User (...): myuser
331 Password required for myuser
Password:
230 User manager logged in.
How do I run WinSCP? The documentation doesn't show any such example.
WinSCP defaults to the SFTP protocol on port 22. If you want to use FTP, you need to specify that explicitly.
Also, the username and password go into the session URL as part of the open command, not on separate lines. Passive mode is specified using the -passive=on switch.
open ftp://myusername:mypassword@ftp.myserver.com -passive=on
The ascii mode is specified using the -transfer=ascii switch of the put command (though the separate ascii command is also understood for compatibility):
put -transfer=ascii C:\temp\test.xml test.xml
It's exit, not quit.
See the complete guide for converting Windows FTP script to WinSCP.
You should also read the guide to automating file transfers to FTP server.
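Putting the pieces together, the whole original script could be converted to a WinSCP script file along these lines (a sketch; the server name and paths are the ones from the question):

```
open ftp://myusername:mypassword@ftp.myserver.com -passive=on
cd myfolder
put -transfer=ascii C:\temp\test.xml test.xml
exit
```

Save it as, say, script.txt and run it with winscp.com /script=script.txt.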

Cronjob not executing the Shell Script completely

This is Srikanth from Hyderabad.
I am a Linux administrator at a corporate company. We have a Squid server, so I prepared a backup Squid server that I can put into production when the live Squid server goes down.
My Squid servers run CentOS 5.5. I have prepared a script that backs up all configuration files in /etc/squid/ of the live server to the backup server, i.e. it copies all files from the live server's /etc/squid/ to the backup server's /etc/squid/.
Here's the script, saved as squidbackup.sh in the directory /opt/ with permissions 755 (rwxr-xr-x):
#! /bin/sh
username="<username>"
password="<password>"
host="Server IP"
expect -c "
spawn /usr/bin/scp -r $username@$host:/etc/squid /etc/
expect {
    \"*password:*\" {
        send \"$password\r\"
        interact
    }
    eof {
        exit
    }
}
"
** Kindly note that this will be executed on the backup server, which will check for the user mentioned in the script. I have created a user on the live server and put the same in the script.
When I execute the script with the command below
[root@localhost ~]# sh /opt/squidbackup.sh
everything works fine: the script downloads all the files from /etc/squid/ of the live server to /etc/squid/ of the backup server.
Now the problem arises. If I set this in crontab as below (or with other timings)
50 23 * * * sh /opt/squidbackup.sh
I don't know what's wrong, but it does not download all the files, i.e. the cron job downloads only a few files from /etc/squid/ of the live server to /etc/squid/ of the backup server.
**Only a few files are downloaded when cron executes the script; if I run the script manually, it downloads all files perfectly without any errors or warnings.
If you have any more questions, please go ahead and post them.
I kindly request any solutions that are available.
Thank you in advance.
Thanks for your interest. I have tried what you said; it shows the output below, but previously I used to get the same output mailed to the user on the Squid backup server.
The cron logs show the same, but I was not able to work out the exact error from the lines below.
Please note that only a few files are downloaded when run from cron.
spawn /usr/bin/scp -r <username>@ServerIP:/etc/squid /etc/
<username>@ServerIP's password:
Kindly check if you can suggest anything else.
Try the simple options first. Capture stdout and stderr as shown below; these files should point to the problem.
Looking at the script, you may need to specify the full path to expect; that could be an issue.
50 23 * * * sh /opt/squidbackup.sh >/tmp/cronout.log 2>&1
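If expect or scp is not on cron's minimal PATH, setting it in the crontab itself is a simple fix (a sketch; adjust the paths to your system):

```
# crontab entry: give the job a sane PATH and log everything.
PATH=/bin:/usr/bin:/usr/local/bin
50 23 * * * /bin/sh /opt/squidbackup.sh >/tmp/cronout.log 2>&1
```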

How would I construct a terminal command to download a folder with wget from a Media Temple (gs) server?

I'm trying to download a folder using wget in the Terminal (I'm using a Mac, if that matters) because my FTP client sucks and keeps timing out; it doesn't stay connected for long. So I was wondering if I could use wget to connect over the FTP protocol to the server and download the directory in question. I have searched around the internet and attempted to write the command, but it keeps failing. So, assuming the following:
ftp username is: serveradmin@mydomain.ca
ftp host is: ftp.s12345.gridserver.com
ftp password is: somepassword
I have tried to write the command in the following ways:
wget -r ftp://serveradmin@mydomain.ca:somepassword@s12345.gridserver.com/path/to/desired/folder/
wget -r ftp://serveradmin:somepassword@s12345.gridserver.com/path/to/desired/folder/
When I try the first way I get this error:
Bad port number.
When I try the second way I get a little further but I get this error:
Resolving s12345.gridserver.com... 71.46.226.79
Connecting to s12345.gridserver.com|71.46.226.79|:21... connected.
Logging in as serveradmin ...
Login incorrect.
What could I be doing wrong?
Use scp on the Mac instead; it will probably work much better.
scp -r user@mediatemplehost.net:/folder/path /local/path
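If you want to stay with wget, another option (assuming GNU wget) is to pass the @-containing username via options instead of embedding it in the URL, where the extra @ and : confuse the URL parser:

```
# --ftp-user/--ftp-password keep the credentials out of the URL, so the
# '@' in the username cannot be mistaken for the host separator.
wget -r --ftp-user='serveradmin@mydomain.ca' --ftp-password='somepassword' \
    'ftp://s12345.gridserver.com/path/to/desired/folder/'
```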
