On FTP server, delete files older than x days

I have access to an FTP server, which has files stored in an xyz folder.
I need to delete the files in the xyz folder on the remote FTP server that are more than x days old.
So far I have not arrived at any concrete solution.
Thanks,
Rosh
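A sketch of one way to script this with Python's ftplib. The host, credentials, and retention period below are placeholders, and the approach assumes the server supports the MLSD command (RFC 3659), which reports a machine-readable modification timestamp for each entry:

```python
from datetime import datetime, timedelta
from ftplib import FTP


def delete_old_files(ftp, directory, max_age_days):
    """Delete regular files in `directory` older than `max_age_days`.

    Uses MLSD (RFC 3659), whose 'modify' fact is a UTC timestamp of
    the form YYYYMMDDHHMMSS; not every FTP server supports MLSD.
    """
    ftp.cwd(directory)
    cutoff = datetime.utcnow() - timedelta(days=max_age_days)
    deleted = []
    for name, facts in ftp.mlsd():
        if facts.get("type") != "file":
            continue  # skip subdirectories and special entries
        modified = datetime.strptime(facts["modify"][:14], "%Y%m%d%H%M%S")
        if modified < cutoff:
            ftp.delete(name)
            deleted.append(name)
    return deleted


# Typical usage (hypothetical host/credentials):
#   ftp = FTP("ftp.example.com")
#   ftp.login("user", "secret")
#   print(delete_old_files(ftp, "xyz", 7))
#   ftp.quit()
```

If the server does not support MLSD, the fallback is parsing `LIST` output, which is not standardized and far more fragile.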

Related

Nifi: Failed to retrieve directory listing when connecting to FTP server

I have a ListenFTP processor opened on a port, and when I try to connect to it via FileZilla I get the error "Failed to retrieve directory listing".
The connection seems to be established first, but then this error occurs.
NiFi is hosted on an Ubuntu server and runs in a Docker container.
The ListenFTP processor is opened on port 2221.
I tried to change some configuration in FileZilla based on this issue, but nothing worked.
The connection works well on localhost; I can connect to the FTP server and transfer files.
Does someone have an idea how to solve this?
If you look at the documentation of the processor, it states that
"After starting the processor and connecting to the FTP server, an
empty root directory is visible in the client application. Folders can
be created in and deleted from the root directory and any of its
subdirectories. Files can be uploaded to any directory. Uploaded files
do not show in the content list of directories, since files are not
actually stored on this FTP server, but converted into FlowFiles and
transferred to the next processor via the 'success' relationship. It
is not possible to download or delete files like on a regular FTP
server. All the folders (including the root directory) are virtual
directories, meaning that they only exist in memory and do not get
created in the file system of the host machine. Also, these
directories are not persisted: by restarting the processor all the
directories (except for the root directory) get removed. Uploaded
files do not get removed by restarting the processor, since they are
not stored on the FTP server, but transferred to the next processor as
FlowFiles."

How can I zip and transfer files using PuTTY in Windows?

I have a problem transferring files from one server to another. I tried using PuTTY's PSCP; it worked the first time from local to server. What I'm trying to do is zip all the files and then transfer them to another server. What commands should I use to achieve this?
pscp -P 22 test.zip root@domain.com:/root
This line works when transferring from local to a remote server. However, I want to compress files on one server and send them to another remote server, or at least remote to local, then local to remote, whichever is possible. I cannot easily compress the files first because they are almost 50 GB in total, so I am searching for a much faster way to achieve this.
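One faster pattern, assuming SSH access between the servers (the host and paths below are made up): compress on the source server and stream the archive straight to the destination over SSH, so the ~50 GB never needs an intermediate copy on disk. A small Python helper to build that pipeline safely:

```python
import shlex


def remote_zip_transfer_cmd(src_dir, dest_host, dest_path):
    """Build a shell pipeline that compresses src_dir on the machine
    where it runs and streams the archive to dest_host over SSH --
    no intermediate archive is written locally."""
    remote = shlex.quote(f"cat > {dest_path}")
    return f"tar czf - {shlex.quote(src_dir)} | ssh {dest_host} {remote}"


cmd = remote_zip_transfer_cmd("/var/www/files", "root@domain.com",
                              "/root/backup.tar.gz")
print(cmd)
# Run the printed command on the source server, e.g. via
# subprocess.run(cmd, shell=True, check=True).
```

On Windows, PuTTY's `plink` can play the role of `ssh` in the same pipeline.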

Remote duplicate on FTP server

I have server 1 (running Ubuntu), hosting a website.
I have server 2 (running Windows Server 2012), where some applications are running and where I have space for my backups.
Server 1 has limited space, so I keep backups of both my MySQL database and web server files for 1 week only (daily backups).
When doing my daily backup, the script does the following:
- back up MySQL to a file (mysqldump)
- compress the web server root folder to a tar.gz
- push both generated files to an FTP server (6 GB total)
- clean up files older than the retention period
Now I want to add a step for a stronger backup policy on server 2 (keep dailies for 10 days, weeklies for 5 weeks, monthlies for a year, and yearlies forever). Each backup interval has its own folder (i.e. a Daily folder, a Weekly folder, a Monthly folder, and a Yearly folder).
I want my backup file copied into both the Daily and Weekly folders every Sunday (each of them being cleaned per the policy explained above, by a separate scheduled task), but I do not want to FTP it twice. Basically, I want server 1 to trigger a copy from \Server2\Daily to \Server2\Weekly.
Is RCP the right tool to use? I could not find how to use it with a password.
Well, some more research pointed me towards a web service, so I ended up with the following setup:
in my cron job on server 1, after pushing the backed-up files to the FTP server, I call (using curl) a PHP script on server 2; this PHP script then calls a batch file that does the copy/duplication job entirely on server 2.

FTP access denied to file after failed upload

I'm uploading a file to FTP. Sometimes the upload fails and a chunk of the file is left on the FTP server, e.g. 20 MB of a 50 MB file. Then I receive "550 Access is denied" when trying to re-upload the file or delete the leftover chunk.
Could somebody advise a solution?
That means you have the right to upload files, but not to delete/replace them.
Contact the FTP administrator to update your rights.
Update:
If you can delete other files, the incomplete file is probably still held open by your previous FTP session. The FTP server should have a timeout for this, so you should be able to delete the file later, once the server closes it.
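If waiting manually is not an option, the retry can be scripted. A sketch using Python's ftplib, assuming an already-open connection (the attempt count and delay are arbitrary):

```python
import time
from ftplib import error_perm


def delete_with_retry(ftp, name, attempts=5, delay=60):
    """Try to delete a leftover partial upload.

    A 550 reply here usually means the server still holds the file
    open from the failed session, so wait for the server-side timeout
    to release it and try again.
    """
    for attempt in range(attempts):
        try:
            ftp.delete(name)
            return
        except error_perm as exc:
            # Re-raise anything other than 550, or a 550 on the last try
            if not str(exc).startswith("550") or attempt == attempts - 1:
                raise
            time.sleep(delay)


# Typical usage on an open ftplib.FTP connection:
#   delete_with_retry(ftp, "upload.part")
```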

FTP permission denied error

I am trying to FTP a RAR (zipped) file to another server but am having problems doing so. This is a Windows environment. I know that my FTP connection is set up correctly because I have already transferred several other RARs. The difference, from what I can tell, is that the failing RAR is larger: 761 MB. When I try to "put" it onto the other server, I get the following:
200 PORT command successful.
150 Opening BINARY mode data connection for WCU.rar.
> WCU.rar:Permission denied
226 Transfer complete.
However, the file is never transferred. Is there a size limitation? And FYI, WCU.rar is a zipped directory, not a file, but I was able to successfully FTP over several other zipped directories.
It could be a size limitation, not just on stored data but on transferred data as well.
Did you try to transfer a small file? A small file in the same format? I would suspect permissions, but you said that you have already uploaded files to this server.
Just to help you debug, you can add both of these commands to your FTP session:
ftp> hash
ftp> bin
WCU.rar:Permission denied
You don't have permission to write to that directory. You need write permission on the folder in order to do so.
