I have an application that downloads files from an FTP server.
While a file is downloading, a third party begins uploading that same file, so the application ends up with a corrupt file and is unable to process it.
Does anyone know how to deal with this situation, other than using a .complete file mechanism (keeping track of when the download is complete)?
Is it possible to lock the file on the FTP server? The FTP server runs on Windows.
No, there is no standard locking mechanism; it is all up to you and the other party to agree on a convention. Here are some ways to do it, in addition to creating a .complete file:
The uploader uploads the file as file.xls.tmp and, when the upload is complete, renames it to file.xls (sketched below).
The uploader uploads to a tmp directory and, when the upload is complete, moves the file to the directory that is scanned.
The uploader uploads the file, and the downloader scans file dates to pick up only files written before a certain time. This is less reliable, since a file left behind by a crashed uploader may still be scanned.
There are probably more variations, particularly with a custom FTP server, but the plain standard doesn't allow for much "fancy stuff".
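For example, the first approach (upload under a temporary name, then rename) might look roughly like this with Apache Commons Net, which is mentioned further down; the host, credentials and file names are placeholders:

import java.io.FileInputStream;
import java.io.InputStream;
import org.apache.commons.net.ftp.FTP;
import org.apache.commons.net.ftp.FTPClient;

public class SafeUpload {
    public static void main(String[] args) throws Exception {
        FTPClient ftp = new FTPClient();
        ftp.connect("ftp.example.com");          // placeholder host
        ftp.login("user", "password");           // placeholder credentials
        ftp.enterLocalPassiveMode();
        ftp.setFileType(FTP.BINARY_FILE_TYPE);

        // Upload under a temporary name so the downloader ignores it...
        try (InputStream in = new FileInputStream("file.xls")) {
            ftp.storeFile("file.xls.tmp", in);
        }
        // ...then rename once the transfer is complete.
        ftp.rename("file.xls.tmp", "file.xls");

        ftp.logout();
        ftp.disconnect();
    }
}

Because the rename is a single server-side operation, the downloader never sees a half-written file.xls.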
I am using IIB, and several of the requirements I have are for message flows that can do the following things:
Download a file from an FTP and/or SFTP server to the local file system, with a different name
Rename a file on the local file system
Move and rename a file on the (S)FTP server
Upload a file from the file system to the (S)FTP server, with a different name
Looking at the nodes available (FileInputNode, FileReadNode, FileOutputNode), it appears that they can read and write files in this way, but only by copying them into memory and then physically rewriting the files, rather than just using a copy/move/download-type command, which would never need to open the file in the same way.
I've noticed there are options to move or store files locally once the read is complete, however, so perhaps there's a way around it using that functionality? I don't need to load the files into memory at all; I don't care what's in them.
Currently I am doing this using a Java Compute Node and the Apache Commons Net classes for FTP, but they don't work for SFTP and the workaround seems too complex, so I was wondering whether there is a pure IIB way to do it.
There is no native way to do this, but it can be done using Apache Commons VFS.
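As a rough sketch of that idea, assuming the commons-vfs2 jars (and its SFTP provider) are available to a Java Compute Node, a remote move/rename could look something like this; the URIs, credentials and paths are placeholders:

import org.apache.commons.vfs2.FileObject;
import org.apache.commons.vfs2.FileSystemManager;
import org.apache.commons.vfs2.FileSystemOptions;
import org.apache.commons.vfs2.VFS;
import org.apache.commons.vfs2.provider.sftp.SftpFileSystemConfigBuilder;

public class SftpMove {
    public static void main(String[] args) throws Exception {
        FileSystemManager fsManager = VFS.getManager();

        FileSystemOptions opts = new FileSystemOptions();
        // Skip known_hosts checking for this sketch; tighten this in real use.
        SftpFileSystemConfigBuilder.getInstance().setStrictHostKeyChecking(opts, "no");

        // Placeholder URIs: user, password, host and paths are assumptions.
        FileObject src = fsManager.resolveFile("sftp://user:password@host/inbox/report.csv", opts);
        FileObject dst = fsManager.resolveFile("sftp://user:password@host/archive/report-2016.csv", opts);

        // moveTo renames on the remote side where possible (and falls back to
        // copy-and-delete otherwise); the file content is never parsed by the flow.
        src.moveTo(dst);

        src.close();
        dst.close();
    }
}

The same FileObject API works for ftp:// and file:// URIs, so one piece of code can cover the FTP, SFTP and local rename cases.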
I have an application written in Clipper that uses DBF files.
Is it possible to access a DBF file on an FTP server from a Clipper program?
Thanks.
No, not really.
FTP is not a "shared folder". You can create a batch file which downloads the file via FTP, then runs your Clipper application, and then, after the application finishes, copies the modified .DBF back to the FTP server.
Note however that it will be slow (as you need to transfer the whole .DBF file even if you only need a tiny bit of it) and that you MAY NOT use it from more than one place at a time (running it in more than one place will overwrite data, with the possibility of file truncation/corruption).
There are utilities that present FTP as a filesystem ("shared folder"), for example curlftpfs, but while they allow you to run the .exe directly instead of via the aforementioned .bat file, ALL THE LIMITATIONS mentioned in the previous paragraph still apply - it is not possible to avoid them due to the nature of FTP.
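For illustration only, the download/run/upload sequence the batch file would perform could also be scripted like this in Java with Apache Commons Net; the host, credentials, paths and exe name are all placeholders:

import java.io.FileInputStream;
import java.io.FileOutputStream;
import org.apache.commons.net.ftp.FTP;
import org.apache.commons.net.ftp.FTPClient;

public class DbfRoundTrip {
    public static void main(String[] args) throws Exception {
        FTPClient ftp = new FTPClient();
        ftp.connect("ftp.example.com");            // placeholder host
        ftp.login("user", "password");             // placeholder credentials
        ftp.enterLocalPassiveMode();
        ftp.setFileType(FTP.BINARY_FILE_TYPE);

        // 1. Download the whole DBF file to the local disk.
        try (FileOutputStream out = new FileOutputStream("C:\\work\\DATA.DBF")) {
            ftp.retrieveFile("/data/DATA.DBF", out);
        }

        // 2. Run the Clipper application and wait for it to finish.
        new ProcessBuilder("C:\\app\\CLIPAPP.EXE").inheritIO().start().waitFor();

        // 3. Upload the modified DBF back to the server.
        try (FileInputStream in = new FileInputStream("C:\\work\\DATA.DBF")) {
            ftp.storeFile("/data/DATA.DBF", in);
        }

        ftp.logout();
        ftp.disconnect();
    }
}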
I have files stored in a shared directory on one computer and a Cocoa Application running on another computer on the same LAN.
I want the application to move files within the shared directory.
I'm using -[NSFileManager copyItemAtPath:toPath:error:], but sometimes it seems extremely slow, regardless of file size. Why would that operation take much longer than doing it directly on the shared directory's computer?
I'd guess, though I don't know for sure, that NSFileManager first downloads the file to be copied and then re-uploads it under a different name. The last thing it does is remove the original file. Of course, the downloading and uploading take some time.
The reason for this procedure is that most protocols don't have a 'copy' command, so the client has to do all the work itself using the procedure explained above.
I have a large directory which I need to upload to a new host's server, but because I have never transferred such a large directory (32GB), I am wondering whether there is something I'm missing.
Now, I am assuming that the best way is to compress it into a zip file, upload that to the server, and then extract it. But for some reason, my zip file is still about 32GB!
I have already attempted to start uploading the files, and it has literally taken about 30 hours to upload only about 3GB! Obviously this is too long, so I wondered whether there is a better method of doing this?
Upload speed is determined by your internet connection. Try to find a different location with a faster connection - it could be your work, school or an internet cafe.
You can test your upload speed here: http://speedtest.net/
Pack everything into one large zip file, upload it there, and unpack it remotely. It is faster than uploading file by file with FTP.
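If there is no zip tool handy on the machine, packing the directory can be scripted; here is a minimal sketch in Java, where the directory and archive names are placeholders:

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.stream.Stream;
import java.util.zip.ZipEntry;
import java.util.zip.ZipOutputStream;

public class ZipDirectory {
    public static void main(String[] args) throws IOException {
        Path source = Paths.get("C:\\site");       // placeholder: directory to upload
        Path archive = Paths.get("C:\\site.zip");  // placeholder: resulting zip file

        try (ZipOutputStream zos = new ZipOutputStream(Files.newOutputStream(archive));
             Stream<Path> files = Files.walk(source)) {
            for (Path p : (Iterable<Path>) files.filter(Files::isRegularFile)::iterator) {
                // Store each file under its path relative to the source directory.
                zos.putNextEntry(new ZipEntry(source.relativize(p).toString().replace('\\', '/')));
                Files.copy(p, zos);
                zos.closeEntry();
            }
        }
    }
}

Whether the archive ends up much smaller than 32GB depends on the content - already-compressed files such as images and video will not shrink much - but uploading a single file still avoids the per-file overhead of FTP.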
In my application, I have one exe file that does some conversion on the video files in a directory, and I have used CUTE FTP to transfer the files in that directory to another server.
CUTE FTP is configured to run every minute.
When only 25% of the conversion job is done for a video file, CUTE FTP has already transferred that file to the other server.
What are the ways to fix this problem?
Process the file in a different directory and then move it to a place where CUTE FTP will pick it up after the conversion is finished.
[EDIT] Don't use copy, use move. Both directories must be on the same hard disk. When using Windows Explorer, use "Cut" or just drag the file with the mouse. Make sure there is no little "[+]" when you drop it.
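If the conversion is driven from code rather than Explorer, the same idea can be expressed as an atomic rename; a short sketch in Java, where the paths are placeholders:

import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

public class PublishWhenDone {
    public static void main(String[] args) throws Exception {
        // Placeholder paths: both directories must be on the same disk/partition
        // so the move is a cheap rename rather than a copy.
        Path workFile   = Paths.get("C:\\convert\\work\\video1.mp4");
        Path pickupFile = Paths.get("C:\\convert\\outbox\\video1.mp4");

        // ... run the conversion exe against workFile and wait for it to finish ...

        // Only then move the finished file into the directory CUTE FTP watches.
        // ATOMIC_MOVE fails instead of silently falling back to a copy.
        Files.move(workFile, pickupFile, StandardCopyOption.ATOMIC_MOVE);
    }
}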