Creating an executable file to download a file, then upload the file to a new location - FTP

I'm having trouble finding the correct method to accomplish a relatively simple task. I'm trying to make a simple executable that I can run/schedule to run, and that:
1. Downloads a file from an intranet location (192.168.100.112/file.txt)
2. Uploads the new version of the file to the web (ftp.website.com/docs/file.txt)
There are 5 PDF files that are auto-generated on the intranet, and I would like to keep the web versions updated. Ideally I'd create one executable that does all 5 files at once, with the ability to do each one individually.
thanks

Use the Windows ftp command. It has a -s option for providing ftp "scripts". Basically, just add all the commands you need to accomplish your task to something.txt, for example:
open 192.168.100.112
get file.txt
close
open ftp.website.com
cd docs
put file.txt
close
bye
then do:
ftp -s:something.txt
You could make separate ftp scripts, one for each file, and then call all five from a single batch file, for example as sketched below.
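As a minimal sketch (the script names are placeholders for whatever you call your five ftp scripts, each following the same open/get/open/put pattern shown above):
rem update_all.bat - refresh all five PDFs, one ftp script per file
ftp -s:file1_script.txt
ftp -s:file2_script.txt
ftp -s:file3_script.txt
ftp -s:file4_script.txt
ftp -s:file5_script.txt
To update a single file on its own, just run the corresponding script individually, e.g. ftp -s:file3_script.txt.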

Related

Move file already open by the script

I'm creating a Python script to download some PDF files from a web site and then save these files into directories according to data extracted from each PDF. The problem is: I must open a PDF file in my script in order to extract the data (I use pdfquery), but when I try to move the file to the destination directory I get an error: the file is open by another process. I suppose that process is the script itself. My question is: is it possible to release the file from the Python script and finally move it to the destination without errors?
One solution is to copy the file (instead of moving it) and delete it manually, or to create another script which moves the files once the "download and extract" script finishes its job. But I'm looking for a cleaner approach.

PSFTP rename file after transfer completed

I am transferring files through PSFTP to a 3rd-party server using batch files. While transferring files, due to buffering issues, files are being broken/not transferred fully.
As a remedy, the 3rd party requested us to name each file with a '.new' suffix before starting the transfer and to remove the '.new' once the file has been transferred fully/successfully.
Please let me know the batch script commands to implement the above. Please let me know if you need additional info.
To rename a file, use the mv command (or its ren alias):
put c:\local\path\file /remote/path/file.new
mv /remote/path/file.new /remote/path/file
Though if you are transferring multiple files using a wildcard, this won't help you.
A relatively simple solution for multiple files is using a temporary upload folder. After the upload finishes, you can move all files at once to the target folder:
mput c:\local\path\* /temp/path
mv /temp/path/* /remote/path
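As a rough end-to-end sketch (the host, user name, password, and paths are placeholders; the script changes into the temporary folder before the mput, and wildcard support in mv requires a reasonably recent psftp):
upload.bat
psftp user@host.example.com -pw password -b upload_script.txt -batch
upload_script.txt
cd /temp/path
mput c:\local\path\*
mv /temp/path/* /remote/path
quit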
For a similar discussion, see also SFTP file lock mechanism.
If you need the solution with temporary extensions, you can use WinSCP, as it allows you to automatically use a temporary file name for uploads. Though it uses a .filepart extension, not .new.
put -resumesupport=on c:\local\path\* /remote/path/
See WinSCP article on Uploading to temporary file name for more details.
The article also shows a (way more complicated) solution using the WinSCP .NET assembly that allows you to use even the .new extension.
If you choose to switch to WinSCP, there's a guide for converting psftp script to WinSCP.
(I'm the author of WinSCP)

Bash script for recursive directory listing on FTP server without -R

There are multiple folders with subfolders and image files on the FTP server. The -R option is disabled. I need to dump the recursive directory listing, with the path names, into a text file. The logic I have so far is: traverse into each folder, check whether the entry name contains a '.' to tell a file from a folder, and if it's a folder, go in and check for subfolders or files and list them. Since I cannot use -R, I have to use a function to traverse each folder.
#!/bin/sh
ftp_host='1.1.1.1'
userName='uName'
ftp -in <<EOF
open $ftp_host
user $userName
recurList() {
path=`pwd`
level=()
for entry in `ls`
do
`cwd`
close
bye
EOF
I am stuck with the argument for the for loop!
Sorry to see you didn't get any replies yet. I think the reason may be that Bash isn't a good way to solve this problem, since it requires interacting with the FTP client, i.e. sending commands and reading responses. Bash is no good at that sort of thing. So there is no easy answer other than "don't use Bash".
I suggest you look at two other tools.
Firstly, you may be able to get the information you want using http://curlftpfs.sourceforge.net/. If you mount the FTP server using curlftpfs, then you can use the find command to dump the directory structure. This is the easiest option... if it works!
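A rough sketch of that approach, reusing the host and user name from the question (the mount point and password are placeholders, and it assumes curlftpfs and FUSE are installed):
mkdir -p /mnt/ftp
curlftpfs ftp://uName:password@1.1.1.1 /mnt/ftp
find /mnt/ftp > listing.txt   # recursive listing with full paths
fusermount -u /mnt/ftp        # unmount when done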
Alternatively, you could write a program using Python with the ftplib module: https://docs.python.org/2/library/ftplib.html. The module allows you to interact with the FTP server through API calls.

Retrieve zip file from a predefined ftp link using bat or cmd file

I have a pre-defined FTP link with a zip file on the other end that I want to save to a directory on my cloud server (running Windows Server 2008). Once the zip file has been saved to a specified directory, let's say "c:\MyZipFiles\ZipFile-1.zip" for example, I want to unzip the file so that all files contained within the zip file are accessible within the same directory. I'm currently doing this manually, and I want to automate the process by creating a .bat or .cmd file that will perform these steps for me.
Once the zip file is unzipped, I have a task in the Task Scheduler of Windows Server Manager ready to use the unzipped files for other things.
The pre-defined link looks something like this:
ftp://idx.realtor.com/idx_download/files.zip
I would greatly appreciate anyone who can help me with this...
Batch file
ftp -s:ftp_cmds.txt host-name-goes-here
unzip local-file-name-goes-here.zip
exit
ftp_cmds.txt
username-goes-here
password-goes-here
cd remote-directory-goes-here
get files.zip local-file-name-goes-here.zip
quit
The batch file uses "unzip" to extract the archive; you can find it here: http://gnuwin32.sourceforge.net/packages/unzip.htm
Either put the binaries in the same directory, or put them somewhere else and add them to your Windows PATH, for example as below.
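A one-line sketch for the top of the batch file, assuming the unzip binaries were extracted to c:\Tools\unzip (a placeholder path):
set PATH=%PATH%;c:\Tools\unzip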
I used my own ftp to test most of this. Your ftp was offline for me, so it might take some tweaking, but this should point you in the right direction.

DOS ftp listing to local file

I'm trying to find a way to see if a file exists on an FTP site via DOS. I tried a get command on the file, hoping that if it didn't exist it wouldn't be downloaded to my local directory. However, it seems that it still is, but as an empty file. This doesn't work for me, because the file I'm looking for is just an empty trigger file, so I can't tell the difference.
I would like to dump a directory listing (ls) of the FTP directory to a text file on my local drive, so I try:
ls > listing.txt
It creates the listing.txt file locally but it's always empty even though there are files on the ftp site.
What are my options with this?
I have used dir > listing.txt and ls > listing.txt and every time listing.txt is empty even though there are files in the directories I'm running those commands on.
Sorry if I didn't make this clear, but I'm trying to get the listing for an automated process, not simply for visual inspection when doing this manually.
Unless you're on FreeDOS, you're probably not using DOS. Perhaps you're using ftp.exe in the Windows console? If that's the case, don't use a normal file redirect. Instead, check the syntax for ls in the standard Windows ftp client:
ls [RemoteDirectory] [LocalFile]
So you can do ls . listing.txt to get a list of files in the current remote directory. The listing.txt file will appear in your user directory, e.g. c:\Users\user. A sketch of wiring this into an automated check follows below.
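As a rough sketch of the automated check (host, credentials, remote directory, and the trigger file name are placeholders, and it assumes listing.txt ends up in the batch file's working directory):
check.bat
ftp -s:list_cmds.txt ftp.example.com
findstr /i "trigger.txt" listing.txt >nul && echo Trigger file found
list_cmds.txt
username-goes-here
password-goes-here
cd remote-directory-goes-here
ls . listing.txt
quit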
