Automated FTP to Windows servers

I need to set up some sort of infrastructure to automatically FTP some files from one remote server to another. The FTP transaction will occur on a scheduled basis. Both these servers are Windows boxes and the location of the files that need to be FTP'ed will depend on the current date (the folder they sit in will be named the current day's date).
I would really hate to have to write something like that from scratch, so are there tools/utilities/the like out there?

Windows command line ftp will read input from a text file. For example:
ftp -n -i < ftpscript.txt
Where these options are used:
-n - Suppresses auto-login upon initial connection.
-i - Turns off interactive prompting during multiple file transfers.
The ftpscript.txt would look something like this:
open ftp.mycompany.com
user someuser password
binary
get /somedir/myfile.dat
It should be reasonably straightforward to write a script that outputs an FTP script containing the correct file name.
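For example, here is a minimal PowerShell sketch that builds the FTP script for today's folder and then runs ftp.exe with it. The server name, credentials, file name, and the date-based folder layout are all placeholders; adjust them to your environment:
# Build today's FTP script and run it (server, credentials, and paths are placeholders)
$today = Get-Date -Format "yyyy-MM-dd"        # assumes the remote folder is named like 2024-05-01
$ftpScript = @"
open ftp.mycompany.com
user someuser password
binary
cd /somedir/$today
get myfile.dat
quit
"@
Set-Content -Path ftpscript.txt -Value $ftpScript
ftp -n -i -s:ftpscript.txt                    # -s: runs the commands from the generated file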

You would think that this would be an easy task. It isn't.
The best tool I've found is SyncBack and I think there is a free/lite version available.

If you're in a Windows environment, consider using PowerShell and the System.Net.WebClient .NET Framework class, as in this post. That post shows a download example, but the WebClient object also provides an UploadFile method.
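A minimal sketch of the upload direction; the server, credentials, and paths below are placeholders:
# Upload a file over FTP with System.Net.WebClient (all names here are placeholders)
$client = New-Object System.Net.WebClient
$client.Credentials = New-Object System.Net.NetworkCredential("someuser", "password")
$client.UploadFile("ftp://ftp.mycompany.com/somedir/myfile.dat", "C:\data\myfile.dat")
$client.Dispose()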

Related

Trigger a mainframe job from Windows machine

I am converting my Windows script that uses FTP to SFTP.
To trigger the mainframe job we used the commands below:
quote site filetype=jes
put C:\Test\test.dat
bye
sftp.exe uname#servername
But site filetype=jes does not work in SFTP. What is the equivalent SFTP command to trigger the mainframe job by sending a trigger file?
There are several options:
You can use a different SFTP server (such as the Co:Z product mentioned in an earlier response).
You can wrap a conventional FTP session in a secure network session (VPN, SSH, etc) in a way that keeps the connection secure, but doesn't require SFTP. This gives you the security of SFTP while letting you continue to use your existing FTP scripting unchanged.
You can swap FTP for more of a shell approach (SSH) to login to the mainframe and submit your JCL. Once you have any sort of shell session, there are many ways to submit JCL - see http://www-01.ibm.com/support/knowledgecenter/SSLTBW_1.13.0/com.ibm.zos.r13.bpxa500/submit.htm%23submit for an example.
A slight variant on #3 (above) is that you can have a "submit JCL" transaction in something like a web server, if you're running one on z/OS. This gives you a way to submit JCL using an HTTP request, say through CURL or WGET (if you go this way, be sure someone carefully reviews the security around this transaction...you probably don't want it open to the outside world!).
If this is something you do over and over, and if your site uses job scheduling software (CA-7, Control-M, OPC, Zeke, etc. - most sites have one of these), almost all of these products can monitor for file activity and launch batch jobs when a file is created. You'd simply create a file with an SFTP "put", and the job scheduling software would do its thing (a minimal sketch of the SFTP side follows below).
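For that last option, the Windows side could be as simple as the following PowerShell sketch. It assumes OpenSSH's sftp.exe with key-based authentication already set up; the host name and both paths are placeholders:
# Push a trigger file with sftp in batch mode; the scheduler on z/OS watches for it
$batch = @"
put C:\Test\trigger.dat /u/jobs/trigger.dat
bye
"@
Set-Content -Path sftpbatch.txt -Value $batch
sftp.exe -b sftpbatch.txt uname@servername    # -b needs non-interactive (key-based) authentication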
Good luck!
If you're using the Co:Z SFTP server on z/OS you can submit mainframe batch jobs directly.
Strictly speaking this isn't a trigger file, but it does appear to be the equivalent of what you describe as your current FTP process.

Preserving File Ownership on Win7 Share

I am trying to set up a "dropbox" on a Win7 workstation that we will use to process simulation jobs. My plan was to pull the ownership from the file (do a simple dir /q "filename") so I can use the owner information during the simulation (for example, to send them an email when done).
The problem I have is that when a user drops the simulation file on the share I set up, the ownership is set to BUILTIN\Administrators. I have tried tweaking the share settings, but so far nothing seems to work.
I do have a workaround where users can embed their email address in the simulation file and I could pull that from there. But I'm trying to make it easier, as I know some users will forget to do that... Any ideas how to preserve the ownership information?
Quite possibly, you could embed the owner's email as an alternate data stream into the file. Read this link here.
And with a few powershell scripts, you could write the job owner into the file at submit time and extract it out on the remote machine at run time.
I believe the alternate data stream survives the command line copy command as long as NTFS is used everywhere as the filesystem.
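A minimal PowerShell sketch of the alternate data stream idea; the stream name, file path, and address are placeholders, and it requires NTFS plus PowerShell 3.0 or later:
# At submit time: tag the dropped file with the owner's email in an alternate data stream
Set-Content -Path D:\dropbox\job001.sim -Stream owner.email -Value "user@example.com"
# At run time: read the tag back out and use it for the notification
$ownerEmail = Get-Content -Path D:\dropbox\job001.sim -Stream owner.email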

Saving a CSV file on a UNIX server from a Windows-based Lotus server using LotusScript

I have to write a script on a Lotus server, which is on a Windows server, to save a CSV file on a UNIX server, and the UNIX server path requires authentication. So can somebody help me or suggest how to do it?
Thanks in advance.
Siddhartha
Could setting up an FTP server on Domino and accessing this from your UNIX server be an option?
Mindoo FTP server
I once resolved this in two steps:
1. Save the file to a temporary directory on the Domino server using LotusScript
2. Create a scheduled task on the Windows server to copy the file to the second server
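A rough sketch of step 2; the paths, task name, and account are placeholders - the scheduled task just runs a small copy script:
# copy-export.ps1 (what the scheduled task runs): push the exported file to the second server
Copy-Item -Path D:\domino\export\*.csv -Destination \\otherserver\incoming\ -Force
# Register the task once, running as an account that can reach the other server
schtasks /create /tn "CopyDominoExport" /sc hourly /ru MYDOMAIN\svc-copy /rp * /tr "powershell.exe -File C:\scripts\copy-export.ps1"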
Advantages:
You can specify any user in the scheduled task, and you don't have to worry about the accessibility of the other server.
Disadvantages:
Two separate processes.
Hope that helps.
Michael
In my scenario, which was very similar to yours, I did the following:
On the Windows server, I created a mapped drive to the folder on the Unix OS. This also handled the authentication.
In the LotusScript agent, I extracted the file to this mapped drive, which worked 100%.
You need to provide more details. Presuming you can access the Unix folder from Windows Explorer, map the drive and let Windows store the password. Then access it through the mapped drive letter.
LotusScript can't write to UNC locations, so you need the drive letter.
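For example, a small sketch of mapping the share once; the share path, drive letter, and credentials are placeholders:
# Map the Unix (Samba/CIFS) share to a drive letter the LotusScript agent can write to
net use U: \\unixserver\export secretpassword /user:someuser /persistent:yes
# The agent can then write to U:\myfile.csv instead of \\unixserver\export\myfile.csv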
That file will probably be picked up by another program. CSV is the worst approach for this; you could offer to write to a Web Service, or provide one.
Update
On Unix, "access" more often than not doesn't mean CIFS (a.k.a. Windows share) access, but SSH (or FTP). For SSH you would want to:
configure SSH keys, so you don't actually need a username/password any more
use a Java library, as asked on Stack Overflow before (or an alternative)
you could also write the file to a temp directory and call a cmd file for the copy operation
With a little care (make the cmd file configurable), this will also work when you move your Domino server to Unix/Linux
Let us know how it goes

Is it possible to prevent overwriting files with ftp.exe?

I am referring to the Windows-native ftp.exe application. Out-of-the-box, it seems to overwrite files under any and all circumstances.
Is it possible to prevent overwriting files with ftp.exe? If this cannot be done with specific ftp.exe arguments, can it be done using a batch process to call ftp.exe?
I don't think there are any ftp arguments or ftp command options to do what you want explicitly.
Using a batch process looks like the way to go (if you must stick to this ftp client).
You might have to do something like the following (a rough sketch follows at the end of this answer):
FTP connect
List the remote files (remote.txt)
Compare remote.txt with local.txt (the files you want to upload)
Generate uploadables.txt (containing the items from local.txt that are not in remote.txt)
FTP connect again
Upload the files listed in uploadables.txt
Sounds fun, but I better get back to work. :-)
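A rough PowerShell sketch of that flow, driving ftp.exe for both the listing and the upload. The server, credentials, and folders are placeholders, and the exact listing format can vary by server:
# 1. Grab the remote file list into remote.txt
$listCmds = @"
open ftp.mycompany.com
user someuser password
ls /somedir remote.txt
quit
"@
Set-Content -Path listscript.txt -Value $listCmds
ftp -n -i -s:listscript.txt
# 2. Work out which local files are not on the server yet
$remote = Get-Content remote.txt
$new = Get-ChildItem C:\outgoing -File | Where-Object { $remote -notcontains $_.Name }
# 3. Build and run an upload script containing only the new files
$upload = @("open ftp.mycompany.com", "user someuser password", "binary", "lcd C:\outgoing")
$upload += $new | ForEach-Object { "put $($_.Name)" }
$upload += "quit"
Set-Content -Path uploadscript.txt -Value $upload
ftp -n -i -s:uploadscript.txt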

Syncing a file from a client to a server

I'm trying to keep a file updated in real time with the server. It's more like real-time syncing with a very small delay. Is there any application that lets me do this? Or would you suggest using a local host as a server?
I don't know how you are connected to your server - but I assume it will be something like SCP / SFTP / FTP, and I don't know your OS. WinSCP will do exactly what you need: you can set it to watch your filesystem (a specified folder), and it will update the server files as soon as the file on your drive changes.
It also supports command-line features, so you can use it within your own applications.
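For example, a minimal WinSCP scripting sketch run from PowerShell; the install path, host, credentials, and folders are placeholders, and keepuptodate is WinSCP's console command for watching a local folder and pushing changes:
# Let WinSCP watch a local folder and push changes to the server as they happen
& "C:\Program Files (x86)\WinSCP\winscp.com" /log=sync.log /command `
    "open sftp://someuser:password@example.com/" `
    "keepuptodate C:\local\watched /remote/target" `
    "exit"
# The first connection may prompt you to verify the server's host key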
